Weighted estimation for texture analysis based on Fisher discriminative analysis
Online publication date: Mon, 01-Oct-2018
by Xiaoping Jiang; Chuyu Guo; Hua Zhang; Chenghua Li
International Journal of Information and Communication Technology (IJICT), Vol. 13, No. 4, 2018
Abstract: Traditional texture analysis methods score global similarity without accounting for the relative contribution of each face area. To address this mismatch between locally extracted features and globally assessed similarity, a weighted estimation for texture analysis (WETA) method based on Fisher discriminative analysis (FDA) is proposed. First, local binary patterns (LBP) or local phase quantisation (LPQ) are used to encode the image texture. The image is then divided into equal, non-overlapping local blocks. FDA extracts the most discriminative axes from the similarity space and applies them to the texture analysis, and the best solution is then obtained through weight optimisation. Finally, experiments on two widely used face databases (FERET and FEI) verify the effectiveness of the proposed method, which achieves a recognition rate of 96% with the LPQ and FERET combination. The experimental results show that, compared with texture methods reported in other papers, the proposed method achieves better recognition performance.
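To make the pipeline in the abstract concrete, the sketch below walks through its flavour in Python: LBP texture encoding, division into equal non-overlapping blocks with per-block histograms, and per-block weights derived from a Fisher criterion on similarity scores. All specifics here are assumptions not given in the abstract: the 8-neighbour LBP variant, the 4x4 block grid, histogram-intersection block similarity, and a one-dimensional Fisher ratio per block standing in for the paper's discriminant axes in similarity space. Function names (lbp_encode, block_histograms, block_similarities, fisher_block_weights) are illustrative, not from the paper.

import numpy as np

def lbp_encode(gray):
    """Basic 8-neighbour LBP: compare each interior pixel with its
    8 neighbours and pack the comparison bits into a code in [0, 255]."""
    h, w = gray.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    center = gray[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = gray[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    return codes

def block_histograms(codes, grid=(4, 4), bins=256):
    """Split the code map into equal, non-overlapping blocks and
    return one normalised histogram per block."""
    h, w = codes.shape
    bh, bw = h // grid[0], w // grid[1]
    hists = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = codes[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            hist, _ = np.histogram(block, bins=bins, range=(0, bins))
            hists.append(hist / max(hist.sum(), 1))
    return np.asarray(hists)  # shape: (n_blocks, bins)

def block_similarities(hists_a, hists_b):
    """Histogram intersection per block: one similarity score per block."""
    return np.minimum(hists_a, hists_b).sum(axis=1)

def fisher_block_weights(sims, same_pair):
    """Per-block Fisher ratio on training-pair similarities.

    sims      : (n_pairs, n_blocks) array of per-block similarities
                for a set of training image pairs.
    same_pair : (n_pairs,) boolean array, True when the pair shows
                the same person.
    Blocks whose similarity separates same-person from
    different-person pairs get larger weights.
    """
    pos, neg = sims[same_pair], sims[~same_pair]
    between = (pos.mean(axis=0) - neg.mean(axis=0)) ** 2     # between-class scatter
    within = pos.var(axis=0) + neg.var(axis=0) + 1e-12       # within-class scatter
    ratio = between / within
    return ratio / ratio.sum()

# The global similarity of two faces is then the weighted sum of
# block similarities:  global_sim = weights @ block_similarities(ha, hb)

This is a minimal sketch, not the authors' implementation: the paper optimises the weights via FDA over discriminant axes in a similarity space, whereas the per-block Fisher ratio above is a simplified one-dimensional analogue of that idea.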