DocumentCode
615176
Title
Facial ethnicity classification based on boosted local texture and shape descriptions
Author
Huaxiong Ding ; Di Huang ; Yunhong Wang ; Liming Chen
Author_Institution
Lab. d'Inf. en Image et Syst. d'Inf. (LIRIS), Ecole Centrale de Lyon, Lyon, France
fYear
2013
fDate
22-26 April 2013
Firstpage
1
Lastpage
6
Abstract
Ethnicity is a key demographic attribute of human beings, and it plays an important role in automatic, machine-based face analysis; face-based ethnicity classification has therefore attracted increasing attention in recent years. In this paper, we propose a novel method for this task that combines boosted local texture and shape features extracted from 3D face models, in contrast to existing approaches that rely only on 2D facial images. The proposed method uses Oriented Gradient Maps (OGMs) to highlight the local geometry and texture variations of the entire face, and then learns a compact set of features that are highly related to the ethnicity attribute for classification. Comprehensive experiments are carried out on the FRGC v2.0 dataset, and the method reaches an accuracy of 98.3% in distinguishing Asians from non-Asians when 80% of the samples are used for training, demonstrating its effectiveness.
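The abstract describes encoding a face (its 2D texture and 3D shape channels) through Oriented Gradient Maps before boosted feature selection. The following is a minimal sketch of one common way to build such maps, not the authors' exact implementation: the gradient field is quantized into a few orientation channels and each channel is smoothed with a Gaussian. The function name `oriented_gradient_maps` and the parameters `n_orientations` and `sigma` are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def oriented_gradient_maps(image, n_orientations=8, sigma=2.0):
    """Illustrative OGM-style encoding (assumed, not the paper's exact code):
    split the gradient field into orientation channels, keep the gradient
    magnitude per channel, and smooth each channel with a Gaussian."""
    img = image.astype(float)
    gx = sobel(img, axis=1)          # horizontal gradient
    gy = sobel(img, axis=0)          # vertical gradient
    magnitude = np.hypot(gx, gy)
    angle = np.mod(np.arctan2(gy, gx), 2 * np.pi)  # orientation in [0, 2*pi)

    bin_width = 2 * np.pi / n_orientations
    maps = []
    for k in range(n_orientations):
        # keep gradient energy whose orientation falls into the k-th bin
        mask = (angle >= k * bin_width) & (angle < (k + 1) * bin_width)
        channel = np.where(mask, magnitude, 0.0)
        # Gaussian smoothing makes the response tolerant to small misalignments
        maps.append(gaussian_filter(channel, sigma=sigma))
    return np.stack(maps, axis=0)    # shape: (n_orientations, H, W)
```

Applied to both the texture image and a range (depth) image of the same 3D face model, this yields the kind of multi-channel local description on which a boosting stage can then select a compact, ethnicity-related feature subset, as the abstract outlines.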
Keywords
computational geometry; face recognition; feature extraction; image classification; image texture; learning (artificial intelligence); 3D face models; Asian people; FRGC v2.0 dataset; OGM; automatic machine-based face analysis; boosted local shape feature extraction; boosted local texture feature extraction; demographic attribute; face-based ethnicity classification; human beings; local geometry; non-Asian people; oriented gradient maps; texture variations; training set; Accuracy; Databases; Face; Feature extraction; Nose; Shape; Training;
fLanguage
English
Publisher
ieee
Conference_Titel
2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG)
Conference_Location
Shanghai
Print_ISBN
978-1-4673-5545-2
Electronic_ISBN
978-1-4673-5544-5
Type
conf
DOI
10.1109/FG.2013.6553815
Filename
6553815
Link To Document