Author_Institution :
Dept. of Electron. & Inf. Eng., Hong Kong Polytech. Univ., Hong Kong, China
Abstract :
In this paper, we propose a feature-fusion method based on Canonical Correlation Analysis (CCA) for facial-expression recognition. In our method, features are extracted separately from the eye and the mouth windows, which are correlated with each other in representing a facial expression. For each window, two effective descriptors, namely Local Phase Quantization (LPQ) and the Pyramid of Histogram of Oriented Gradients (PHOG), are employed to form a low-level representation of the corresponding window. The features are then projected into a coherent subspace by CCA so as to maximize their correlation. In our experiments, the Extended Cohn-Kanade dataset is used; its face images span seven emotions, namely anger, contempt, disgust, fear, happiness, sadness, and surprise. Experimental results show that our method achieves excellent accuracy for facial-expression recognition.
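To make the fusion step concrete, the following is a minimal Python sketch of CCA-based fusion of two window-level feature sets, using scikit-learn's CCA. The variable names (eye_feats, mouth_feats), the random stand-in descriptors, the fusion-by-concatenation step, and the linear-SVM classifier are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.svm import SVC

def cca_fuse(eye_feats, mouth_feats, n_components=20):
    """Project the two window-level feature sets into a coherent subspace
    that maximizes their correlation, then fuse the projected features
    by concatenation."""
    cca = CCA(n_components=n_components)
    eye_c, mouth_c = cca.fit_transform(eye_feats, mouth_feats)
    return np.hstack([eye_c, mouth_c]), cca

# Example usage with random stand-ins for the LPQ/PHOG descriptors
# (rows = samples, columns = descriptor dimensions; sizes are arbitrary).
rng = np.random.default_rng(0)
eye_feats = rng.standard_normal((200, 64))    # e.g., LPQ histogram of the eye window
mouth_feats = rng.standard_normal((200, 40))  # e.g., PHOG descriptor of the mouth window
labels = rng.integers(0, 7, size=200)         # seven emotion classes

fused, cca = cca_fuse(eye_feats, mouth_feats)
clf = SVC(kernel="linear").fit(fused, labels)
```

In practice, the CCA model would be fitted on training descriptors only and then used to transform test descriptors before classification.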
Keywords :
Canonical Correlation Analysis (CCA); Local Phase Quantization (LPQ); Pyramid of Histogram of Oriented Gradients (PHOG); facial expression recognition; emotion recognition; feature extraction; feature fusion; region-based feature fusion; Extended Cohn-Kanade dataset