Title :
AU-aware Deep Networks for Facial Expression Recognition
Author :
Mengyi Liu ; Shaoxin Li ; Shiguang Shan ; Xilin Chen
Author_Institution :
Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Beijing, China
Abstract :
In this paper, we propose a deep architecture, AU-aware Deep Networks (AUDN), for facial expression recognition. The architecture elaborately exploits the prior knowledge that the appearance variations caused by expression can be decomposed into a batch of local facial Action Units (AUs). The proposed AUDN is composed of three sequential modules. The first module consists of two layers, a convolution layer and a max-pooling layer, which together generate an over-complete representation encoding all expression-specific appearance variations over all possible locations. In the second module, an AU-aware receptive field layer searches for subsets of the over-complete representation, each of which aims to best simulate a combination of AUs. In the last module, multilayer Restricted Boltzmann Machines (RBMs) learn hierarchical features from these subsets, which are then concatenated for final expression recognition. Experiments on three expression databases, CK+, MMI, and SFEW, demonstrate the effectiveness of AUDN in both lab-controlled and wild environments. All our results are better than, or at least competitive with, the best known results.
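The three-module pipeline in the abstract can be sketched end to end. This is a minimal illustrative forward pass, not the paper's actual configuration: the image size, filter count, pooling size, and hand-picked receptive-field regions are all assumptions, the filters are random rather than learned, and a simple concatenation stands in for the multilayer RBM feature learning of the third module.

```python
# Minimal NumPy sketch of the AUDN pipeline described in the abstract.
# All sizes, filter values, and the region list are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernel):
    """Plain 'valid'-mode 2-D cross-correlation (no kernel flipping)."""
    H, W = img.shape
    kh, kw = kernel.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling with a size x size window."""
    H, W = fmap.shape
    H, W = H - H % size, W - W % size
    f = fmap[:H, :W].reshape(H // size, size, W // size, size)
    return f.max(axis=(1, 3))

# Module 1: convolution + max pooling -> over-complete representation.
img = rng.standard_normal((32, 32))         # a toy "face image"
filters = rng.standard_normal((4, 5, 5))    # 4 filters (random here, learned in the paper)
fmaps = [max_pool(conv2d_valid(img, k)) for k in filters]   # each map is 14 x 14

# Module 2: AU-aware receptive fields -- in the paper these subsets are
# *searched* to best simulate AU combinations; here two fixed crops are
# chosen by hand purely for illustration.
regions = [(slice(0, 7), slice(0, 7)),      # e.g. a brow/eye area
           (slice(7, 14), slice(3, 11))]    # e.g. a mouth area
subsets = [m[r].ravel() for m in fmaps for r in regions]

# Module 3 (stand-in): the paper stacks RBMs on each subset to learn
# hierarchical features; here the subsets are simply concatenated into the
# final feature vector that would feed an expression classifier.
feature = np.concatenate(subsets)
print(feature.shape)    # 4 maps x (49 + 56) region values = (420,)
```

The key design point the sketch mirrors is the middle module: rather than feeding the whole over-complete representation forward, only local subsets of it, aligned with AU-like facial regions, are passed to the feature-learning stage.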
Keywords :
Boltzmann machines; face recognition; AU-aware deep network; AU-aware receptive field layer; AUDN; CK+ database; MMI database; RBM; SFEW database; action unit; convolution layer; deep architecture; expression-specific appearance variation; facial expression recognition; hierarchical feature learning; max-pooling layer; multilayer restricted Boltzmann machine; over-complete representation encoding; Accuracy; Databases; Face recognition; Support vector machines;
Conference_Titel :
2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG)
Conference_Location :
Shanghai, China
Print_ISBN :
978-1-4673-5545-2
Electronic_ISBN :
978-1-4673-5544-5
DOI :
10.1109/FG.2013.6553734