DocumentCode
615128
Title
Implicit video multi-emotion tagging by exploiting multi-expression relations
Author
Zhilei Liu ; Shangfei Wang ; Zhaoyu Wang ; Qiang Ji
Author_Institution
Sch. of Comput. Sci. & Technol., Univ. of Sci. & Technol. of China, Hefei, China
fYear
2013
fDate
22-26 April 2013
Firstpage
1
Lastpage
6
Abstract
In this paper, a novel implicit video multi-emotion tagging method is proposed that exploits the relations between users' outer facial expressions and inner emotions, as well as the relations among multiple expressions. First, the audiences' expressions are inferred through a multi-expression recognition model, which combines image-driven expression measurements with a Bayesian network capturing the co-existence and mutual-exclusion relations among multiple expressions. Second, the videos' multi-emotion tags are obtained from the recognized expressions through another Bayesian network, which captures the relations between outer expressions and inner emotions. Experiments on the NVIE database demonstrate that modeling the relations among expressions improves multi-expression recognition performance. Furthermore, the proposed emotion tagging method, by modeling the relations between outer expressions and inner emotions, outperforms traditional expression-based and image-based implicit video emotion tagging methods.
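The first stage described in the abstract combines a noisy image-driven expression measurement with a Bayesian network whose prior encodes co-existence and mutual-exclusion relations among expressions. A minimal pure-Python sketch of that idea is given below; it is not the authors' model, and every probability table in it is an illustrative placeholder, with inference done by direct enumeration over a two-expression toy network.

```python
from itertools import product

# Hypothetical joint prior over two binary expression variables
# (e1 = smile present, e2 = frown present). Mutual exclusion is
# encoded by giving the (1, 1) configuration low probability.
prior = {
    (0, 0): 0.30,  # neither expression
    (1, 0): 0.33,  # only smile
    (0, 1): 0.33,  # only frown
    (1, 1): 0.04,  # both at once: rare, since they are mutually exclusive
}

# Hypothetical noisy image-driven measurement of the smile: P(m1 | e1).
p_meas = {
    0: {0: 0.9, 1: 0.1},  # smile absent  -> detector usually reports 0
    1: {0: 0.2, 1: 0.8},  # smile present -> detector usually reports 1
}

def posterior(m1):
    """P(e1, e2 | m1), computed by enumerating the tiny joint distribution."""
    unnorm = {(e1, e2): prior[(e1, e2)] * p_meas[e1][m1]
              for e1, e2 in product((0, 1), repeat=2)}
    z = sum(unnorm.values())
    return {state: p / z for state, p in unnorm.items()}

post = posterior(1)  # the smile detector fired
```

With these placeholder tables, observing `m1 = 1` concentrates posterior mass on the "only smile" state while keeping the "both expressions" state improbable, which is the effect the relational prior is meant to produce; the paper's second Bayesian network, mapping recognized expressions to inner-emotion tags, would operate on such posteriors in the same discrete-inference fashion.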
Keywords
Bayes methods; emotion recognition; face recognition; video signal processing; Bayesian network; image-driven expression measurement recognition; implicit video multi-emotion tagging; inner emotion; multi-expression recognition model; multi-expression relation; outer expression; outer facial expression; Accuracy; Databases; Feature extraction; Image recognition; Tagging
fLanguage
English
Publisher
ieee
Conference_Titel
2013 10th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG)
Conference_Location
Shanghai
Print_ISBN
978-1-4673-5545-2
Electronic_ISBN
978-1-4673-5544-5
Type
conf
DOI
10.1109/FG.2013.6553767
Filename
6553767
Link To Document