DocumentCode :
1882806
Title :
Multimodal affect recognition in spontaneous HCI environment
Author :
Panning, A. ; Siegert, I. ; Al-Hamadi, A. ; Wendemuth, A. ; Rösner, D. ; Frommer, J. ; Krell, G. ; Michaelis, B.
Author_Institution :
Fac. of Electr. Eng. & Inf. Technol., Otto-von-Guericke Univ., Magdeburg, Germany
fYear :
2012
fDate :
12-15 Aug. 2012
Firstpage :
430
Lastpage :
435
Abstract :
Human Computer Interaction (HCI) is known to be a multimodal process. In this paper we present results of affect recognition experiments on non-acted, affective multimodal data from the new Last Minute Corpus (LMC). This corpus is closer to real HCI applications than other known data sets, in which affective behavior is elicited in ways untypical for HCI. We utilize features from three modalities: facial expressions, prosody, and gesture. The results show that even simple fusion architectures can reach respectable results compared to other approaches. Furthermore, we show that probably not all features and modalities contribute substantially to the classification process; prosody and eye-blink frequency appear to contribute most in the analyzed dataset.
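The "simple fusion architectures" mentioned in the abstract can be as basic as decision-level (late) fusion, i.e. a weighted average of the per-class scores produced by each modality's classifier. A minimal sketch of that idea, assuming hypothetical unimodal scores for the three modalities named above (the function, weights, and numbers are illustrative assumptions, not taken from the paper):

```python
# Hypothetical sketch of decision-level (late) fusion across three
# modalities. The per-modality scores below are illustrative, not
# data from the paper.
import numpy as np

def late_fusion(scores_per_modality, weights=None):
    """Fuse per-class scores from several unimodal classifiers.

    scores_per_modality: list of 1-D arrays, one array per modality,
    each holding that modality's per-class posterior scores.
    Returns the fused class index and the fused score vector.
    """
    scores = np.stack(scores_per_modality)   # shape: (modalities, classes)
    if weights is None:
        weights = np.ones(len(scores_per_modality))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()        # normalize modality weights
    fused = weights @ scores                 # weighted mean per class
    return int(np.argmax(fused)), fused

# Illustrative two-class scores from facial-expression, prosody, and
# gesture classifiers (made-up numbers):
face    = np.array([0.6, 0.4])
prosody = np.array([0.3, 0.7])
gesture = np.array([0.5, 0.5])
label, fused = late_fusion([face, prosody, gesture])
```

Unequal weights could reflect the abstract's observation that some modalities (e.g. prosody) contribute more than others.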
Keywords :
gesture recognition; human computer interaction; affective multimodal data; classification process; eye blink frequency; facial expression; Last Minute Corpus; multimodal affect recognition; non-acted multimodal data; prosody recognition; spontaneous HCI environment; Face; Facial features; Feature extraction; Humans; Principal component analysis; Videos; Affect Recognition; HCI; Multimodal;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2012 IEEE International Conference on Signal Processing, Communication and Computing (ICSPCC)
Conference_Location :
Hong Kong
Print_ISBN :
978-1-4673-2192-1
Type :
conf
DOI :
10.1109/ICSPCC.2012.6335662
Filename :
6335662