DocumentCode :
2626999
Title :
Towards user-independent classification of multimodal emotional signals
Author :
Kim, Jonghwa ; André, Elisabeth ; Vogt, Thurid
Author_Institution :
Augsburg Univ., Augsburg, Germany
fYear :
2009
fDate :
10-12 Sept. 2009
Firstpage :
1
Lastpage :
7
Abstract :
Coping with differences in the expression of emotions is a challenging task not only for a machine, but also for humans. Since individual variation in the expression of emotions may occur at various stages of the emotion generation process, human beings may react quite differently to the same stimulus. Consequently, it comes as no surprise that recognition rates reported for user-dependent systems are significantly higher than those for user-independent systems. Based on empirical data we obtained in our earlier work on the recognition of emotions from biosignals, speech and their combination, we discuss which consequences arise from individual user differences for automated recognition systems and outline how these systems could be adapted to particular user groups.
Keywords :
emotion recognition; automated recognition systems; biosignals; emotion generation process; multimodal emotional signals; recognition rates; user-independent classification; user-dependent system; user-independent system; Appraisal; Audio recording; Automatic speech recognition; Emotion recognition; Humans; Machine learning algorithms; Pattern recognition; Psychology; Speech analysis; System testing;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009)
Conference_Location :
Amsterdam
Print_ISBN :
978-1-4244-4800-5
Electronic_ISBN :
978-1-4244-4799-2
Type :
conf
DOI :
10.1109/ACII.2009.5349495
Filename :
5349495