Title :
Evaluation of multimodal sequential expressions of emotions in ECA
Author :
Radosław Niewiadomski;Sylwia Hyniewska;Catherine Pelachaud
Author_Institution :
Telecom ParisTech, 37/39, rue Dareau - 75014 Paris, France
Abstract :
A model of multimodal sequential expressions of emotion for an Embodied Conversational Agent (ECA) was developed. The model is based on video annotations and on descriptions found in the literature. A language has been derived to describe an expression of emotion as a sequence of facial and body movement signals. This paper presents an evaluation study of our model. Animations of eight sequential expressions, corresponding to the emotions anger, anxiety, cheerfulness, embarrassment, panic fear, pride, relief, and tension, were realized with our model. The recognition rate of these expressions is higher than the chance level, which suggests that our model is able to generate recognizable expressions of emotion, even for emotional expressions not considered to be universally recognized.
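Note: the sequential-expression language itself is not reproduced in this record. As a minimal illustrative sketch only, assuming each signal can be characterized by a modality, a label, an onset time, and a duration (the names and values below are hypothetical and not taken from the paper), such a sequence of facial and body signals could be represented as:

    from dataclasses import dataclass

    @dataclass
    class Signal:
        modality: str    # e.g. "face", "gaze", "head" (assumed modality names)
        name: str        # hypothetical signal label, e.g. "smile", "head_down"
        start: float     # onset in seconds, relative to the start of the expression
        duration: float  # how long the signal is displayed

    def overlaps(a: Signal, b: Signal) -> bool:
        # Two signals co-occur if their time intervals intersect; a sequential
        # expression may contain both ordered and partially overlapping signals.
        return a.start < b.start + b.duration and b.start < a.start + a.duration

    # Hypothetical "embarrassment" sequence built from face, gaze, and head signals.
    embarrassment = [
        Signal("face", "smile", start=0.0, duration=1.2),
        Signal("gaze", "look_away", start=0.3, duration=1.0),
        Signal("head", "head_down", start=0.5, duration=1.5),
    ]

    for a, b in zip(embarrassment, embarrassment[1:]):
        print(a.name, "overlaps", b.name, ":", overlaps(a, b))

This sketch only shows the idea of ordering and overlapping timed signals across modalities; the actual representation and signal repertoire used by the authors may differ.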
Keywords :
"Emotion recognition","Displays","Facial animation","Telecommunications","Computational modeling","Art","Head","Data mining","Humans","Financial advantage program"
Conference_Title :
2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009)
Print_ISBN :
978-1-4244-4800-5
Electronic_ISSN :
2156-8111
DOI :
10.1109/ACII.2009.5349569