DocumentCode :
3734295
Title :
Embodied conversational agents: A methodology for learning to express facial emotions
Author :
Isidoros Perikos;Ioannis Hatzilygeroudis
Author_Institution :
University of Patras, Computer Engineering & Informatics Department, Patras, Greece
fYear :
2015
fDate :
7/1/2015 12:00:00 AM
Firstpage :
1
Lastpage :
5
Abstract :
Embodied Conversational Agents (ECAs) constitute a special type of agent that can simulate verbal and non-verbal behaviors in order to achieve more natural interaction with humans. In this paper, we present a methodology that can assist an ECA in generating and expressing facial emotions by properly deforming its facial characteristics. The methodology has been implemented on the Greta agent and consists of two main stages. First, facial expressions are generated by Greta by setting random values for the Facial Animation Parameters (FAPs). Then, a Multi-Layer Perceptron Neural Network (MLPNN) is trained on the parameters of the facial deformations and the corresponding expert annotations to model and recognize the emotional content of new facial expressions. The expressions used for training the MLPNN were emotionally annotated by a human expert, who determined for each one whether it was a neutral or an emotional expression and, if emotional, specified the emotions it conveyed and their strength. With this approach, Greta can generate new facial expressions, recognize the emotional content of each one, and render the facial expression of a desired emotional state. The evaluation study conducted showed very encouraging results.
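The two-stage pipeline described in the abstract (random FAP generation, then supervised training of an MLP on deformation parameters against expert emotion annotations) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the number of FAPs, the emotion set, the network topology, and the synthetic labeling rule standing in for the human expert's annotations are all assumptions made here for demonstration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

N_FAPS = 10  # illustrative count of Facial Animation Parameters (not the full MPEG-4 set)
EMOTIONS = ["neutral", "joy", "sadness", "anger"]  # illustrative emotion labels

# Stage 1 (simulated): random FAP vectors stand in for expressions Greta
# would generate by randomizing its Facial Animation Parameters.
X = rng.uniform(-1.0, 1.0, size=(300, N_FAPS))

# Expert annotation (simulated): a fixed random projection assigns each
# expression one emotion label, giving the network a learnable mapping
# from facial deformations to emotional content.
W = rng.normal(size=(N_FAPS, len(EMOTIONS)))
y = (X @ W).argmax(axis=1)

# Stage 2: train a Multi-Layer Perceptron on (FAP vector, annotation) pairs.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, y)

# The trained network can then score the emotional content of a newly
# generated expression; class probabilities play the role of emotion strengths.
new_expression = rng.uniform(-1.0, 1.0, size=(1, N_FAPS))
strengths = clf.predict_proba(new_expression)[0]
for emotion, strength in zip(EMOTIONS, strengths):
    print(f"{emotion}: {strength:.3f}")
```

In the reverse direction, the same trained model could be used to select, among candidate FAP configurations, the one whose predicted emotion best matches a desired emotional state, which is how the agent "implements" a target expression.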
Keywords :
"Face","Deformable models","Biological neural networks","Generators","Libraries","Neurons"
Publisher :
ieee
Conference_Titel :
2015 6th International Conference on Information, Intelligence, Systems and Applications (IISA)
Type :
conf
DOI :
10.1109/IISA.2015.7388109
Filename :
7388109