DocumentCode :
3073994
Title :
Creating and Annotating Affect Databases from Face and Body Display: A Contemporary Survey
Author :
Gunes, Hatice ; Piccardi, Massimo
Author_Institution :
Univ. of Technol., Sydney
Volume :
3
fYear :
2006
fDate :
8-11 Oct. 2006
Firstpage :
2426
Lastpage :
2433
Abstract :
Databases containing representative samples of human multi-modal expressive behavior are needed for the development of affect recognition systems. However, at present publicly available databases exist mainly for single expressive modalities such as facial expressions, static and dynamic hand postures, and dynamic hand gestures. Only recently has a first bimodal affect database, consisting of expressive face and upper-body display, been released. To foster development of affect recognition systems, this paper presents a comprehensive survey of the current state of the art in affect database creation from face and body display and elicits the requirements of an ideal multi-modal affect database.
Keywords :
face recognition; gesture recognition; human computer interaction; visual databases; facial expression; human multimodal expressive behavior; image database; Auditory displays; Computer displays; Computer interfaces; Computer science; Emotion recognition; Face recognition; Humans; Speech recognition; Testing; Visual databases;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Systems, Man and Cybernetics, 2006. SMC '06. IEEE International Conference on
Conference_Location :
Taipei
Print_ISBN :
1-4244-0099-6
Electronic_ISBN :
1-4244-0100-3
Type :
conf
DOI :
10.1109/ICSMC.2006.385227
Filename :
4274233