DocumentCode
2597474
Title
Emotional boundaries for choosing modalities according to the intensity of emotion in a linear affect-expression space
Author
Park, Jeong Woo ; Lee, Hui Sung ; Jo, Su Hun ; Kim, Min-gyu ; Chung, Myung Jin
Author_Institution
Sch. of Electr. Eng. & Comput. Sci., KAIST, Daejeon
fYear
2008
fDate
1-3 Aug. 2008
Firstpage
225
Lastpage
230
Abstract
Recently, multimodal expression has become an important issue in the field of HRI. Synchronizing modalities and determining which modalities to use are important aspects of multimodal expression. For example, when robots express emotional states, they may use only facial expressions, or facial expressions combined with gestures, neck motions, sounds, etc. In this paper, emotional boundaries are proposed for multimodal expression in a three-dimensional linear affect-expression space. The simultaneous expression of facial expressions and gestures was demonstrated on a simulator using the proposed emotional boundaries.
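The following is a minimal illustrative sketch (not the authors' implementation) of the general idea: emotion intensity, measured as distance from a neutral point in a 3D affect space, is compared against nested boundary thresholds to decide which expression modalities a robot engages. The axis names, boundary radii, and modality sets below are assumptions for illustration only.

    import math

    # Hypothetical neutral point in a 3D affect space (axis semantics assumed).
    NEUTRAL = (0.0, 0.0, 0.0)

    # Assumed nested boundaries: larger intensity crosses outer boundaries and
    # recruits additional modalities. Actual boundary shapes/values may differ.
    BOUNDARIES = [
        (0.3, ["facial expression"]),
        (0.6, ["facial expression", "neck motion"]),
        (1.0, ["facial expression", "neck motion", "gesture", "sound"]),
    ]

    def intensity(affect_point):
        """Emotion intensity as distance from the neutral point."""
        return math.dist(affect_point, NEUTRAL)

    def select_modalities(affect_point):
        """Return the modality set of the innermost boundary enclosing the point."""
        level = intensity(affect_point)
        for radius, modalities in BOUNDARIES:
            if level <= radius:
                return modalities
        # Beyond the outermost boundary: use every available modality.
        return BOUNDARIES[-1][1]

    if __name__ == "__main__":
        print(select_modalities((0.2, 0.1, 0.0)))  # mild emotion -> face only
        print(select_modalities((0.7, 0.5, 0.3)))  # strong emotion -> all modalities

In this sketch the boundaries are spheres around the neutral point; the paper's boundaries in its linear affect-expression space need not take this form.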
Keywords
emotion recognition; man-machine systems; robots; emotional boundary; human robot interaction; multimodal expression; three-dimensional linear affect-expression space; Computer science; Displays; Emotion recognition; Human robot interaction; Humanoid robots; Intelligent robots; Neck; Orbital robotics; Psychology; Service robots;
fLanguage
English
Publisher
ieee
Conference_Title
The 17th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2008)
Conference_Location
Munich
Print_ISBN
978-1-4244-2212-8
Electronic_ISBN
978-1-4244-2213-5
Type
conf
DOI
10.1109/ROMAN.2008.4600670
Filename
4600670
Link To Document