DocumentCode :
2600625
Title :
Does multimodality really help? The classification of emotion and of On/Off-focus in multimodal dialogues - two case studies.
Author :
Nöth, Elmar ; Hacker, Christian ; Batliner, Anton
Author_Institution :
Univ. Erlangen-Nürnberg, Erlangen
fYear :
2007
fDate :
12-14 Sept. 2007
Firstpage :
9
Lastpage :
16
Abstract :
Articles on monomodal human-machine interaction (HMI) very often point out that results could be strongly improved if other modalities were taken into account. In this contribution we look at two different problems in HMI: the detection of emotion or user state, and the question of whether the user is currently interacting with the machine, with himself, or with another person (On/Off-Focus). We present monomodal classification results for these two problems and discuss whether multimodal classification seems promising for the respective problem. Different fusion models are considered. The examples are taken from the German HMI projects "SmartKom" and "SmartWeb".
Keywords :
human computer interaction; human factors; emotion detection; fusion models; monomodal classification; monomodal human-machine-interaction; multimodality; Cities and towns; Computer hacking; Displays; Graphics; Humans; Labeling; Microphones; Motion pictures; Speech synthesis; Telephony; Multimodal Human Machine Interaction;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
ELMAR, 2007
Conference_Location :
Zadar
ISSN :
1334-2630
Print_ISBN :
978-953-7044-05-3
Type :
conf
DOI :
10.1109/ELMAR.2007.4418790
Filename :
4418790