DocumentCode :
2971398
Title :
Multimodal interaction during multiparty dialogues: initial results
Author :
Cohen, Philip R. ; Coulston, Rachel ; Krout, Kelly
Author_Institution :
Dept. of Comput. Sci. & Eng., Oregon Health & Sci. Univ., Portland, OR, USA
fYear :
2002
fDate :
2002
Firstpage :
448
Lastpage :
453
Abstract :
Groups of people collaborating on a task often incorporate the objects in their shared environment into their discussion. With this comes physical reference to these 3D objects, including gesture, gaze, haptics, and possibly other modalities, over and above the speech we commonly associate with human-human communication. From a technological perspective, this style of communication challenges researchers not only to create multimodal systems capable of integrating input from various modalities, but to do so well enough that the system supports, and does not interfere with, the collaborators' primary goal: their own human-human interaction. This paper offers a first step towards building such multimodal systems for supporting face-to-face collaborative work by providing both qualitative and quantitative analyses of multiparty multimodal dialogues in a field setting.
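The paper itself gives analyses rather than an implementation; purely as an illustrative sketch (not from the authors), the kind of multimodal integration the abstract alludes to can be approximated by pairing a recognized speech segment with the gesture event nearest to it in time. The Event type, its field names, and the 1.5-second window below are assumptions made for illustration only.

# Illustrative sketch only: a toy late-fusion step that pairs a recognized
# speech segment with the gesture event closest to it in time, within a
# fixed window. The Event fields and the 1.5 s window are assumptions,
# not taken from the paper.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Event:
    modality: str   # e.g. "speech" or "gesture"
    start: float    # seconds from session start
    end: float
    content: str    # transcript text or gesture label


def pair_speech_with_gesture(speech: Event,
                             gestures: List[Event],
                             window: float = 1.5) -> Optional[Event]:
    """Return the gesture whose midpoint lies closest to the speech
    segment's midpoint, provided it falls within `window` seconds."""
    speech_mid = (speech.start + speech.end) / 2.0
    best, best_gap = None, window
    for g in gestures:
        gap = abs((g.start + g.end) / 2.0 - speech_mid)
        if gap <= best_gap:
            best, best_gap = g, gap
    return best


if __name__ == "__main__":
    speech = Event("speech", 10.2, 11.0, "put the unit here")
    gestures = [Event("gesture", 9.0, 9.3, "point-at-map"),
                Event("gesture", 10.8, 11.1, "point-at-map")]
    # The second gesture is temporally closer, so it is paired.
    print(pair_speech_with_gesture(speech, gestures))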
Keywords :
gesture recognition; groupware; speech recognition; user interfaces; 3D objects; computer supported collaborative work; face-to-face collaborative work; gaze recognition; gesture recognition; haptic interface; human-human interaction; multimodal interaction; multiparty dialogues; speech recognition; Cameras; Collaboration; Collaborative work; Communications technology; Computer science; Haptic interfaces; Humans; Military computing; Physics computing; Speech;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces (ICMI 2002)
Print_ISBN :
0-7695-1834-6
Type :
conf
DOI :
10.1109/ICMI.2002.1167037
Filename :
1167037