DocumentCode :
387511
Title :
Designing transition networks for multimodal VR-interactions using a markup language
Author :
Latoschik, Marc Erich
Author_Institution :
AI & VR Lab., Bielefeld Univ., Germany
fYear :
2002
fDate :
2002
Firstpage :
411
Lastpage :
416
Abstract :
This article presents one core component for enabling multimodal, speech- and gesture-driven interaction in and for virtual environments. A so-called temporal Augmented Transition Network (tATN) is introduced. It allows information from speech, gesture, and a given application context to be integrated and evaluated using a combined syntactic/semantic parse approach. This tATN represents the target structure for a multimodal integration markup language (MIML). MIML centers on the specification of multimodal interactions: an application designer declares temporal and semantic relations between given input utterance percepts and certain application states in a declarative and portable manner. A subsequent parse pass translates MIML into corresponding tATNs, which are directly loaded and executed by a simulation engine's scripting facility.
Keywords :
gesture recognition; hypermedia markup languages; speech recognition; user interfaces; virtual reality; MIML; XML; gesture-driven interaction; multimodal integration markup language; multimodal interface; semantic parse; simulation engine scripting facility; speech driven interaction; syntactic parse; temporal Augmented Transition Network; virtual environments; virtual reality; Artificial intelligence; Engines; History; Human computer interaction; Markup languages; Mice; Speech analysis; Virtual environment; Virtual reality; XML;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces, 2002
Print_ISBN :
0-7695-1834-6
Type :
conf
DOI :
10.1109/ICMI.2002.1167030
Filename :
1167030