DocumentCode
3340366
Title
An Implementation of Multi-Modal Game Interface Based on PDAs
Author
Lee, Kue-Bum ; Kim, Jung-Hyun ; Hong, Kwang-Seok
Author_Institution
Sungkyunkwan Univ., Seoul
fYear
2007
fDate
20-22 Aug. 2007
Firstpage
759
Lastpage
768
Abstract
In computer animation and interactive computer games, gesture and speech modalities can be a powerful interface between humans and computers. In this paper, we propose a personal digital assistant (PDA)-based multi-modal network game interface using speech, gesture, and touch sensations. To verify the validity of our approach, we implement a multi-modal omok game using TCP/IP on a PDA network. Experiments with the proposed multi-modal network game yielded an average recognition rate of 97.4%; because the weaknesses of uni-modality, such as incorrect command processing caused by recognition errors, are offset by the strengths of the other modalities, the user can enjoy a more interactive mobile game interface in any given environment.
Keywords
computer animation; computer games; gesture recognition; human computer interaction; interactive systems; mobile computing; speech recognition; TCP/IP; gesture modality; human-computer interaction; interactive computer games; interactive mobile game interface; multimodal game interface; multimodal omok game; personal digital assistant; speech modality; Application software; Computer interfaces; Computer vision; Interactive systems; Mice; Personal digital assistants; Power engineering computing; Speech recognition; Virtual environment; Virtual reality
fLanguage
English
Publisher
IEEE
Conference_Titel
5th ACIS International Conference on Software Engineering Research, Management & Applications (SERA 2007)
Conference_Location
Busan
Print_ISBN
0-7695-2867-8
Type
conf
DOI
10.1109/SERA.2007.48
Filename
4297013
Link To Document