DocumentCode :
2825845
Title :
Gesture + Play: Exploring Full-Body Navigation for Virtual Environments
Author :
Tollmar, K. ; Demirdjian, D. ; Darrell, T.
Author_Institution :
MIT Artificial Intelligence Laboratory
Volume :
5
fYear :
2003
fDate :
16-22 June 2003
Firstpage :
47
Lastpage :
47
Abstract :
Navigating virtual environments usually requires a wired interface, game console, or keyboard. The advent of perceptual interface techniques allows a new option: the passive and untethered sensing of users' pose and gesture, allowing them to maneuver through and manipulate virtual worlds. We describe new algorithms for interacting with 3-D environments using real-time articulated body tracking with standard cameras and personal computers. Our method is based on rigid stereo-motion estimation algorithms and uses a linear technique for enforcing articulation constraints. With our tracking system, users can navigate virtual environments using 3-D gestures and body poses. We analyze the space of possible perceptual interface abstractions for full-body navigation and present a prototype system based on these results. Finally, we describe an initial evaluation of our prototype system with users guiding avatars through a series of 3-D virtual game worlds.
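The abstract describes mapping a continuously tracked body pose to navigation commands for an avatar. The following Python snippet is a minimal hypothetical sketch of such a mapping, not the authors' implementation: the TorsoPose fields, threshold values, and command names are illustrative assumptions chosen to show how discrete navigation actions might be derived from tracked pose parameters.

```python
# Hypothetical sketch of pose-to-navigation mapping (not the paper's method).
# TorsoPose fields, thresholds, and command strings are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TorsoPose:
    lean_forward: float   # radians; positive = leaning toward the display
    lean_side: float      # radians; positive = leaning right
    twist: float          # radians; positive = shoulders rotated right

def pose_to_command(pose: TorsoPose,
                    lean_threshold: float = 0.15,
                    twist_threshold: float = 0.20) -> str:
    """Translate a tracked torso pose into a discrete navigation command."""
    if pose.lean_forward > lean_threshold:
        return "move_forward"
    if pose.lean_forward < -lean_threshold:
        return "move_backward"
    if pose.twist > twist_threshold:
        return "turn_right"
    if pose.twist < -twist_threshold:
        return "turn_left"
    if pose.lean_side > lean_threshold:
        return "strafe_right"
    if pose.lean_side < -lean_threshold:
        return "strafe_left"
    return "idle"

# Example: a user leaning forward with a slight shoulder twist.
print(pose_to_command(TorsoPose(lean_forward=0.2, lean_side=0.0, twist=0.05)))
# -> "move_forward"
```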
Keywords :
Artificial intelligence; Cameras; Keyboards; Microcomputers; Navigation; Planets; Real time systems; Sensor systems; Virtual environment; Virtual prototyping;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Computer Vision and Pattern Recognition Workshop, 2003. CVPRW '03. Conference on
Conference_Location :
Madison, Wisconsin, USA
ISSN :
1063-6919
Print_ISBN :
0-7695-1900-8
Type :
conf
DOI :
10.1109/CVPRW.2003.10046
Filename :
4624307