Title :
Context-dependent human-robot interaction using indicating motion via Virtual-City interface
Author :
Sato-Shimokawara, Eri ; Fukusato, Yusuke ; Nakazato, Jun ; Yamaguchi, Toru
Author_Institution :
Grad. Sch. of Syst. Design, Tokyo Metropolitan Univ., Tokyo
Abstract :
This paper presents an interactive system using indicating motion, as used in human communication. A gesture can have different meanings according to the circumstances: humans recognize another person's intention or points of attention from gestures, face direction, the situation, and so on. The authors have researched gesture recognition that takes the situation into account, aiming at natural interaction between humans and robots. Moreover, through such interaction a human can find an object that another person wants. To realize the interactive system, the authors construct a Virtual-City interface. This paper describes context-based gesture interaction using Virtual-City and presents a car-navigation experiment using the system.
Keywords :
gesture recognition; interactive systems; man-machine systems; robot vision; user interfaces; virtual reality; car-navigation; context-based gesture interaction; context-dependent human-robot interaction; gesture motion; human communication; indicating motion; interactive system; virtual-city interface; Ambient intelligence; Associative memory; Cameras; Cities and towns; Collaboration; Face recognition; Human robot interaction; Humanoid robots; Interactive systems; Tracking;
Conference_Titel :
IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2008), IEEE World Congress on Computational Intelligence, 2008
Conference_Location :
Hong Kong
Print_ISBN :
978-1-4244-1818-3
Electronic_ISSN :
1098-7584
DOI :
10.1109/FUZZY.2008.4630632