Title :
Interactively instructing a guide robot through a network
Author :
Hoshi, Yosuke ; Kobayashi, Yoshinori ; Kasuya, Tomoki ; Fueki, Masato ; Kuno, Yoshinori
Author_Institution :
Grad. Sch. of Sci. & Eng., Saitama Univ., Saitama
Abstract :
In this paper, we propose a remote-interactive mode for a museum guide robot. In this mode, a remote operator can interact with the robot through a network using voice and gestures. The operator can instruct the robot what to do with nonverbal behaviors, such as touching an object on the display screen while giving a spoken instruction. For example, the operator can ask the robot, "Bring this brochure to him", while touching first the brochure and then the person on the display. The brochure is detected and tracked using SIFT feature matching between video camera images. After the robot picks up the brochure, it detects and tracks the person and then hands the brochure to the person.
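The abstract does not give implementation details of the object-localization step; the following Python sketch illustrates SIFT feature matching of the kind described, using OpenCV. The image paths, the ratio-test threshold, and the minimum match count are illustrative assumptions, not values from the paper.

    import cv2
    import numpy as np

    # Reference image of the target object (e.g., the brochure) and one camera
    # frame; the file names here are placeholders for illustration only.
    reference = cv2.imread("brochure_reference.png", cv2.IMREAD_GRAYSCALE)
    frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

    # Detect SIFT keypoints and compute descriptors in both images.
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference, None)
    kp_frame, des_frame = sift.detectAndCompute(frame, None)

    # Match descriptors with a brute-force matcher and keep only distinctive
    # correspondences via Lowe's ratio test (0.75 is an assumed threshold).
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des_ref, des_frame, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]

    # With enough surviving matches, estimate a homography to localize the
    # object in the frame; the resulting region can seed a tracker.
    if len(good) >= 10:
        src = np.float32([kp_ref[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp_frame[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        print("Object located; homography estimated.")
    else:
        print("Not enough matches to locate the object.")

Running this per frame (or combining it with a lightweight tracker between detections) would give the kind of detect-and-track behavior the abstract attributes to the robot.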
Keywords :
human-robot interaction; image matching; mobile robots; video cameras; SIFT feature matching; guide robot; museum guide robot; remote-interactive mode; robot person detection; video camera images; Automatic control; Cameras; Control systems; Displays; Face detection; Humans; Natural languages; Robot control; Robot vision systems; Robotics and automation; SIFT feature matching; museum guide robot; nonverbal behavior; remote-interactive;
Conference_Title :
2008 International Conference on Control, Automation and Systems (ICCAS 2008)
Conference_Location :
Seoul
Print_ISBN :
978-89-950038-9-3
Electronic_ISBN :
978-89-93215-01-4
DOI :
10.1109/ICCAS.2008.4694399