DocumentCode :
1590429
Title :
There You Go! - Estimating Pointing Gestures In Monocular Images For Mobile Robot Instruction
Author :
Richarz, J. ; Martin, C. ; Scheidig, A. ; Gross, H.-M.
Author_Institution :
Dept. of Neuroinformatics & Cognitive Robotics, Ilmenau Tech. Univ.
fYear :
2006
Firstpage :
546
Lastpage :
551
Abstract :
In this paper, we present a neural architecture that is capable of estimating a target point from a pointing gesture, thus enabling a user to command a mobile robot to a specific position in his or her local surroundings by means of pointing. In this context, we were especially interested in determining whether it is possible to implement a target point estimator using only monocular images from low-cost webcams. The feature extraction is quite straightforward: we use a Gabor jet to extract a feature vector from the normalized camera images, and a cascade of multi-layer perceptron (MLP) classifiers serves as the estimator. The system was implemented and tested on our mobile robotic assistant HOROS. The results indicate that it is indeed possible to realize a pointing estimator from monocular image data, but further efforts are necessary to improve the accuracy and robustness of our approach.
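The abstract describes a pipeline of Gabor-jet features fed into a cascade of MLP classifiers. The sketch below is purely illustrative and is not the authors' implementation: the grid size, filter bank parameters, the sector/cell label scheme, and the way the second cascade stage consumes the first stage's output are all assumptions made for the example.

```python
"""Illustrative sketch (not the paper's code): Gabor-jet features from a
normalized grayscale image, followed by a two-stage MLP cascade that maps
the feature vector to a discretized target cell. All parameters are assumed."""
import numpy as np
from sklearn.neural_network import MLPClassifier


def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a Gabor filter with the given size and orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * xr / wavelength)
    return envelope * carrier


def gabor_jet(image, grid=8, size=15, wavelengths=(4, 8), n_thetas=4):
    """Sample Gabor responses on a coarse grid and stack them into one vector."""
    h, w = image.shape
    ys = np.linspace(size, h - size - 1, grid).astype(int)
    xs = np.linspace(size, w - size - 1, grid).astype(int)
    feats = []
    for lam in wavelengths:
        for k in range(n_thetas):
            kernel = gabor_kernel(size, lam, k * np.pi / n_thetas, sigma=0.5 * lam)
            for yc in ys:
                for xc in xs:
                    patch = image[yc - size // 2: yc + size // 2 + 1,
                                  xc - size // 2: xc + size // 2 + 1]
                    feats.append(float(np.sum(patch * kernel)))
    return np.asarray(feats)


# Toy usage with synthetic data (labels and image sizes are invented):
rng = np.random.default_rng(0)
images = rng.random((20, 120, 160))      # stand-ins for normalized camera images
X = np.stack([gabor_jet(im) for im in images])
sectors = rng.integers(0, 4, size=20)    # coarse target sectors (assumed scheme)
cells = rng.integers(0, 16, size=20)     # fine target cells (assumed scheme)

# Stage 1 of the cascade: predict a rough sector from the Gabor-jet features.
coarse = MLPClassifier(hidden_layer_sizes=(40,), max_iter=300, random_state=0)
coarse.fit(X, sectors)

# Stage 2: refine within the sector, here by appending the coarse prediction
# to the feature vector (one of several possible ways to chain the stages).
X2 = np.hstack([X, coarse.predict(X).reshape(-1, 1)])
fine = MLPClassifier(hidden_layer_sizes=(40,), max_iter=300, random_state=0)
fine.fit(X2, cells)
```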
Keywords :
feature extraction; gesture recognition; mobile robots; multilayer perceptrons; neural net architecture; feature extraction; Gabor jet; mobile robot instruction; monocular images; multilayer perceptron classifiers; pointing gestures; target point estimator; Cameras; Cognitive robotics; Feature extraction; Intelligent robots; Mobile communication; Mobile robots; Robot kinematics; Robot sensing systems; Robot vision systems; Robustness
fLanguage :
English
Publisher :
ieee
Conference_Titel :
The 15th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2006)
Conference_Location :
Hatfield
Print_ISBN :
1-4244-0564-5
Electronic_ISBN :
1-4244-0565-3
Type :
conf
DOI :
10.1109/ROMAN.2006.314446
Filename :
4107864