Title :
Pantomimic Gestures for Human–Robot Interaction
Author :
Burke, Michael ; Lasenby, Joan
Author_Institution :
Dept. of Eng., Univ. of Cambridge, Cambridge, UK
Abstract :
This paper introduces a pantomimic gesture interface, which classifies human hand gestures using unmanned aerial vehicle (UAV) behavior recordings as training data. We argue that pantomimic gestures are more intuitive than iconic gestures and show that a pantomimic gesture recognition strategy trained on micro-UAV behavior recordings can be more robust than one trained directly on hand gestures. Hand gestures are isolated by applying a maximum information criterion, with features extracted using principal component analysis and compared using a nearest neighbor classifier. These features are biased in that they are better suited to classifying certain behaviors. We show how a Bayesian update step accounting for the geometry of the training features compensates for this bias, yielding fairer classification, and introduce a weighted voting system to aid in sequence labeling.
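The pipeline summarized above (PCA feature extraction over trajectory recordings, followed by nearest-neighbor classification) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the synthetic "circle" and "line" behavior classes, the trajectory length, and the number of retained components are all invented for the example.

```python
import numpy as np

# Hypothetical sketch of the abstract's pipeline: flatten trajectory
# recordings to fixed-length vectors, project onto principal components,
# and classify a query gesture with a 1-nearest-neighbor rule.
rng = np.random.default_rng(0)

# Synthetic "behavior recordings": 20 noisy trajectories per invented class.
circle = rng.normal(0.0, 0.1, (20, 30)) + np.sin(np.linspace(0, 2 * np.pi, 30))
line = rng.normal(0.0, 0.1, (20, 30)) + np.linspace(-1.0, 1.0, 30)
X = np.vstack([circle, line])
labels = np.array(["circle"] * 20 + ["line"] * 20)

# PCA via SVD on mean-centred data; keep the top k components.
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 5
features = (X - mean) @ Vt[:k].T  # training features in PCA space

def classify(trajectory):
    """Project a query trajectory into PCA space and return the label
    of its nearest training neighbor (Euclidean distance)."""
    q = (trajectory - mean) @ Vt[:k].T
    dists = np.linalg.norm(features - q, axis=1)
    return labels[np.argmin(dists)]

# A noisy sine-shaped query should land nearest the "circle" class.
query = rng.normal(0.0, 0.1, 30) + np.sin(np.linspace(0, 2 * np.pi, 30))
print(classify(query))  # prints: circle
```

The paper's Bayesian update step and weighted voting over gesture sequences would sit on top of this distance-based classifier; they are omitted here for brevity.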
Keywords :
Bayes methods; autonomous aerial vehicles; feature extraction; gesture recognition; human–robot interaction; image classification; mobile robots; principal component analysis (PCA); robot vision; Bayesian update step; maximum information criterion; micro-UAV behavior recordings; nearest neighbor classifier; pantomimic gestures; sequence labeling; unmanned aerial vehicle (UAV); weighted voting; loading; time series analysis; trajectory; time series classification
Journal_Title :
IEEE Transactions on Robotics
DOI :
10.1109/TRO.2015.2475956