DocumentCode :
78348
Title :
A Machine Vision-Based Gestural Interface for People With Upper Extremity Physical Impairments
Author :
Hairong Jiang ; Duerstock, Bradley S. ; Wachs, Juan P.
Author_Institution :
Sch. of Ind. Eng., Purdue Univ., West Lafayette, IN, USA
Volume :
44
Issue :
5
fYear :
2014
fDate :
May 2014
Firstpage :
630
Lastpage :
641
Abstract :
A machine vision-based gestural interface was developed to provide individuals with upper extremity physical impairments an alternative way to perform laboratory tasks that require physical manipulation of components. A color- and depth-based 3-D particle filter framework was constructed with unique descriptive features for face and hand representation. This framework was integrated into an interaction model utilizing spatial and motion information to deal efficiently with occlusions and their negative effects. More specifically, the proposed method solves the false-merging and false-labeling problems characteristic of tracking through occlusion. The same feature encoding technique was subsequently used to detect, track, and recognize users' hands. Experimental results demonstrated that the proposed approach was superior to other state-of-the-art tracking algorithms when interaction was present (97.52% accuracy). For gesture encoding, dynamic motion models were created employing the dynamic time warping method. The gestures were classified using a conditional density propagation-based trajectory recognition method. The hand trajectories were classified into different classes (commands) with a recognition accuracy of 95.9%. In addition, the new approach was validated with the “one shot learning” paradigm, yielding results comparable to those reported in 2012. In a validation experiment, the gestures were used to control a mobile service robot and a robotic arm in a laboratory chemistry experiment. Effective control policies were selected to achieve optimal performance for the presented gestural control system through comparison of task completion times between different control modes.
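For illustration, the gesture-encoding step the abstract describes (matching a hand trajectory against per-gesture motion models via dynamic time warping) can be sketched as a nearest-template classifier. This is a minimal, generic DTW sketch, not the authors' actual implementation: the function names (`dtw_distance`, `classify`), the 1-D trajectories, and the absolute-difference cost are all illustrative assumptions; the paper uses richer motion models and a Condensation-based recognizer.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences.

    D[i, j] holds the minimal cumulative cost of aligning a[:i] with b[:j];
    each cell extends the cheapest of the three admissible warping steps.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])  # illustrative local cost
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

def classify(trajectory, templates):
    """Assign a trajectory to the gesture class of its nearest DTW template.

    `templates` maps a class label (command) to a reference trajectory.
    """
    return min(templates, key=lambda label: dtw_distance(trajectory, templates[label]))
```

A hand trajectory that warps onto a template with low cumulative cost is assigned that template's command, which is why DTW tolerates gestures performed at different speeds.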
Keywords :
computer vision; gesture recognition; handicapped aids; image classification; image colour analysis; image motion analysis; image representation; learning (artificial intelligence); object recognition; object tracking; particle filtering (numerical methods); color based 3D particle filter framework; conditional density propagation-based trajectory recognition method; depth based 3D particle filter framework; descriptive features; face representation; false labeling problems; false merging problems; feature encoding technique; gesture classification; hand trajectories; hands representation; interaction model; laboratory chemistry experiment; laboratory tasks; machine vision-based gestural interface; mobile service robot; motion information; occlusion; one shot learning paradigm; people with upper extremity physical impairments; robotic arm; spatial information; user hands detection; user hands recognition; user hands tracking; Extremities; Face; Image color analysis; Robots; Sensors; Tracking; Trajectory; Condensation; dynamic time warping; gesture recognition; one shot learning; particle filter;
fLanguage :
English
Journal_Title :
IEEE Transactions on Systems, Man, and Cybernetics: Systems
Publisher :
IEEE
ISSN :
2168-2216
Type :
jour
DOI :
10.1109/TSMC.2013.2270226
Filename :
6576857