DocumentCode :
720666
Title :
Egocentric articulated pose tracking for action recognition
Author :
Yonemoto, Haruka ; Murasaki, Kazuhiko ; Osawa, Tatsuya ; Sudo, Kyoko ; Shimamura, Jun ; Taniguchi, Yukinobu
Author_Institution :
NTT Media Intell. Labs., NTT Corp., Kanagawa, Japan
fYear :
2015
fDate :
18-22 May 2015
Firstpage :
98
Lastpage :
101
Abstract :
Many studies on action recognition from the third-person viewpoint have shown that articulated human pose directly describes human motion and is invariant to view changes. However, conventional algorithms for estimating articulated human pose cannot handle egocentric images, because they assume the whole figure appears in the image, whereas only a few body parts appear in egocentric images. In this paper, we propose a novel method to estimate human pose for action recognition from egocentric RGB-D images. Our method extracts the pose by integrating hand detection, camera pose estimation, and time-series filtering under a body-shape constraint. Experiments show that joint positions are estimated well when the detection error of hands and arms decreases. We also demonstrate that the skeleton feature improves action recognition accuracy when the action contains unintended view changes.
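Note: the abstract names a pipeline of hand detection, camera pose estimation, and time-series filtering, but the record does not specify the filter used. As a minimal, hypothetical sketch of the time-series filtering step only, the snippet below smooths a single detected 3D joint position with a constant-velocity Kalman filter; the filter choice, noise parameters, and all names are assumptions for illustration, not the authors' method (which additionally applies a body-shape constraint not modeled here).

    import numpy as np

    # Hypothetical illustration: constant-velocity Kalman filter for one
    # 3D joint position. State: [x, y, z, vx, vy, vz]; measurement: a
    # noisy [x, y, z] joint detection (e.g., from a hand detector).

    def make_filter(dt=1.0 / 30.0, q=1e-3, r=1e-2):
        """Build transition (F), observation (H), and noise (Q, R) matrices."""
        F = np.eye(6)
        F[:3, 3:] = dt * np.eye(3)          # position += velocity * dt
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        Q = q * np.eye(6)                   # process noise (assumed isotropic)
        R = r * np.eye(3)                   # measurement noise (assumed)
        return F, H, Q, R

    def kalman_step(x, P, z, F, H, Q, R):
        """One predict/update cycle; z is the detected 3D joint position."""
        # Predict with the constant-velocity motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the new detection.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(6) - K @ H) @ P
        return x, P

    # Smoke test: smooth a noisy, linearly moving joint over 3 seconds.
    F, H, Q, R = make_filter()
    x, P = np.zeros(6), np.eye(6)
    rng = np.random.default_rng(0)
    for t in range(90):
        true_pos = np.array([0.01 * t, 0.0, 0.5])
        z = true_pos + rng.normal(scale=0.02, size=3)
        x, P = kalman_step(x, P, z, F, H, Q, R)
    print("filtered position:", x[:3])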
Keywords :
feature extraction; gesture recognition; image colour analysis; image filtering; image motion analysis; pose estimation; time series; action recognition; articulated human pose estimation; egocentric RGB-D image; egocentric articulated pose tracking; hand detection; human motion; skeleton feature extraction; time-series filtering; Cameras; Estimation; Hidden Markov models; Joints; Three-dimensional displays; Tracking;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Machine Vision Applications (MVA), 2015 14th IAPR International Conference on
Conference_Location :
Tokyo
Type :
conf
DOI :
10.1109/MVA.2015.7153142
Filename :
7153142