DocumentCode
463518
Title
Markerless Monocular Tracking of Articulated Human Motion
Author
Liu, Haiying; Chellappa, Rama
Author_Institution
ObjectVideo Inc., Reston, VA, USA
Volume
1
fYear
2007
fDate
15-20 April 2007
Abstract
This paper presents a method for tracking general 3D articulated human motion using a single camera with unknown calibration data. No markers, special clothes, or devices are assumed to be attached to the subject. In addition, both the camera and the subject are allowed to move freely, so that long-term, view-independent human motion tracking and recognition are possible. We exploit the fact that the anatomical structure of the human body can be approximated by an articulated blob model. The optical flow under scaled orthographic projection is used to relate the spatiotemporal intensity change of the image sequence to the human motion parameters. These motion parameters are obtained by solving a set of linear equations to achieve global optimization. The correctness and robustness of the proposed method are demonstrated on Tai Chi sequences.
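The abstract's core idea, relating spatiotemporal intensity change to motion parameters through a set of linear equations, can be illustrated with a minimal sketch. Assuming a linear motion model (here a simple 2-parameter image translation, standing in for the paper's articulated model), each pixel's brightness-constancy constraint contributes one row to a global least-squares system; the variable names and synthetic data below are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch: brightness-constancy constraints under a linear motion
# model reduce to one global least-squares system A @ theta = b.
# The 2-parameter translation model [tx, ty] is an illustrative stand-in
# for the paper's articulated-blob parameterization.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-pixel spatial gradients (Ix, Iy) for N pixels.
N = 500
Ix = rng.normal(size=N)
Iy = rng.normal(size=N)

# Ground-truth motion parameters theta = [tx, ty].
theta_true = np.array([0.8, -0.3])

# Under pure translation the flow at every pixel is (u, v) = (tx, ty).
u = np.full(N, theta_true[0])
v = np.full(N, theta_true[1])

# Brightness constancy: Ix*u + Iy*v + It = 0  =>  It = -(Ix*u + Iy*v).
It = -(Ix * u + Iy * v)

# One linear constraint per pixel: A[i] = [Ix[i], Iy[i]], b[i] = -It[i].
# Solving the stacked system jointly is the "global optimization" step.
A = np.column_stack([Ix, Iy])
b = -It
theta_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(theta_est, 6))  # recovers [0.8, -0.3] up to float error
```

With noise-free synthetic gradients the least-squares solve recovers the parameters exactly; in the articulated case each body segment contributes its own columns to `A`, but the global linear structure is the same.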
Keywords
cameras; gait analysis; image motion analysis; image recognition; image sequences; optimisation; Tai Chi sequences; anatomical structure; articulated blob model; articulated human motion; global optimization; human motion recognition; linear equations; markerless monocular tracking; optical flow; scaled orthographic projection; single camera; spatial-temporal intensity; biological system modeling; calibration; equations; humans; robustness; kinematics; machine vision; tracking
fLanguage
English
Publisher
ieee
Conference_Titel
2007 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007)
Conference_Location
Honolulu, HI
ISSN
1520-6149
Print_ISBN
1-4244-0727-3
Type
conf
DOI
10.1109/ICASSP.2007.366002
Filename
4217174