  • DocumentCode
    2265154
  • Title
    Monocular 3D human pose estimation using sparse motion features
  • Author
    Daubney, Ben; Gibson, David; Campbell, Neill

  • Author_Institution
    Dept. of Comput. Sci., Univ. of Bristol, Bristol, UK
  • fYear
    2009
  • fDate
    Sept. 27 - Oct. 4, 2009
  • Firstpage
    1050
  • Lastpage
    1057
  • Abstract
    In this paper we demonstrate that the motion of a sparse set of tracked features can be used to extract 3D pose from a single viewpoint. The purpose of this work is to illustrate the wealth of information present in the temporal dimension of a sequence of images that is currently not being exploited. Our approach is entirely dependent upon motion. We use low-level part detectors consisting of 3D motion models, which describe probabilistically how well the observed motion of a tracked feature fits each model. Given these initial detections, a bottom-up approach is employed to find the most likely configuration of a person in each frame. The models used are learnt directly from motion capture data, and no training is performed using descriptors derived from image sequences. As a result, the presented approach can be applied to people moving at arbitrary and previously unseen orientations relative to the camera, making it particularly versatile and robust. We evaluate our approach for both walking and jogging on the HumanEva data set, where we achieve accuracies of 65.8±23.3 mm and 69.4±20.2 mm for these actions, respectively.
  • Keywords
    feature extraction; image sequences; motion estimation; pose estimation; HumanEva data set; image sequences; monocular 3D human pose estimation; sparse motion features; Cameras; Data mining; Detectors; Humans; Image sequences; Legged locomotion; Motion detection; Motion estimation; Robustness; Tracking;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops)
  • Conference_Location
    Kyoto
  • Print_ISBN
    978-1-4244-4442-7
  • Electronic_ISBN
    978-1-4244-4441-0
  • Type
    conf
  • DOI
    10.1109/ICCVW.2009.5457586
  • Filename
    5457586
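
The abstract above describes scoring the observed motion of each tracked feature against a set of learned per-part motion models before a bottom-up assembly step. The following is a minimal, hypothetical Python sketch of that scoring idea only; the Gaussian-over-velocity model, the class and function names, and all parameter values are illustrative assumptions, not the authors' implementation (the paper's models are 3D and learnt from motion capture data).

    # Hypothetical sketch: score a tracked feature's observed motion against
    # per-part motion models and keep the most likely part label.
    import numpy as np

    class PartMotionModel:
        """Toy per-part motion model: a Gaussian over frame-to-frame 2D velocity."""
        def __init__(self, name, mean_velocity, covariance):
            self.name = name
            self.mean = np.asarray(mean_velocity, dtype=float)
            self.cov = np.asarray(covariance, dtype=float)

        def log_likelihood(self, observed_velocity):
            # Log-probability that the observed velocity was generated by this part.
            d = np.asarray(observed_velocity, dtype=float) - self.mean
            return (-0.5 * d @ np.linalg.inv(self.cov) @ d
                    - np.log(2 * np.pi)
                    - 0.5 * np.log(np.linalg.det(self.cov)))

    def detect_part(models, observed_velocity):
        """Return the best-matching part name and all per-model log-likelihoods."""
        scores = {m.name: m.log_likelihood(observed_velocity) for m in models}
        return max(scores, key=scores.get), scores

    if __name__ == "__main__":
        # Toy models: a swinging foot moves a lot per frame, the torso barely moves.
        models = [
            PartMotionModel("foot", mean_velocity=[4.0, 1.0], covariance=[[2.0, 0.0], [0.0, 2.0]]),
            PartMotionModel("torso", mean_velocity=[0.5, 0.1], covariance=[[0.3, 0.0], [0.0, 0.3]]),
        ]
        best, scores = detect_part(models, observed_velocity=[3.5, 0.8])
        print(best, scores)  # the fast-moving feature is most consistent with "foot"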