• DocumentCode
    3267852
  • Title
    Robust object tracking using the particle filtering and level set methods: A comparative experiment

  • Author
    Luo, Cheng ; Cai, Xiongcai ; Zhang, Jian

  • Author_Institution
    Sch. of Comput. Sci. & Eng., Univ. of New South Wales, Sydney, NSW
  • fYear
    2008
  • fDate
    8-10 Oct. 2008
  • Firstpage
    359
  • Lastpage
    364
  • Abstract
    Robust visual tracking has become an important research topic in computer vision. GATE [11], a novel method for robust object tracking, improves tracking in complex environments by combining particle filtering with the level set-based active contour method. GATE builds a spatial prior in the state space from the shape of the tracked object and uses it to filter particles, thereby reshaping and refining the posterior distribution of the particle filter. This paper describes a comparative experiment that applies GATE and a standard particle filter to track objects of interest in complex environments using simple features. Image sequences captured by handheld, stationary and PTZ cameras are used. The experimental results demonstrate that GATE solves the ambiguous-outlier problem of particle filters, coping with heavy background clutter, occlusion, low resolution and image noise, and thus significantly improves particle filtering for object tracking.
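    The gating idea the abstract describes can be illustrated with a minimal bootstrap particle filter in which a shape-based admissibility test zeroes the weights of outlier particles before the measurement update. This is only a hedged sketch of the general mechanism, not the paper's actual GATE algorithm: the circular `gate` function and all parameter values below are hypothetical stand-ins for the level set-derived shape prior.

    ```python
    import numpy as np

    def track_step(particles, weights, observation, gate,
                   obs_std=1.0, motion_std=0.5, rng=None):
        """One bootstrap particle-filter step with a shape-based gate.

        `gate` maps an (N, 2) array of positions to a boolean array that is
        True where the shape prior admits a particle; here it is a
        hypothetical stand-in for a level set contour prior.
        """
        rng = np.random.default_rng(0) if rng is None else rng
        # 1. Predict: propagate particles with a random-walk motion model.
        particles = particles + rng.normal(0.0, motion_std, particles.shape)
        # 2. Gate: zero the weight of particles outside the shape prior,
        #    reshaping the posterior before the measurement update.
        admitted = gate(particles)
        # 3. Update: Gaussian likelihood of the observed position.
        d2 = np.sum((particles - observation) ** 2, axis=1)
        weights = np.where(admitted, np.exp(-0.5 * d2 / obs_std ** 2), 0.0)
        total = weights.sum()
        if total == 0.0:  # every particle gated out: fall back to uniform
            weights = np.full(len(particles), 1.0 / len(particles))
        else:
            weights = weights / total
        # 4. Resample and report the posterior-mean estimate.
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        return particles, particles.mean(axis=0)

    # Demo: particles start near the origin; a circular gate (hypothetical)
    # admits only positions within radius 3 of the true target at (2, 2).
    rng = np.random.default_rng(42)
    particles = rng.normal([0.0, 0.0], 1.0, (500, 2))
    weights = np.full(500, 1.0 / 500)
    gate = lambda p: np.linalg.norm(p - [2.0, 2.0], axis=1) < 3.0
    particles, estimate = track_step(particles, weights,
                                     np.array([2.0, 2.0]), gate, rng=rng)
    ```

    The gating step is what distinguishes this from a plain bootstrap filter: particles that the shape prior rejects contribute nothing to the posterior, so clutter-induced outliers cannot drag the estimate away from the object.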
  • Keywords
    filtering theory; image sequences; target tracking; GATE; PTZ camera; computer vision; image sequences; level set-based active contour method; particle filtering; posterior distribution; robust object tracking; shape information; visual tracking; Active contours; Computer vision; Image sequences; Information filtering; Information filters; Level set; Particle tracking; Robustness; Shape; State-space methods;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Multimedia Signal Processing, 2008 IEEE 10th Workshop on
  • Conference_Location
    Cairns, Qld
  • Print_ISBN
    978-1-4244-2294-4
  • Electronic_ISBN
    978-1-4244-2295-1
  • Type
    conf

  • DOI
    10.1109/MMSP.2008.4665104
  • Filename
    4665104