DocumentCode
3407882
Title
Sequential generation for visual tracking
Author
Lao, Yuanwei ; Zhu, Junda ; Zheng, Yuan F.
Author_Institution
Dept. of Electr. & Comput. Eng., Ohio State Univ., Columbus, OH
fYear
2008
fDate
March 31 2008-April 4 2008
Firstpage
953
Lastpage
956
Abstract
A novel variant of the particle filter is presented, in which new particles are generated sequentially, with the proposal density adapted dynamically according to the likelihood of the particles just generated. The new algorithm captures more nonlinear motion and localizes the moving target more accurately while remaining efficient. Experiments on both synthetic and real-world data verify its effectiveness and demonstrate its superiority over the generic particle filter.
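The abstract's idea of adapting the proposal density from the likelihood of already-generated particles can be illustrated with a minimal sketch. This is not the authors' algorithm; it is a toy 1-D particle filter under assumed choices (Gaussian likelihood, a simple widen-on-low-likelihood adaptation rule, and hypothetical parameter values).

```python
import math
import random

random.seed(0)

def likelihood(particle, observation, obs_std=0.5):
    """Gaussian observation likelihood (illustrative choice, not from the paper)."""
    d = particle - observation
    return math.exp(-0.5 * (d / obs_std) ** 2)

def sequential_generation_step(particles, weights, observation,
                               base_std=0.2, widen=2.0):
    """One update step. Each new particle is drawn from a proposal whose
    spread is adapted using the likelihood of the particle generated just
    before it (a hypothetical adaptation rule for illustration)."""
    n = len(particles)
    new_particles, new_weights = [], []
    std = base_std
    for _ in range(n):
        # Resample a parent according to the current weights.
        parent = random.choices(particles, weights=weights, k=1)[0]
        # Propose the next particle around the parent with the adapted spread.
        x = random.gauss(parent, std)
        w = likelihood(x, observation)
        # Adapt the proposal sequentially: widen the search if the particle
        # just generated had low likelihood, otherwise keep the base spread.
        std = base_std * (widen if w < 0.1 else 1.0)
        new_particles.append(x)
        new_weights.append(w)
    total = sum(new_weights) or 1.0
    return new_particles, [w / total for w in new_weights]

# Toy run: track a stationary target at position 1.0, starting from
# particles spread around 0.
particles = [random.gauss(0.0, 1.0) for _ in range(200)]
weights = [1.0 / 200] * 200
for _ in range(5):
    particles, weights = sequential_generation_step(particles, weights, 1.0)
estimate = sum(p * w for p, w in zip(particles, weights))
print(round(estimate, 2))
```

The key contrast with a generic particle filter is that here the proposal spread for particle *i* depends on the likelihood of particle *i−1*, so the sampler reacts within a single update step rather than only between frames.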
Keywords
image motion analysis; object detection; particle filtering (numerical methods); tracking; generic particle filter; nonlinear motion; sequential particle generation; visual tracking; Bayesian methods; Cameras; Computational complexity; Particle filters; Particle measurements; Particle tracking; Proposals; Sampling methods; Target tracking; Time measurement; Particle filter; product enclosure; tracking;
fLanguage
English
Publisher
ieee
Conference_Titel
Acoustics, Speech and Signal Processing, 2008. ICASSP 2008. IEEE International Conference on
Conference_Location
Las Vegas, NV
ISSN
1520-6149
Print_ISBN
978-1-4244-1483-3
Electronic_ISBN
1520-6149
Type
conf
DOI
10.1109/ICASSP.2008.4517769
Filename
4517769
Link To Document