DocumentCode
3014444
Title
Visual tracking and segmentation using appearance and spatial information of patches
Author
Wang, Junqiu; Yagi, Yasushi
Author_Institution
Inst. of Sci. & Ind. Res., Osaka Univ., Osaka, Japan
fYear
2010
fDate
3-7 May 2010
Firstpage
4553
Lastpage
4558
Abstract
Object tracking and segmentation find a wide range of applications in robotics, but both are difficult against cluttered and dynamic backgrounds. We propose an algorithm in which tracking and segmentation are performed consecutively. Input images are separated into disjoint patches by an efficient oversegmentation algorithm, and the object and its background are each described by a bag of patches. Patches in a new frame are classified by searching for their k nearest neighbors; k-d trees built from the stored patches keep the computational cost of this search low. The target location is coarsely estimated by running the mean-shift algorithm, and based on this estimate the patches are classified again using both appearance and spatial information. This strategy outperforms direct segmentation of patches based on appearance information alone. Experimental results show that the proposed algorithm performs well on difficult sequences with cluttered backgrounds.
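Illustration (not from the paper): the sketch below shows, in rough outline, the patch-classification step described in the abstract. It builds one k-d tree per bag of patch descriptors and labels a new patch by combining a k-nearest-neighbor appearance score with a Gaussian spatial prior around a coarse target-center estimate (assumed to come from mean shift). The descriptor dimensionality, the choice of k, the likelihood and prior forms, and all names are illustrative assumptions; only numpy and scipy are required.

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)

    # Assumed: 16-dimensional appearance descriptors (e.g. color histograms)
    # of patches from earlier frames, split into object and background bags.
    object_bag = rng.normal(loc=0.0, scale=1.0, size=(200, 16))
    background_bag = rng.normal(loc=2.0, scale=1.0, size=(400, 16))

    # One k-d tree per bag keeps the nearest-neighbor search cheap.
    object_tree = cKDTree(object_bag)
    background_tree = cKDTree(background_bag)

    def classify_patch(descriptor, patch_xy, center_xy, k=5, sigma=40.0):
        """Label one patch of the new frame as object (True) or background (False).

        Appearance term: likelihoods derived from the mean distance to the
        k nearest patches in each bag. Spatial term: a Gaussian prior around
        the coarse target center (assumed given by mean shift).
        """
        d_obj, _ = object_tree.query(descriptor, k=k)
        d_bg, _ = background_tree.query(descriptor, k=k)
        like_obj = np.exp(-d_obj.mean())
        like_bg = np.exp(-d_bg.mean())

        dist = np.linalg.norm(np.asarray(patch_xy, float) - np.asarray(center_xy, float))
        prior_obj = np.exp(-0.5 * (dist / sigma) ** 2)
        prior_bg = 1.0 - prior_obj

        return like_obj * prior_obj > like_bg * prior_bg

    # Example: a patch whose appearance resembles the object bag, located
    # close to the (assumed) coarse estimate of the target center.
    descriptor = rng.normal(loc=0.0, scale=1.0, size=16)
    print(classify_patch(descriptor, patch_xy=(125, 90), center_xy=(120, 80)))

In this sketch the spatial prior simply down-weights object hypotheses far from the estimated center; the paper's actual combination of appearance and spatial information may differ.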
Keywords
image segmentation; robot vision; target tracking; trees (mathematics); K-d trees; k nearest neighbors; object tracking; patches; robotics; spatial information; visual segmentation; visual tracking; Classification tree analysis; Computational complexity; Histograms; Nearest neighbor searches; Robotics and automation; Video sequences
fLanguage
English
Publisher
ieee
Conference_Titel
2010 IEEE International Conference on Robotics and Automation (ICRA)
Conference_Location
Anchorage, AK
ISSN
1050-4729
Print_ISBN
978-1-4244-5038-1
Electronic_ISBN
1050-4729
Type
conf
DOI
10.1109/ROBOT.2010.5509303
Filename
5509303
Link To Document