Title :
Task-relevant object detection and tracking
Author :
Yuncheng Li; Jiebo Luo
Author_Institution :
Dept. of Comput. Sci., Univ. of Rochester, Rochester, NY, USA
Abstract :
Within the Learning from Narrated Demonstration (LfD) framework, a key vision component is detecting the task-relevant object for further processing. In this paper, we take advantage of the fact that the task-relevant object is often connected to the supervisor's hand and recast the problem as handheld object detection and tracking. Achieving robust handheld object detection and tracking poses its own challenges, including arbitrary object appearance, viewpoint changes, and non-rigid deformation. We propose a robust vision system that integrates speech information to perform handheld object detection via a conditional random field (CRF), followed by MeanShift-based tracking. Extensive evaluation on five datasets demonstrates the validity and robustness of the proposed system.
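The MeanShift-based tracking step mentioned in the abstract can be illustrated with a short, generic sketch. This is not the authors' implementation: the video path, initial bounding box, and HSV thresholds below are hypothetical placeholders, and the code only shows standard color-histogram mean-shift tracking via OpenCV's cv2.calcBackProject and cv2.meanShift, assuming the handheld object detector has already supplied the initial window.

import cv2
import numpy as np

cap = cv2.VideoCapture("demo.mp4")  # hypothetical narrated-demonstration clip
ok, frame = cap.read()

# Initial window around the handheld object; in the paper this would come from
# the speech-assisted CRF detector, here it is a hard-coded placeholder.
x, y, w, h = 200, 150, 80, 80
track_window = (x, y, w, h)

# Build a hue histogram of the object region, masking out dark / low-saturation pixels.
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv_roi, np.array((0., 60., 32.)), np.array((180., 255., 255.)))
roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    # Shift the window toward the mode of the back-projected color distribution.
    _, track_window = cv2.meanShift(back_proj, track_window, term_crit)
    x, y, w, h = track_window
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("handheld object tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()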
Keywords :
computer vision; object detection; object tracking; random processes; CRF; MeanShift-based tracking; handheld object detection; handheld object tracking; narrated demonstration framework; non-rigid deformation; robust vision system; speech information; task-relevant object; vision component; LfD
Conference_Titel :
2013 20th IEEE International Conference on Image Processing (ICIP)
Conference_Location :
Melbourne, VIC, Australia
DOI :
10.1109/ICIP.2013.6738803