Title :
Hough Forests for Object Detection, Tracking, and Action Recognition
Author :
Gall, Juergen; Yao, Angela; Razavi, Negin; Van Gool, Luc; Lempitsky, Victor
Author_Institution :
Dept. of Inf. Technol. & Electr. Eng., ETH Zurich, Zurich, Switzerland
Abstract :
The paper introduces Hough forests, which are random forests adapted to perform a generalized Hough transform in an efficient way. Compared to previous Hough-based systems such as implicit shape models, Hough forests improve the performance of the generalized Hough transform for object detection on a categorical level. At the same time, their flexibility permits extensions of the Hough transform to new domains such as object tracking and action recognition. Hough forests can be regarded as task-adapted codebooks of local appearance that allow fast supervised training and fast matching at test time. They achieve high detection accuracy because the entries of such codebooks are optimized to cast Hough votes with small variance and because their efficiency permits dense sampling of local image patches or video cuboids during detection. The efficacy of Hough forests for a set of computer vision tasks is validated through experiments on a large set of publicly available benchmark data sets and comparisons with the state of the art.
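To make the voting mechanism summarized above concrete, the following is a minimal Python sketch of how a trained Hough forest could accumulate votes for an object center in a 2D Hough image. The `Leaf` class and `hough_votes` function are illustrative assumptions for this sketch, not the authors' implementation; only the general idea (leaves storing foreground probabilities and center offsets, votes summed in an accumulator) is taken from the paper.

```python
# Minimal sketch of Hough-forest voting for 2D object detection.
# Assumption: the Leaf/forest interface below is hypothetical, not the authors' code.
import numpy as np

class Leaf:
    """A trained leaf stores the fraction of foreground patches that reached it
    and the offsets from those patches to the object center."""
    def __init__(self, fg_prob, offsets):
        self.fg_prob = fg_prob    # P(object | patch reached this leaf)
        self.offsets = offsets    # list of (dy, dx) displacements to the center

def hough_votes(leaves_per_patch, patch_positions, image_shape):
    """Accumulate object-center votes into a Hough image.

    leaves_per_patch : for each sampled patch, the leaves it reached
                       (one leaf per tree in the forest).
    patch_positions  : (y, x) position of each sampled patch.
    """
    hough = np.zeros(image_shape, dtype=np.float64)
    for leaves, (py, px) in zip(leaves_per_patch, patch_positions):
        for leaf in leaves:
            if not leaf.offsets:
                continue
            # Each stored offset casts one vote; the weights are chosen so a
            # leaf's total contribution equals its foreground probability,
            # averaged over the trees of the forest.
            w = leaf.fg_prob / (len(leaf.offsets) * len(leaves))
            for dy, dx in leaf.offsets:
                y, x = py + dy, px + dx
                if 0 <= y < image_shape[0] and 0 <= x < image_shape[1]:
                    hough[y, x] += w
    return hough  # local maxima correspond to detection hypotheses
```

In practice the Hough image would be smoothed before non-maximum suppression, and patches would be sampled densely, which the efficiency of the forest lookup makes feasible.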
Keywords :
Hough transforms; Hough forests; Hough votes; Hough-based systems; generalized Hough transform; implicit shape models; random forests; object detection; object tracking; action recognition; gesture recognition; computer vision; image matching; image motion analysis; learning (artificial intelligence); supervised training; task-adapted codebooks; feature extraction; dense sampling; detection accuracy; fast matching; local image patches; video cuboids; benchmark data sets
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
DOI :
10.1109/TPAMI.2011.70