Title :
AdaTree: Boosting a Weak Classifier into a Decision Tree
Author :
Grossmann, Etienne
Author_Institution :
University of Kentucky, Lexington
Abstract :
We present a boosting method that results in a decision tree rather than a fixed linear sequence of classifiers. An equally correct statement is that we present a tree-growing method whose performance can be analysed in the framework of AdaBoost. We argue that AdaBoost can be improved by presenting the input to a sequence of weak classifiers, each one tuned to the conditional probability distribution determined by the outputs of the previous weak classifiers. As a result, the final classifier has a tree structure rather than a linear one, hence the name "AdaTree". One consequence of the tree structure is that different inputs may have different processing times. Early experiments show a reduced computation cost with respect to AdaBoost. One of our intended applications is real-time detection, where cascades of boosted detectors have recently become successful. The reduced computation cost of the proposed method shows some potential for use directly in detection problems, without the need for a cascade.
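The abstract's core idea can be illustrated with a minimal sketch: grow a tree in which each node holds a weak classifier (here a decision stump) trained only on the samples that the outputs of earlier stumps routed to that node, so each stump is tuned to the conditional distribution of its branch. This is an illustrative approximation under assumed details, not the paper's actual algorithm; the names `train_stump` and `grow`, the toy data, and the unweighted error criterion are all hypothetical simplifications (the paper's AdaBoost-style weighting is omitted).

```python
import numpy as np

def train_stump(X, y):
    """Pick the (feature, threshold, polarity) stump with lowest error."""
    best = (0, 0.0, 1, 1.0)  # feature, threshold, polarity, error
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = np.mean(pred != y)
                if err < best[3]:
                    best = (j, t, pol, err)
    return best[:3]

def stump_predict(stump, x):
    j, t, pol = stump
    return 1 if pol * (x[j] - t) >= 0 else -1

def grow(X, y, depth):
    """Grow the tree: one stump per node, each child trained on the
    conditional subset selected by this stump's output (+1 or -1)."""
    if depth == 0 or len(set(y)) == 1:
        return {"leaf": 1 if y.mean() >= 0 else -1}
    stump = train_stump(X, y)
    out = np.array([stump_predict(stump, x) for x in X])
    node = {"stump": stump, "children": {}}
    for branch in (1, -1):
        mask = out == branch
        node["children"][branch] = (
            grow(X[mask], y[mask], depth - 1) if mask.any()
            else {"leaf": branch}
        )
    return node

def classify(node, x):
    # Different inputs may take different paths, hence varying cost.
    while "leaf" not in node:
        node = node["children"][stump_predict(node["stump"], x)]
    return node["leaf"]

# Toy 1-D problem that no single stump classifies perfectly,
# but a depth-2 tree of conditionally trained stumps does.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, -1])
tree = grow(X, y, depth=2)
preds = [classify(tree, x) for x in X]  # → [-1, -1, 1, -1]
```

Note how the second-level stump sees only the samples its parent routed toward the positive branch, which is the "tuned to the conditional probability" aspect the abstract describes.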
Keywords :
Boosting; Classification tree analysis; Computer vision; Decision trees; Pattern recognition;
Conference_Titel :
Conference on Computer Vision and Pattern Recognition Workshop (CVPRW '04), 2004
DOI :
10.1109/CVPR.2004.22