DocumentCode :
1728951
Title :
Information Forests
Author :
Yi, Zhao ; Soatto, Stefano ; Dewan, Maneesh ; Zhan, Yiqiang
Author_Institution :
Univ. of California, Los Angeles, CA, USA
fYear :
2012
Firstpage :
143
Lastpage :
146
Abstract :
We describe Information Forests, an approach to classification that generalizes Random Forests by replacing the splitting criterion at non-leaf nodes: instead of a discriminative criterion based on the entropy of the label distribution, we use a generative one based on maximizing the information divergence between the class-conditional distributions in the resulting partitions. The basic idea is to defer classification until a measure of "classification confidence" is sufficiently high, and in the meantime to break down the data so as to maximize this measure. In an alternative interpretation, Information Forests attempt to partition the data into subsets that are "as informative as possible" for the task at hand, which is to classify the data. Classification confidence, or the informative content of the subsets, is quantified by the information divergence. Our approach relates to active learning, semi-supervised learning, and mixed generative/discriminative learning.
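The abstract's generative splitting criterion can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes scalar features, binary labels, and Gaussian class-conditional models (so the KL divergence has a closed form), and all function names are hypothetical. A candidate split is scored by the size-weighted symmetric KL divergence between the two class-conditional distributions inside each resulting partition, rather than by the entropy of the label distribution.

```python
import math

def gaussian_kl(mu0, var0, mu1, var1):
    # Closed-form KL(N(mu0, var0) || N(mu1, var1)) for 1-D Gaussians.
    return 0.5 * (math.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def fit_gaussian(xs):
    # Maximum-likelihood mean/variance, lightly regularized to avoid var = 0.
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n + 1e-9
    return mu, var

def class_divergence(xs, ys):
    # Symmetric KL between the two class-conditional Gaussians in a subset.
    x0 = [x for x, y in zip(xs, ys) if y == 0]
    x1 = [x for x, y in zip(xs, ys) if y == 1]
    if not x0 or not x1:
        return 0.0  # only one class present: nothing left to separate
    m0, v0 = fit_gaussian(x0)
    m1, v1 = fit_gaussian(x1)
    return gaussian_kl(m0, v0, m1, v1) + gaussian_kl(m1, v1, m0, v0)

def divergence_split_score(xs, ys, threshold):
    # Generative criterion: weighted class-conditional divergence
    # accumulated over the two partitions induced by the threshold.
    left = [(x, y) for x, y in zip(xs, ys) if x <= threshold]
    right = [(x, y) for x, y in zip(xs, ys) if x > threshold]
    n = len(xs)
    score = 0.0
    for part in (left, right):
        if part:
            px = [x for x, _ in part]
            py = [y for _, y in part]
            score += (len(part) / n) * class_divergence(px, py)
    return score
```

A split that leaves the classes well separated inside each child yields a high score, matching the abstract's notion of subsets that are "as informative as possible" for classification; a child containing a single class contributes zero, since classification there is already confident.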
Keywords :
entropy; image classification; active learning; class-conditional distributions; classification confidence; information divergence; information forests; informative content; label distribution; random forests; resulting partitions; semi-supervised learning; splitting criterion; Approximation methods; Decision trees; Entropy; Indexes; Silicon; Training; Vegetation;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Information Theory and Applications Workshop (ITA), 2012
Conference_Location :
San Diego, CA
Print_ISBN :
978-1-4673-1473-2
Type :
conf
DOI :
10.1109/ITA.2012.6181810
Filename :
6181810