DocumentCode :
1797614
Title :
Towards generating random forests via extremely randomized trees
Author :
Le Zhang ; Ye Ren ; Suganthan, P.
Author_Institution :
Electr. & Electron. Eng., Nanyang Technol. Univ., Singapore, Singapore
fYear :
2014
fDate :
6-11 July 2014
Firstpage :
2645
Lastpage :
2652
Abstract :
The classification error of a classifier can be decomposed into bias and variance. Decision-tree-based classifiers have very low bias but extremely high variance. Ensemble methods such as bagging can significantly reduce the variance of such unstable classifiers and thus yield an ensemble classifier with promising generalization performance. In this paper, we compare different tree-induction strategies within a uniform ensemble framework. Results on several public datasets show that random partitioning (of the cut-point for univariate decision trees, or of both the coefficients and the cut-point for multivariate decision trees) without exhaustive search at each node of a decision tree can yield better performance with lower computational complexity.
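The contrast the abstract draws can be illustrated with a minimal sketch of the two node-splitting strategies. This is not the authors' code; the function names and the Gini criterion for the exhaustive variant are illustrative assumptions, shown for a univariate (single cut-point) split.

```python
import random

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return 1.0 - sum((k / n) ** 2 for k in counts.values())

def random_split(X, y, rng):
    """Extremely randomized split: draw a feature and a cut-point
    uniformly at random, with no search over candidate thresholds."""
    f = rng.randrange(len(X[0]))
    lo = min(row[f] for row in X)
    hi = max(row[f] for row in X)
    return f, rng.uniform(lo, hi)

def exhaustive_split(X, y):
    """CART-style split: evaluate every candidate cut-point on every
    feature and keep the one with the lowest weighted Gini impurity."""
    best = None
    for f in range(len(X[0])):
        values = sorted(set(row[f] for row in X))
        for a, b in zip(values, values[1:]):
            t = (a + b) / 2.0
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best[1], best[2]
```

The random variant costs O(1) per node instead of scanning every candidate threshold; averaging many such high-variance trees in an ensemble is what recovers (and, per the paper's experiments, can exceed) the accuracy of exhaustively grown trees.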
Keywords :
computational complexity; decision trees; pattern classification; random processes; classification error; ensemble classifier; extremely randomized trees; generalized performance; multivariate decision tree; public datasets; random forest generation; random partition; tree-induction strategies; univariate decision tree; Accuracy; Bagging; Decision trees; Noise; Radio frequency; Training; Vegetation;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2014 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Beijing
Print_ISBN :
978-1-4799-6627-1
Type :
conf
DOI :
10.1109/IJCNN.2014.6889537
Filename :
6889537