DocumentCode :
1547815
Title :
Omnivariate decision trees
Author :
Yildiz, Olcay Taner ; Alpaydin, Ethem
Author_Institution :
Dept. of Comput. Eng., Bogazici Univ., Istanbul, Turkey
Volume :
12
Issue :
6
fYear :
2001
fDate :
1 November 2001
Firstpage :
1539
Lastpage :
1546
Abstract :
A univariate decision tree considers the value of only one feature at each decision node, leading to axis-aligned splits. In a linear multivariate decision tree, each decision node divides the input space into two with a hyperplane. In a nonlinear multivariate tree, a multilayer perceptron at each node divides the input space arbitrarily, at the expense of increased complexity and a higher risk of overfitting. We propose omnivariate trees, in which a decision node may be univariate, linear, or nonlinear depending on the outcome of comparative statistical tests on accuracy, thus automatically matching the complexity of the node to the subproblem defined by the data reaching that node. Such an architecture frees the designer from choosing the appropriate node type, performing model selection automatically at each node. Our simulation results indicate that this decision tree induction method generalizes better than trees with the same type of node everywhere and induces small trees.
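The sketch below illustrates the kind of per-node model selection the abstract describes: at a given node, univariate, linear, and nonlinear candidates are trained, their cross-validated accuracies are compared, and the simplest candidate that is not significantly worse than the best one is kept. The helper name choose_node_model, the candidate learners, and the paired t-test over shared cross-validation folds are assumptions for illustration, not the paper's exact procedure or test.

# Hedged sketch of omnivariate node selection (not the authors' implementation).
# Simpler candidates are preferred unless the best candidate is significantly better.
import numpy as np
from scipy.stats import ttest_rel
from sklearn.tree import DecisionTreeClassifier        # univariate (axis-aligned) stump
from sklearn.linear_model import LogisticRegression    # linear multivariate split
from sklearn.neural_network import MLPClassifier       # nonlinear multivariate split
from sklearn.model_selection import StratifiedKFold, cross_val_score

def choose_node_model(X, y, alpha=0.05, seed=0):
    """Return (name, fitted_model) for the simplest candidate whose
    cross-validated accuracy is not significantly below the best one's."""
    candidates = [
        ("univariate", DecisionTreeClassifier(max_depth=1, random_state=seed)),
        ("linear",     LogisticRegression(max_iter=1000)),
        ("nonlinear",  MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                     random_state=seed)),
    ]
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    scores = {name: cross_val_score(model, X, y, cv=cv) for name, model in candidates}
    best = max(scores, key=lambda name: scores[name].mean())
    for name, model in candidates:                      # ordered simplest -> most complex
        if name == best:
            return name, model.fit(X, y)
        # Keep the simpler model unless the best one is significantly better.
        _, p = ttest_rel(scores[best], scores[name])
        if p >= alpha:
            return name, model.fit(X, y)
    return best, dict(candidates)[best].fit(X, y)

In a full tree-growing loop, this selection would be repeated recursively on the data reaching each new node, so complex nodes appear only where the local subproblem warrants them.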
Keywords :
decision trees; learning (artificial intelligence); multilayer perceptrons; optimisation; complexity; decision node; learning; multilayer perceptron; multivariate tree; neural trees; omnivariate decision trees; statistical tests; univariate decision trees; Automatic testing; Decision trees; Labeling; Multilayer perceptrons; Shape
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.963795
Filename :
963795