Title :
A two step algorithm for designing small neural network trees
Author :
Takeda, Takaham ; Zhao, Qiangfu
Author_Institution :
Aizu Univ., Aizuwakamatsu, Japan
Abstract :
There are two main approaches to machine learning: the symbolic approach and the sub-symbolic approach. The decision tree (DT) is a typical model for symbolic learning, and the neural network (NN) is a popular model for sub-symbolic learning. A neural network tree (NNTree) is a DT in which each non-terminal node is an expert NN; it is a learning model that may combine the advantages of both DTs and NNs. Through experiments we found that the size of an NNTree is usually proportional to the number of training data. Thus, small trees can be produced by using only part of the training data. In most cases, however, this decreases the performance of the tree. In this paper, we propose a two-step algorithm for producing small NNTrees: the first step obtains a small NNTree using partial data, and the second step improves its performance through retraining. The effectiveness of this algorithm is verified through experiments with public databases.
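The two steps described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the single-neuron `NodeNN`, the branch-retargeting rule in `retrain`, and the toy grid dataset are all assumptions made for the example.

```python
import math

class NodeNN:
    """Single-neuron 'expert NN' used as the split test at a non-terminal
    node (a hypothetical stand-in for the paper's node networks)."""
    def __init__(self, dim):
        self.w = [0.0] * dim
        self.b = 0.0

    def output(self, x):
        z = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))

    def train(self, pairs, epochs=200, lr=0.5):
        # pairs: list of (x, target) with target in {0.0, 1.0}
        for _ in range(epochs):
            for x, t in pairs:
                g = self.output(x) - t  # gradient of the log loss
                self.w = [wi - lr * g * xi for wi, xi in zip(self.w, x)]
                self.b -= lr * g

def majority(data):
    counts = {}
    for _, label in data:
        counts[label] = counts.get(label, 0) + 1
    return max(counts, key=counts.get)

def build_tree(data, min_size=4):
    """Step 1: induce a small NNTree from (partial) training data."""
    if len({lab for _, lab in data}) == 1 or len(data) <= min_size:
        return {"leaf": majority(data)}
    nn = NodeNN(len(data[0][0]))
    nn.train([(x, float(lab)) for x, lab in data])  # binary labels as targets
    left = [(x, lab) for x, lab in data if nn.output(x) < 0.5]
    right = [(x, lab) for x, lab in data if nn.output(x) >= 0.5]
    if not left or not right:  # degenerate split: stop here
        return {"leaf": majority(data)}
    return {"nn": nn, "left": build_tree(left, min_size),
            "right": build_tree(right, min_size)}

def predict(tree, x):
    while "leaf" not in tree:
        tree = tree["right"] if tree["nn"].output(x) >= 0.5 else tree["left"]
    return tree["leaf"]

def retrain(tree, data):
    """Step 2 (assumed form): keep the tree structure fixed and retrain each
    node NN on the full data reaching it. The target is the branch whose
    subtree classifies the sample correctly (current routing on ties)."""
    if "leaf" in tree:
        return
    targets = []
    for x, lab in data:
        ok_l = predict(tree["left"], x) == lab
        ok_r = predict(tree["right"], x) == lab
        if ok_l != ok_r:
            targets.append((x, 1.0 if ok_r else 0.0))
        else:
            targets.append((x, 1.0 if tree["nn"].output(x) >= 0.5 else 0.0))
    tree["nn"].train(targets)
    retrain(tree["left"], [(x, lab) for x, lab in data if tree["nn"].output(x) < 0.5])
    retrain(tree["right"], [(x, lab) for x, lab in data if tree["nn"].output(x) >= 0.5])

# Demo: a linearly separable 2-D grid; class 1 lies above the line x0 + x1 = 1.05.
full = [((i / 10.0, j / 10.0), 1 if i + j > 10 else 0)
        for i in range(11) for j in range(11)]
partial = full[::3]            # step 1 uses only a third of the training data
tree = build_tree(partial)     # small tree from partial data
retrain(tree, full)            # step 2: retrain node NNs on the full set
accuracy = sum(predict(tree, x) == lab for x, lab in full) / len(full)
```

Because each split strictly partitions the data, the recursion in `build_tree` always terminates, and the retraining step changes only the node weights, so the tree stays small.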
Keywords :
decision trees; learning (artificial intelligence); neural nets; decision tree; machine learning; neural network trees; partial training data; two step algorithm; Algorithm design and analysis; Concrete; Decision trees; Machine learning; Machine learning algorithms; Neural networks; Partitioning algorithms; Process design; Spatial databases; Training data;
Conference_Titel :
Proceedings of the 2003 International Conference on Neural Networks and Signal Processing
Conference_Location :
Nanjing
Print_ISBN :
0-7803-7702-8
DOI :
10.1109/ICNNSP.2003.1279324