Title :
Growing and pruning neural tree networks
Author :
Sankar, A.; Mammone, Richard J.
Author_Institution :
AT&T Bell Labs., Murray Hill, NJ, USA
Date :
3/1/1993
Abstract :
A pattern classification method called the neural tree network (NTN) is presented. The NTN consists of neural networks connected in a tree architecture. The neural networks are used to recursively partition the feature space into subregions. Each terminal subregion is assigned a class label that depends on the training data routed to it by the neural networks. The NTN is grown by a learning algorithm, in contrast to multilayer perceptrons (MLPs), whose architecture must be specified before learning can begin. A heuristic learning algorithm based on minimizing the L1 norm of the error is used to grow the NTN. This method is shown to give better performance, in terms of minimizing the number of classification errors, than the squared-error minimization used in backpropagation. An optimal pruning algorithm is given to enhance the generalization of the NTN. Simulation results are presented on Boolean function learning tasks and a speaker-independent vowel recognition task. The NTN compares favorably to both neural networks and decision trees.
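
For illustration only, the Python sketch below shows the general grow-and-route idea described in the abstract: each internal node of the tree holds a small neural classifier that partitions the data reaching it, and terminal subregions take the class label of the training data routed to them. The single-perceptron node classifier, the depth limit, and the AND-gate demo are assumptions made for brevity; the paper's L1-norm-based growing heuristic and its optimal pruning algorithm are not reproduced here.

# Minimal sketch, not the authors' algorithm: each internal node trains a single
# perceptron (an assumed stand-in for the paper's L1-norm-based node classifier)
# and routes patterns left or right; growth stops when a subregion is pure.
import numpy as np

class NTNNode:
    def __init__(self):
        self.w = None       # perceptron weights (internal nodes only)
        self.left = None    # subtree for patterns with w . x <= 0
        self.right = None   # subtree for patterns with w . x > 0
        self.label = None   # class label (terminal subregions only)

def train_perceptron(X, y, epochs=100, lr=0.1):
    # Fit one perceptron splitting the first pattern's class from the rest.
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias input
    t = np.where(y == y[0], -1.0, 1.0)          # map labels to -1 / +1 targets
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, ti in zip(Xb, t):
            if ti * (w @ xi) <= 0:              # misclassified: perceptron update
                w += lr * ti * xi
    return w

def grow(X, y, depth=0, max_depth=8):
    # Recursively partition the feature space, one perceptron per node.
    node = NTNNode()
    classes, counts = np.unique(y, return_counts=True)
    if len(classes) == 1 or depth >= max_depth:
        node.label = classes[np.argmax(counts)]  # pure or depth-limited: leaf
        return node
    node.w = train_perceptron(X, y)
    go_right = np.hstack([X, np.ones((len(X), 1))]) @ node.w > 0
    if go_right.all() or not go_right.any():     # degenerate split: make a leaf
        node.label = classes[np.argmax(counts)]
        node.w = None
        return node
    node.left = grow(X[~go_right], y[~go_right], depth + 1, max_depth)
    node.right = grow(X[go_right], y[go_right], depth + 1, max_depth)
    return node

def classify(node, x):
    # Route a pattern down the tree until a labelled terminal subregion is reached.
    while node.label is None:
        node = node.right if np.append(x, 1.0) @ node.w > 0 else node.left
    return int(node.label)

# Toy usage on the Boolean AND function (linearly separable, so the grown tree
# is one root perceptron with two terminal subregions); prints [0, 0, 0, 1].
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
tree = grow(X, y)
print([classify(tree, x) for x in X])

On a task that is not linearly separable, the same growing procedure would recurse further, with deeper nodes refining the partition of their parent's subregion.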
Keywords :
learning (artificial intelligence); pattern recognition; self-organising feature maps; trees (mathematics); class label; classification errors; feature space; function learning tasks; learning algorithm; neural tree networks; optimal pruning algorithm; pattern classification method; speaker independent vowel recognition; Backpropagation algorithms; Boolean functions; Heuristic algorithms; Minimization methods; Multilayer perceptrons; Neural networks; Partitioning algorithms; Pattern classification; Speech recognition; Training data;
Journal_Title :
IEEE Transactions on Computers