DocumentCode
883148
Title
Growing and pruning neural tree networks
Author
Sakar, A.; Mammone, Richard J.
Author_Institution
AT&T Bell Labs., Murray Hill, NJ, USA
Volume
42
Issue
3
fYear
1993
fDate
March 1993
Firstpage
291
Lastpage
299
Abstract
A pattern classification method called neural tree networks (NTNs) is presented. The NTN consists of neural networks connected in a tree architecture. The neural networks are used to recursively partition the feature space into subregions. Each terminal subregion is assigned a class label that depends on the training data routed to it by the neural networks. The NTN is grown by a learning algorithm, as opposed to multilayer perceptrons (MLPs), whose architecture must be specified before learning can begin. A heuristic learning algorithm based on minimizing the L1 norm of the error is used to grow the NTN. This method is shown to yield fewer classification errors than the squared-error minimization used in backpropagation. An optimal pruning algorithm is given to enhance the generalization of the NTN. Simulation results are presented on Boolean function learning tasks and a speaker-independent vowel recognition task. The NTN compares favorably to both neural networks and decision trees.
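As a rough illustration of the tree-of-networks idea described above (not the authors' algorithm: the L1-norm growing heuristic and the optimal pruning step are omitted), the following minimal Python sketch builds a binary tree in which each internal node is a single perceptron that splits the samples routed to it, and each leaf stores the majority class. The names NTNode and fit_perceptron are illustrative, not from the paper.

import numpy as np

def fit_perceptron(X, y, epochs=50, lr=0.1):
    # Train a single perceptron (weights w, bias b) on binary targets y in {0, 1}.
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

class NTNode:
    # One tree node: either a leaf (majority label) or a perceptron split whose
    # two children receive the samples routed to each side of the hyperplane.
    def __init__(self, X, y, depth=0, max_depth=4, min_samples=2):
        classes, counts = np.unique(y, return_counts=True)
        self.label = classes[np.argmax(counts)]
        self.w = self.b = self.left = self.right = None
        if len(classes) == 1 or depth >= max_depth or len(y) < min_samples:
            return                                   # pure or too small: stay a leaf
        self.w, self.b = fit_perceptron(X, (y == classes[0]).astype(int))
        mask = X @ self.w + self.b > 0               # route each sample by this node's net
        if mask.all() or (~mask).all():              # degenerate split: stay a leaf
            self.w = None
            return
        self.left = NTNode(X[mask], y[mask], depth + 1, max_depth, min_samples)
        self.right = NTNode(X[~mask], y[~mask], depth + 1, max_depth, min_samples)

    def predict_one(self, x):
        if self.w is None:
            return self.label
        child = self.left if x @ self.w + self.b > 0 else self.right
        return child.predict_one(x)

# Usage on XOR, a Boolean function a single perceptron cannot represent:
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])
tree = NTNode(X, y)
print([int(tree.predict_one(x)) for x in X])         # expected: [0, 1, 1, 0]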
Keywords
learning (artificial intelligence); pattern recognition; self-organising feature maps; trees (mathematics); class label; classification errors; feature space; function learning tasks; learning algorithm; neural tree networks; optimal pruning algorithm; pattern classification method; speaker independent vowel recognition; Backpropagation algorithms; Boolean functions; Heuristic algorithms; Minimization methods; Multilayer perceptrons; Neural networks; Partitioning algorithms; Pattern classification; Speech recognition; Training data
fLanguage
English
Journal_Title
IEEE Transactions on Computers
Publisher
IEEE
ISSN
0018-9340
Type
Journal
DOI
10.1109/12.210172
Filename
210172