Title :
Synthesizing neural networks by sequential addition of hidden nodes
Author :
Li, Qi ; Tufts, Donald W.
Author_Institution :
Dept. of Electr. Eng., Rhode Island Univ., Kingston, RI, USA
Date :
27 Jun-2 Jul 1994
Abstract :
Presents a fast training algorithm with which one can sequentially determine the needed hidden nodes and the values of the associated weights for classification and pattern recognition. This new approach addresses problems in backpropagation and other gradient-descent training algorithms, including long training times and the determination of the proper number of hidden nodes. We mitigate these difficulties by sequentially extracting important attributes of the training data while training each hidden node. The proposed algorithm separates network training into the training of each layer of the network. The input layer is designed to partition the input data space using linear discriminant functions. Training starts from one hidden node. By applying a linear discriminant algorithm, a separable subset of the data is deleted from the training set, and the remaining data are carried over to the training of the next hidden node. A hidden node is added to the network only when it is needed, that is, when classification performance on the training set is not yet good enough. Thus the training data set is reduced sequentially as training progresses. Each node of the output layer performs a logic function of the binary outputs of the hidden nodes, and the training algorithm for the output layer is the same as Boolean minimization.
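The abstract describes the constructive procedure only in prose, so the following minimal Python sketch is included to make the control flow concrete. It assumes Fisher's linear discriminant as the per-node "linear discriminant algorithm", a simple purity test for deciding which subset of the training data a node has cleanly separated, and a lookup table over the binary hidden-node outputs as a stand-in for the paper's Boolean-minimization output layer. All function names and parameters (fisher_direction, train_hidden_layer, max_nodes, target_error) are illustrative assumptions, not the authors' implementation.

import numpy as np

def fisher_direction(X, y):
    # Fisher linear discriminant for a two-class problem; an assumed
    # stand-in for the paper's unspecified linear discriminant algorithm.
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False, bias=True) +
          np.cov(X1, rowvar=False, bias=True) +
          1e-6 * np.eye(X.shape[1]))           # small ridge for numerical stability
    w = np.linalg.solve(Sw, m1 - m0)
    b = -0.5 * (w @ m0 + w @ m1)               # threshold midway between projected class means
    return w, b

def hidden_outputs(X, nodes):
    # Binary output of every hidden node for every sample.
    return np.column_stack([(X @ w + b > 0).astype(int) for w, b in nodes])

def training_error(X, y, nodes):
    # Output layer as a lookup table over the binary hidden codes -- a crude
    # stand-in for the Boolean-minimization step described in the paper.
    H = hidden_outputs(X, nodes)
    codes = [tuple(row) for row in H]
    table = {}
    for c, label in zip(codes, y):
        table.setdefault(c, []).append(label)
    table = {c: max(set(v), key=v.count) for c, v in table.items()}
    pred = np.array([table[c] for c in codes])
    return float(np.mean(pred != y))

def train_hidden_layer(X, y, max_nodes=10, target_error=0.0):
    # Add hidden nodes one at a time; after each node, delete the subset of
    # training data it separates cleanly and continue on the remainder.
    nodes, Xr, yr = [], X.copy(), y.copy()
    for _ in range(max_nodes):
        if len(np.unique(yr)) < 2:             # nothing left to separate
            break
        w, b = fisher_direction(Xr, yr)
        nodes.append((w, b))
        side = Xr @ w + b > 0
        for s in (True, False):                # drop any side that is purely one class
            mask = side == s
            if mask.any() and len(np.unique(yr[mask])) == 1:
                Xr, yr, side = Xr[~mask], yr[~mask], side[~mask]
        if training_error(X, y, nodes) <= target_error:
            break
    return nodes

For example, calling nodes = train_hidden_layer(X, y) on a two-class data set adds one discriminant node at a time, deleting each cleanly separated subset, until the lookup-table classifier built from the binary hidden codes reaches the target training error or max_nodes is exhausted.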
Keywords :
backpropagation; network synthesis; neural nets; pattern classification; Boolean minimization; backpropagation; binary outputs; classification performance; fast training algorithm; gradient descent; hidden nodes; input data space partitioning; linear discriminant functions; neural network synthesis; pattern recognition; separable data subset deletion; sequential addition; weight value determination; Backpropagation algorithms; Data mining; Logic functions; Minimization methods; Network synthesis; Neural networks; Partitioning algorithms; Pattern recognition; Performance analysis; Training data;
Conference_Title :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374263