Title :
Efficient classification for multiclass problems using modular neural networks
Author :
Anand, Rangachari ; Mehrotra, Kishan ; Mohan, Chilukuri K. ; Ranka, Sanjay
Author_Institution :
Sch. of Comput. & Inf. Sci., Syracuse Univ., NY, USA
Date :
1 January 1995
Abstract :
The net output error converges very slowly when feedforward neural networks are trained on multiclass problems with the backpropagation algorithm. Although backpropagation reduces the Euclidean distance between the actual and desired output vectors, the differences between some components of these vectors increase in the first iteration. Furthermore, the magnitudes of subsequent weight changes in each iteration are very small, so many iterations are required to compensate for the error increase in those components during the initial iterations. Our approach uses a modular network architecture, reducing a K-class problem to a set of K two-class problems, with a separately trained network for each of the simpler problems. Speedups of one order of magnitude have been obtained experimentally, and in some cases convergence was possible using the modular approach but not with a nonmodular network.
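Illustrative Sketch :
The decomposition described in the abstract is, in modern terms, a one-vs-rest scheme: each of the K modules is a small two-class network trained independently by backpropagation, and at prediction time the class whose module responds most strongly wins. The following Python/NumPy sketch illustrates that idea only; it is not the authors' code, and all names (TwoClassNet, fit_modular, predict_modular) and hyperparameters (hidden-layer size, learning rate, epoch count) are assumptions made for the example.

# Minimal sketch of the modular (one-vs-rest) decomposition: K independent
# two-class networks, each trained with plain backpropagation on "class k vs. rest".
import numpy as np

class TwoClassNet:
    """One-hidden-layer network with a single sigmoid output (class k vs. rest)."""
    def __init__(self, n_in, n_hidden=8, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, X):
        # Cache hidden and output activations for use in the backward pass.
        self.h = self._sigmoid(X @ self.W1 + self.b1)
        self.o = self._sigmoid(self.h @ self.W2 + self.b2)
        return self.o

    def backprop(self, X, t):
        # One squared-error backpropagation step; t is a 0/1 target column.
        o = self.forward(X)
        delta_o = (o - t) * o * (1 - o)
        delta_h = (delta_o @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * self.h.T @ delta_o / len(X)
        self.b2 -= self.lr * delta_o.mean(axis=0)
        self.W1 -= self.lr * X.T @ delta_h / len(X)
        self.b1 -= self.lr * delta_h.mean(axis=0)

def fit_modular(X, y, n_classes, epochs=200):
    """Train one independent two-class network per class (class k vs. all others)."""
    modules = []
    for k in range(n_classes):
        net = TwoClassNet(X.shape[1], seed=k)
        t = (y == k).astype(float).reshape(-1, 1)
        for _ in range(epochs):
            net.backprop(X, t)
        modules.append(net)
    return modules

def predict_modular(modules, X):
    """Combine the K module outputs; the class whose module responds most wins."""
    scores = np.hstack([m.forward(X) for m in modules])
    return scores.argmax(axis=1)

if __name__ == "__main__":
    # Tiny synthetic 3-class demo.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(c, 0.3, (50, 2)) for c in ((0, 0), (2, 0), (1, 2))])
    y = np.repeat(np.arange(3), 50)
    nets = fit_modular(X, y, n_classes=3)
    print("training accuracy:", (predict_modular(nets, X) == y).mean())

A nonmodular baseline would instead train a single network with K output units on the same data; the modular version above is trained module by module on simpler two-class subproblems, which is the source of the speedup reported in the abstract.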
Keywords :
backpropagation; convergence; feedforward neural nets; iterative methods; parallel architectures; pattern classification; K two-class problems; backpropagation; feedforward neural networks; modular network architecture; modular neural networks; multiclass problems classification; output error; rate of convergence; Computer science; Convergence; Euclidean distance; Feedforward neural networks; Information science; Neural networks; Standards development; Testing;
Journal_Title :
IEEE Transactions on Neural Networks