DocumentCode
3246171
Title
The projection neural network
Author
Wilensky, Gregg D. ; Manukian, Narbik
Author_Institution
Logicon RDA, Los Angeles, CA, USA
Volume
2
fYear
1992
fDate
7-11 Jun 1992
Firstpage
358
Abstract
A novel neural network model, the projection neural network, is developed to overcome three key drawbacks of backpropagation-trained neural networks (BPNN): long training times, the large number of nodes required to form closed classification regions in high-dimensional problems, and the lack of modularity. The network combines advantages of hypersphere classifiers, such as the restricted Coulomb energy (RCE) network, radial basis function methods, and BPNN. Nodes can be initialized to serve either as hyperplane separators or as spherical prototypes (radial basis functions). A modified gradient descent error-minimization procedure then trains the network weights and thresholds, adjusting prototype positions and sizes and, where appropriate, converting closed prototype decision boundaries to open boundaries, and vice versa. The network can reduce training time by orders of magnitude relative to BPNN and also reduce the number of required nodes. Theory and examples are given.
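The abstract's central mechanism (a single node that can act as either an open hyperplane separator or a closed spherical prototype, depending on its weights) can be illustrated with a minimal sketch. The projection used here, which augments each input with one extra coordinate so it lies on a hypersphere in D+1 dimensions, is an assumption chosen to reproduce that open/closed behavior; the function and parameter names are illustrative, not taken from the paper.

```python
import math

def project_to_sphere(x, R=1.0):
    """Augment x with one extra coordinate so the result lies on a
    hypersphere of radius R in D+1 dimensions (inputs assumed
    pre-scaled so that ||x|| <= R).  Illustrative assumption, not
    the paper's exact construction."""
    s = sum(v * v for v in x)
    h = math.sqrt(max(R * R - s, 0.0))
    return list(x) + [h]

def projection_unit(x, w, b, R=1.0):
    """One projection-style node: a sigmoid linear unit acting on the
    projected input.  If the extra weight w[-1] is zero, the decision
    boundary in the original input space is an open hyperplane; a
    nonzero w[-1] bends it into a closed (spherical) prototype
    boundary, so gradient descent on w can move between the two
    regimes."""
    xp = project_to_sphere(x, R)
    z = sum(wi * xi for wi, xi in zip(w, xp)) + b
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
```

For example, with `w = [0, 0, 4]` and `b = -3` the node fires (output above 0.5) only for inputs near the origin, a closed boundary, while `w = [1, 0, 0]` and `b = 0` gives an ordinary hyperplane at x1 = 0.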
Keywords
learning (artificial intelligence); neural nets; backpropagation-trained neural networks; classification; gradient descent error minimization training; high-dimensional problems; hyperplane separators; hypersphere classifiers; modularity; network weights; projection neural network; radial basis function methods; restricted Coulomb energy; thresholds; Backpropagation algorithms; Classification algorithms; Minimization methods; Neural networks; Particle separators; Pattern classification; Probability; Prototypes; Resonance; Training data;
fLanguage
English
Publisher
ieee
Conference_Titel
Neural Networks, 1992. IJCNN., International Joint Conference on
Conference_Location
Baltimore, MD
Print_ISBN
0-7803-0559-0
Type
conf
DOI
10.1109/IJCNN.1992.226961
Filename
226961
Link To Document