DocumentCode :
282555
Title :
Parallel, self-organizing hierarchical neural networks
Author :
Ersoy, O.K. ; Hong, D.
Author_Institution :
Sch. of Electr. Eng., Purdue Univ., West Lafayette, IN, USA
Volume :
i
fYear :
1990
fDate :
2-5 Jan 1990
Firstpage :
158
Abstract :
A neural network architecture called the parallel, self-organizing, hierarchical neural network (PSHNN) is discussed. The PSHNN involves a number of stages in which each stage can be a particular neural network (SNN). At the end of each SNN, error detection is carried out and a number of input vectors are rejected. Between two SNNs there is a nonlinear transformation of the input vectors rejected by the first SNN. The PSHNN has optimized system complexity in the sense of a self-organizing, minimized number of stages, high classification accuracy, minimized learning and recall times, and parallel architectures in which all SNNs operate simultaneously without waiting for data from one another during testing. In classification experiments with aircraft and satellite remote-sensing data, the PSHNN is compared to multilayer networks trained with backpropagation.
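To make the staged rejection-and-retraining idea described in the abstract concrete, the following is a minimal Python/NumPy sketch. It assumes single-layer perceptron stages trained with the delta rule, a tanh-squared nonlinearity between stages, a three-stage cap, and a simple misclassification test as the rejection rule; these choices are illustrative, not the paper's exact formulation, and the test-time error-detection and acceptance logic that lets all SNNs run in parallel is omitted.

# Sketch only: perceptron stages, tanh^2 transform, and misclassification-based
# rejection are assumptions for illustration, not the published PSHNN details.
import numpy as np

rng = np.random.default_rng(0)

def train_stage(X, y, epochs=100, lr=0.05):
    """Train one SNN stage: a single-layer perceptron with the delta rule."""
    Xb = np.hstack([X, np.ones((len(X), 1))])        # append bias column
    w = rng.normal(scale=0.1, size=Xb.shape[1])
    for _ in range(epochs):
        pred = (Xb @ w > 0).astype(float)
        w += lr * (y - pred) @ Xb                    # delta-rule weight update
    return w

def stage_predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(float)

def nonlinear_transform(X):
    """Illustrative nonlinear map applied to vectors passed on to the next SNN."""
    return np.tanh(X) ** 2

# Toy two-class problem that a single linear stage cannot solve.
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

stages = []
rejected = np.arange(len(X))                         # indices not yet resolved
Xs = X.copy()
for _ in range(3):                                   # stage count self-organizes: stop when nothing is rejected
    w = train_stage(Xs[rejected], y[rejected])
    stages.append(w)
    wrong = stage_predict(w, Xs[rejected]) != y[rejected]   # error detection: reject misclassified vectors
    rejected = rejected[wrong]
    if len(rejected) == 0:
        break
    Xs[rejected] = nonlinear_transform(Xs[rejected]) # transform only the rejected vectors for the next SNN

print(f"stages trained: {len(stages)}, still rejected: {len(rejected)}")

During training, each stage only sees the vectors rejected by the previous one, which is why the number of stages is determined by the data rather than fixed in advance; at test time the trained stages can be evaluated concurrently, as the abstract notes.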
Keywords :
neural nets; parallel architectures; aircraft data; backpropagation training; error detection; high classification accuracy; input vectors; minimized learning times; minimized recall times; multilayer networks; neural network architecture; nonlinear first SNN; optimized system complexity; parallel architectures; parallel self-organizing hierarchical neural network; satellite remote-sensing data; Artificial neural networks; Fault tolerance; Intelligent networks; Multi-layer neural network; Neural networks; Parallel architectures; Reflective binary codes; Robustness; Signal representations; Temperature;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the Twenty-Third Annual Hawaii International Conference on System Sciences, 1990
Conference_Location :
Kailua-Kona, HI
Type :
conf
DOI :
10.1109/HICSS.1990.205112
Filename :
205112