DocumentCode :
920522
Title :
Parallel, self-organizing, hierarchical neural networks. II
Author :
Ersoy, Okan K. ; Hong, Daesik
Author_Institution :
Sch. of Electr. Eng., Purdue Univ., West Lafayette, IN, USA
Volume :
40
Issue :
2
fYear :
1993
fDate :
4/1/1993
Firstpage :
218
Lastpage :
227
Abstract :
For pt. I, see IEEE Trans. Neural Networks, vol. 1, pp. 167-78 (1990). Parallel, self-organizing, hierarchical neural networks (PSHNNs) involve a number of stages with error detection at the end of each stage, i.e., rejection of error-causing vectors, which are then fed into the next stage after a nonlinear transformation. The stages operate in parallel during testing. The statistical properties and the vector-rejection mechanisms of the PSHNN are discussed in comparison with the maximum likelihood method and the backpropagation network. The PSHNN is highly fault tolerant and robust against errors in the weight values, since the error detection bounds can be adjusted to compensate for such errors. These properties are exploited to develop architectures for programmable implementations in which the programmable parts are reduced to on-off or bipolar switching operations for bulk computations and to attenuators for pointwise operations.
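The abstract describes the staged rejection scheme only in prose; the short Python sketch below illustrates that flow under stated assumptions and is not the authors' implementation. The delta-rule stage classifiers, the tanh transform applied to rejected vectors, the acceptance rule based on a fixed detection bound, and all parameter values are placeholders for the paper's actual stage networks, error detection bounds, and nonlinear transformation.

# Minimal sketch of the PSHNN staging scheme described in the abstract.
# All components below are illustrative assumptions, not the paper's design.
import numpy as np

rng = np.random.default_rng(0)

class Stage:
    """One stage: a linear classifier trained by the delta rule,
    plus an error-detection bound used to reject uncertain vectors."""
    def __init__(self, n_in, n_out, bound=0.3):
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))
        self.bound = bound          # assumed form of the error-detection bound

    def forward(self, x):
        return self.W @ x

    def train(self, X, Y, lr=0.05, epochs=50):
        for _ in range(epochs):
            for x, y in zip(X, Y):
                out = self.forward(x)
                self.W += lr * np.outer(y - out, x)

    def accepts(self, x):
        """Accept a vector if its output lies within the detection bound
        of a valid target pattern; otherwise it is rejected to the next stage."""
        out = self.forward(x)
        target = np.zeros_like(out)
        target[np.argmax(out)] = 1.0
        return np.max(np.abs(out - target)) < self.bound

def nonlinear_transform(x):
    # Assumed pointwise nonlinearity applied to rejected vectors
    # before they enter the next stage.
    return np.tanh(2.0 * x)

def train_pshnn(X, Y, n_stages=3):
    stages = []
    for _ in range(n_stages):
        stage = Stage(X.shape[1], Y.shape[1])
        stage.train(X, Y)
        stages.append(stage)
        # Keep only the rejected (error-causing) vectors for the next stage.
        mask = np.array([not stage.accepts(x) for x in X])
        if not mask.any():
            break
        X, Y = nonlinear_transform(X[mask]), Y[mask]
    return stages

def classify(stages, x):
    # During testing all stages can evaluate x in parallel; here the first
    # stage that accepts the vector classifies it (shown serially for clarity).
    for stage in stages:
        if stage.accepts(x):
            return np.argmax(stage.forward(x))
        x = nonlinear_transform(x)
    return np.argmax(stages[-1].forward(x))   # fall back to the last stage

# Toy usage: two Gaussian classes in 2-D with one-hot targets.
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
Y = np.vstack([np.tile([1, 0], (50, 1)), np.tile([0, 1], (50, 1))])
stages = train_pshnn(X, Y)
print("predicted class of [1.2, 0.8]:", classify(stages, np.array([1.2, 0.8])))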
Keywords :
backpropagation; error detection; maximum likelihood estimation; self-organising feature maps; attenuators; backpropagation network; bipolar switching operations; error detection; hierarchical neural networks; maximum likelihood method; nonlinear transformation; parallel neural networks; programmable implementations; self-organizing; Attenuators; Backpropagation; Computer architecture; Fault detection; Fault tolerance; Maximum likelihood detection; Mechanical factors; Neural networks; Robustness; Testing;
fLanguage :
English
Journal_Title :
Industrial Electronics, IEEE Transactions on
Publisher :
ieee
ISSN :
0278-0046
Type :
jour
DOI :
10.1109/41.222643
Filename :
222643