DocumentCode :
2615892
Title :
Improving error tolerance of self-organizing neural nets
Author :
Sha, Fei ; Gan, Qiang
Author_Institution :
Lab. of Molecular & Biomolecular Electron., Southeast Univ., Nanjing, China
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
2716
Abstract :
A hybrid neural net (HNN) is developed that combines the adaptive resonance network (ART1) introduced by G.A. Carpenter and S. Grossberg (1987, 1988) with the Hopfield associative memory (HAM). The HAM diminishes noise in the samples and supplies the cleaned samples to ART1 as inputs. To match the capacity of the HAM with that of ART1, a new recalling algorithm (NHAM) is also introduced to enlarge the capacity of the HAM. Based on NHAM and HNN, a revised version of the HNN (RHNN) is presented. The difference between RHNN and HNN is that RHNN has feedback loops, whereas HNN has only feedforward paths; the ART1 in RHNN supplies information that helps the HAM recall memories. Computer simulations demonstrated that RHNN has several advantages.
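The following is a minimal sketch, not the authors' implementation, of the feedforward HNN idea described in the abstract: a Hopfield associative memory denoises a binary sample, and the cleaned pattern is then presented to an ART1-style category layer. The function names, the vigilance-based matching rule, and all parameter values below are illustrative assumptions.

import numpy as np

def train_ham(patterns):
    """Hebbian outer-product weights for a Hopfield net storing +/-1 patterns."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)
    return w / n

def ham_recall(w, x, steps=20):
    """Synchronous sign updates; returns the (hopefully) denoised pattern."""
    y = x.copy()
    for _ in range(steps):
        y_new = np.sign(w @ y)
        y_new[y_new == 0] = 1
        if np.array_equal(y_new, y):
            break
        y = y_new
    return y

class SimpleART1:
    """Toy ART1-like categorizer on {0,1} vectors with a vigilance test (an assumption,
    not the paper's exact ART1 dynamics)."""
    def __init__(self, vigilance=0.8):
        self.vigilance = vigilance
        self.prototypes = []  # stored category templates

    def present(self, x):
        for k, proto in enumerate(self.prototypes):
            overlap = np.logical_and(x, proto).sum()
            if overlap / max(x.sum(), 1) >= self.vigilance:
                # resonance: refine the existing category template
                self.prototypes[k] = np.logical_and(x, proto).astype(int)
                return k
        # no resonance: create a new category
        self.prototypes.append(x.copy())
        return len(self.prototypes) - 1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stored = rng.choice([-1, 1], size=(3, 32))  # bipolar memories for the HAM
    w = train_ham(stored)
    noisy = stored[0].copy()
    noisy[:4] *= -1                             # flip a few bits to simulate noise
    cleaned = ham_recall(w, noisy)              # HAM denoises the sample
    art = SimpleART1(vigilance=0.7)
    category = art.present(((cleaned + 1) // 2).astype(int))  # bipolar -> binary
    print("recovered bits matching stored pattern:", int((cleaned == stored[0]).sum()), "of 32")
    print("assigned ART1 category:", category)

The RHNN variant described in the abstract would additionally feed information from the ART1 layer back to the HAM to guide recall; that feedback path is not modeled in this feedforward sketch.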
Keywords :
content-addressable storage; neural nets; self-adjusting systems; ART1; Hopfield associative memory; error tolerance; feedback loops; feedforward paths; hybrid neural net; recalling algorithm; self-organizing neural nets; Computer simulation; Feedback loop; Filters; Forward error correction; Gallium nitride; Hopfield neural networks; Impedance matching; Molecular electronics; Neural networks; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170279
Filename :
170279