DocumentCode :
1903639
Title :
Scalability in neural network design: an add-on scheme
Author :
Tseng, H. Chris ; Liu, J. Gary
Author_Institution :
Dept. of Electr. Eng., Santa Clara Univ., CA, USA
fYear :
1993
fDate :
1993
Firstpage :
444
Abstract :
The authors investigate how a neural network architecture can be expanded with add-on neural networks to handle a scale-up problem. It is shown that if the new scale-up associative memory design problem is solvable, then the authors' proposed scheme is also valid. A scale-up methodology is developed that allows the dimension of each stored pattern in a dynamic neural network to be increased by augmenting additional neural subnetworks, while preserving the interconnection architecture of the original neural network. The proposed scale-up approach eliminates the need to redesign an entirely new neural network architecture. The ease of design is illustrated through the stability requirement of the neural network. The add-on feature proves efficient with respect to the design of the interconnection weights.
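The paper's core idea, augmenting stored patterns with extra dimensions while leaving the original interconnections untouched, can be illustrated with a standard Hebbian (outer-product) associative memory in the Hopfield style. This is a hedged sketch, not the authors' actual design procedure: the block structure it demonstrates (the original weight matrix surviving as a submatrix of the scaled-up one) is a generic property of outer-product weight rules, used here only to make the "add-on" intuition concrete.

```python
import numpy as np

def hebbian_weights(patterns):
    # Outer-product (Hebbian) weight matrix for a Hopfield-style
    # associative memory: W = sum_p x_p x_p^T with zero diagonal,
    # for bipolar {-1, +1} stored patterns.
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W

rng = np.random.default_rng(0)
base = rng.choice([-1, 1], size=(3, 6))    # three 6-dimensional stored patterns
extra = rng.choice([-1, 1], size=(3, 2))   # add-on components: 2 new units
augmented = np.hstack([base, extra])       # scaled-up 8-dimensional patterns

W_old = hebbian_weights(base)
W_new = hebbian_weights(augmented)

# The original interconnections survive as the top-left block of the
# scaled-up weight matrix, so only the add-on rows/columns need design.
print(np.allclose(W_new[:6, :6], W_old))   # True
```

Because only the new rows and columns of the weight matrix involve the add-on units, scaling up requires designing those entries alone, which is the efficiency the abstract claims for the interconnection-weight design.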
Keywords :
content-addressable storage; neural nets; pattern recognition; stability; add-on scheme; associative memory design; interconnection architecture; interconnection weights; neural network design; scale-up problem; stability requirement; stored pattern; Associative memory; Databases; Hopfield neural networks; Intelligent control; Intelligent networks; Laboratories; Neural networks; Neurons; Scalability; Stability;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
Type :
conf
DOI :
10.1109/ICNN.1993.298598
Filename :
298598