DocumentCode :
295846
Title :
Theory and applications of sparsely interconnected feedback neural networks
Author :
Michel, Anthony N. ; Liu, Derong
Author_Institution :
Dept. of Electr. Eng., Notre Dame Univ., IN, USA
Volume :
2
fYear :
1995
fDate :
Nov/Dec 1995
Firstpage :
1070
Abstract :
This paper presents some developments in the analysis and design of a class of feedback neural networks with sparse interconnecting structure. The analysis results presented make it possible to determine whether a given vector is a stable memory of a neural network and to what extent implementation errors are permissible. The design methods presented allow the synthesis of neural networks with predetermined sparse interconnecting structures, with or without symmetry constraints on the interconnection weights. An example is included to demonstrate the applicability of the methodology advanced herein.
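Illustrative sketch :
The following is a minimal sketch, not the authors' synthesis procedure: it assumes a discrete Hopfield-type model and a masked Hebbian (outer-product) rule purely to illustrate the two ideas named in the abstract, namely synthesizing weights restricted to a predetermined sparse, symmetric interconnecting structure and checking whether a given vector is a stable memory (fixed point). The mask layout, pattern count, and helper names are hypothetical.

```python
# Hedged sketch: NOT the paper's design method. A masked Hebbian rule on a
# discrete Hopfield-type network, used only to illustrate "predetermined
# sparse interconnecting structure" and the stable-memory (fixed-point) check.
import numpy as np

def synthesize_sparse_weights(patterns, mask):
    """Outer-product weights restricted to a prescribed symmetric sparsity mask.

    patterns: (m, n) array of bipolar (+1/-1) memory vectors.
    mask:     (n, n) 0/1 symmetric array with zero diagonal marking the
              allowed interconnections (the predetermined sparse structure).
    """
    n = patterns.shape[1]
    W = patterns.T @ patterns / n      # standard Hebbian outer product
    np.fill_diagonal(W, 0.0)           # no self-connections
    return W * mask                    # enforce the sparse structure

def is_stable_memory(W, x):
    """Check whether bipolar vector x is a fixed point of the update sign(W x)."""
    update = np.where(W @ x >= 0.0, 1.0, -1.0)
    return np.array_equal(update, x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 16
    # Hypothetical sparse structure: each neuron connected only to its neighbors.
    idx = np.arange(n)
    mask = (np.abs(idx[:, None] - idx[None, :]) <= 3).astype(float)
    np.fill_diagonal(mask, 0.0)

    patterns = rng.choice([-1.0, 1.0], size=(2, n))
    W = synthesize_sparse_weights(patterns, mask)
    for p in patterns:
        print(is_stable_memory(W, p))
```

Under this toy rule, stored patterns may or may not be fixed points of the sparsified network; the paper's contribution is precisely an analysis that decides such stability and a synthesis method that guarantees it under the given sparsity pattern.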
Keywords :
recurrent neural nets; implementation error permissibility; neural network synthesis; sparsely interconnected feedback neural networks; stable memory; symmetry constraints; Artificial intelligence; Design methodology; Equations; Gold; Intelligent networks; Network synthesis; Neural networks; Neurofeedback; Neurons; Very large scale integration;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IEEE International Conference on Neural Networks, 1995. Proceedings
Conference_Location :
Perth, WA
Print_ISBN :
0-7803-2768-3
Type :
conf
DOI :
10.1109/ICNN.1995.487570
Filename :
487570