DocumentCode :
276651
Title :
Learning algorithms and fixed dynamics
Author :
Cotter, Neil E. ; Conwell, Peter R.
Author_Institution :
Dept. of Electr. Eng., Utah Univ., Salt Lake City, UT, USA
Volume :
i
fYear :
1991
fDate :
8-14 Jul 1991
Firstpage :
799
Abstract :
The authors discuss the equivalence of learning algorithms and nonlinear dynamic systems whose differential equations have fixed coefficients. They show how backpropagation transforms into a fixed-weight recursive neural network suitable for VLSI or optical implementations. The transformation is quite general and implies that understanding physiological networks may require one to determine the values of fixed parameters distributed throughout a network. Equivalently, a particular synaptic weight update mechanism, such as Hebbian learning, could likely be used to implement many known learning algorithms. The authors use the transformation process to illustrate why a network whose only variable weights are hidden-layer thresholds is capable of universal approximation.
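A minimal sketch of the abstract's central idea, that a learning rule can be read as a dynamic system with fixed coefficients: the stochastic gradient update for a single weight w fitting y = w*x is a recurrence whose coefficients (here the constant step size eta) never change, while the "learned" weight is simply a state variable of the dynamics. The function and variable names below are illustrative, not taken from the paper.

```python
def step(w, x, y, eta=0.1):
    """One tick of the fixed-coefficient dynamics.

    The update rule w <- w + eta * x * (y - w*x) has fixed
    parameters (eta); learning shows up only in the evolution
    of the state variable w, not in any changing coefficient.
    """
    return w + eta * x * (y - w * x)

w = 0.0
for _ in range(200):
    x, y = 2.0, 6.0  # samples from the target map y = 3*x
    w = step(w, x, y)
# the state w converges toward the target weight 3.0
```

This is only a one-dimensional illustration; the paper's construction applies the same viewpoint to full backpropagation, folding the weight-update equations into a larger fixed-weight recursive network.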
Keywords :
differential equations; learning systems; neural nets; Hebbian learning; VLSI; backpropagation; fixed dynamics; fixed-weight recursive neural network; hidden-layer thresholds; learning algorithms; nonlinear dynamic systems; optical implementations; synaptic weight update mechanism; universal approximation; Backpropagation algorithms; Hebbian theory; Heuristic algorithms; Neural networks; Nonlinear optics; Optical computing; Optical fiber networks; Transforms; Very large scale integration;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155280
Filename :
155280