DocumentCode :
1064770
Title :
An analysis of the gamma memory in dynamic neural networks
Author :
Principe, Jose C. ; Kuo, Jyh-Ming ; Celebi, Samel
Author_Institution :
Comput. NeuroEng. Lab., Florida Univ., Gainesville, FL, USA
Volume :
5
Issue :
2
fYear :
1994
fDate :
3/1/1994 12:00:00 AM
Firstpage :
331
Lastpage :
337
Abstract :
Presents a vector space framework for studying short-term memory filters in dynamic neural networks. The authors define parameters that quantify the function of feedforward and recursive linear memory filters. Using vector spaces, they show which optimization problem is solved by the processing elements (PEs) of the first hidden layer of the single-input focused network architecture. Owing to the special properties of the gamma bases, recursion introduces an extra parameter λ (the time constant of the leaky integrator) that displaces the memory manifold toward the desired signal when the mean square error is minimized. In contrast, for the feedforward memory filter the angle between the desired signal and the memory manifold is fixed for a given memory order. The feedback parameter can be adapted by gradient descent, but the optimization is nonconvex.
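The gamma memory discussed in the abstract can be sketched as a cascade of identical leaky integrators. The recursion below is the standard gamma-memory update associated with Principe et al.; the function name, the parameter name `lam` (standing in for the abstract's λ), and the API are illustrative, not taken from the paper:

```python
def gamma_memory(u, K, lam):
    """Run a K-tap gamma memory over an input sequence u.

    Tap 0 is the raw input; each later tap k is a leaky integrator
    (feedback parameter lam) fed by tap k-1:
        x_k(n) = (1 - lam) * x_k(n-1) + lam * x_{k-1}(n-1)
    Returns a list of per-sample tap vectors [x_0(n), ..., x_K(n)],
    which a focused network would feed into its first hidden layer.
    """
    x = [0.0] * (K + 1)
    outputs = []
    for u_n in u:
        x_prev = x[:]            # tap values at time n-1
        x[0] = u_n               # tap 0: the input itself
        for k in range(1, K + 1):
            x[k] = (1.0 - lam) * x_prev[k] + lam * x_prev[k - 1]
        outputs.append(x[:])
    return outputs
```

Each stage has unit DC gain, so the impulse response of every tap sums to 1; varying `lam` trades memory depth for resolution without changing the number of taps, which is the extra degree of freedom the abstract contrasts with a fixed feedforward delay line.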
Keywords :
filters; memory architecture; neural nets; dynamic neural networks; feedforward memory filters; gamma memory; gradient descent; hidden layer; leaky integrator; memory manifold; optimization problem; recursive linear memory filters; short-term memory filters; single input focused network architecture; time constant; vector space framework; Information filtering; Information filters; Intelligent networks; Neural networks; Neurofeedback; Nonlinear filters; Signal mapping; Signal processing; System identification; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.279195
Filename :
279195