DocumentCode :
977617
Title :
Structural properties of gradient recurrent high-order neural networks
Author :
Kosmatopoulos, Elias B. ; Christodoulou, Manolis A.
Author_Institution :
Dept. of Electron. & Comput. Eng., Tech. Univ. of Crete, Chania, Greece
Volume :
42
Issue :
9
fYear :
1995
fDate :
9/1/1995
Firstpage :
592
Lastpage :
603
Abstract :
The structural properties of Recurrent High-Order Neural Networks (RHONN) whose weights are restricted to satisfy the symmetry property are investigated. First, it is shown that these networks are gradient and stable dynamical systems and, moreover, that they remain stable when either bounded deterministic or multiplicative stochastic disturbances contaminate their dynamics. Then, we prove that such networks are capable of approximating, arbitrarily closely, a large class of dynamical systems of the form χ̇ = F(χ). Appropriate learning laws that make these neural networks able to approximate (identify) unknown dynamical systems are also proposed. The learning laws are based on Lyapunov stability theory, and they ensure error stability and robustness.
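The RHONN dynamics summarized above can be sketched in code. This is a minimal illustration only, not the paper's exact formulation: the state equation dx/dt = -a·x + W·z(x), the choice of first- and second-order sigmoidal products for z(x), and all function names are assumptions made for this sketch; the paper's symmetry restriction on the weights is noted in a comment but not enforced here.

```python
import numpy as np

# Hedged sketch of a recurrent high-order neural network (RHONN).
# Assumed dynamics (not necessarily the paper's exact form):
#   dx/dt = -a*x + W @ z(x)
# where z(x) collects first- and second-order products of sigmoidal
# activations. The paper restricts the weights to satisfy a symmetry
# property to obtain the gradient/stability results; that constraint
# is not enforced in this illustrative code.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def high_order_terms(x):
    """High-order regressor z(x): activations and their pairwise products."""
    s = sigmoid(x)
    n = len(s)
    pairs = np.array([s[i] * s[j] for i in range(n) for j in range(i, n)])
    return np.concatenate([s, pairs])

def simulate_rhonn(x0, W, a=1.0, dt=0.01, steps=1000):
    """Euler-integrate dx/dt = -a*x + W @ z(x) and return the final state."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-a * x + W @ high_order_terms(x))
    return x
```

With W = 0 the sketch reduces to the stable linear system dx/dt = -a·x, so the state decays toward the origin, which is consistent with the stability the abstract claims for the disturbance-free network.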
Keywords :
Lyapunov methods; recurrent neural nets; stability; stochastic systems; Lyapunov stability theory; error stability; gradient recurrent high-order neural networks; learning laws; robustness; stable dynamical systems; structural properties; symmetry property; Differential equations; Hopfield neural networks; Lyapunov method; Neural networks; Neurons; Pattern recognition; Recurrent neural networks; Robust stability; Senior members; Stochastic systems;
fLanguage :
English
Journal_Title :
IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing
Publisher :
IEEE
ISSN :
1057-7130
Type :
jour
DOI :
10.1109/82.466645
Filename :
466645