DocumentCode :
3583614
Title :
On global asymptotic stability of fully connected recurrent neural networks
Author :
Mandic, Danilo P. ; Chambers, Jonathon A. ; Bozic, Milorad M.
Author_Institution :
Sch. of Inf. Syst., Univ. of East Anglia, Norwich, UK
Volume :
6
fYear :
2000
fDate :
June 2000
Firstpage :
3406
Abstract :
Conditions for global asymptotic stability (GAS) of a nonlinear relaxation process realized by a recurrent neural network (RNN) are provided. The existence, convergence, and robustness of such a process are analyzed. This is undertaken based upon the contraction mapping theorem (CMT) and the corresponding fixed point iteration (FPI). Upper bounds for such a process are shown to be the conditions of convergence for a commonly analyzed RNN with a linear state dependence.
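The CMT/FPI argument summarized above can be illustrated with a minimal sketch for a single recurrent neuron (a deliberate simplification of the paper's fully connected vector case; the function and parameter names here are illustrative, not from the paper). Since |tanh'(z)| ≤ 1 everywhere, the map x ↦ tanh(w·x + b) is a contraction whenever |w| < 1, so by the Banach fixed point theorem the relaxation converges to a unique fixed point from any initial state:

```python
import math

def relaxation(w, x0, ext=0.0, tol=1e-10, max_iter=1000):
    """Fixed point iteration x_{k+1} = tanh(w * x_k + ext).

    Because |d/dz tanh(z)| <= 1, the map is a contraction with
    constant |w|; for |w| < 1 the contraction mapping theorem
    guarantees a unique fixed point, reached from any x0.
    Returns (fixed_point_estimate, iterations_used).
    """
    x = x0
    for k in range(max_iter):
        x_next = math.tanh(w * x + ext)
        if abs(x_next - x) < tol:
            return x_next, k
        x = x_next
    return x, max_iter

# With |w| = 0.5 < 1, wildly different starting states relax
# to the same equilibrium, illustrating global convergence.
fp_a, _ = relaxation(0.5, 10.0, ext=0.3)
fp_b, _ = relaxation(0.5, -10.0, ext=0.3)
```

The contraction constant |w| also bounds the geometric convergence rate, which is the kind of upper bound the abstract relates to the convergence conditions of the linear-state-dependence RNN.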
Keywords :
asymptotic stability; convergence of numerical methods; iterative methods; optimisation; recurrent neural nets; contraction mapping theorem; convergence; existence; fixed point iteration; fully connected recurrent neural networks; global asymptotic stability; linear state dependence; nonlinear relaxation process; robustness; upper bounds; Asymptotic stability; Constraint optimization; Convergence; Dynamic programming; Linear systems; Neural networks; Neurons; Nonlinear equations; Recurrent neural networks; Robustness
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2000 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '00)
ISSN :
1520-6149
Print_ISBN :
0-7803-6293-4
Type :
conf
DOI :
10.1109/ICASSP.2000.860132
Filename :
860132