Title :
On global asymptotic stability of fully connected recurrent neural networks
Author :
Mandic, Danilo P. ; Chambers, Jonathon A. ; Bozic, Milorad M.
Author_Institution :
Sch. of Inf. Syst., Univ. of East Anglia, Norwich, UK
fDate :
2000
Abstract :
Conditions for global asymptotic stability (GAS) of a nonlinear relaxation process realized by a recurrent neural network (RNN) are provided. The existence, convergence, and robustness of such a process are analyzed, based upon the contraction mapping theorem (CMT) and the corresponding fixed point iteration (FPI). The derived upper bounds are shown to coincide with the convergence conditions of a commonly analyzed RNN with a linear state dependence.
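A minimal illustrative sketch (not taken from the paper) of the idea the abstract describes: a fully connected RNN relaxation y <- Phi(W y + b) with the logistic sigmoid Phi is run as a fixed point iteration, and the contraction mapping theorem guarantees convergence to a unique equilibrium from any initial state once ||W||_inf * max|Phi'| < 1 (for the logistic sigmoid, max|Phi'| = 1/4). The names relax, W, and b below are assumptions of this sketch, not the paper's notation or bounds.

import numpy as np

# Logistic sigmoid nonlinearity; its derivative is bounded by 1/4.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Fixed point iteration (FPI) y_{k+1} = sigmoid(W @ y_k + b), stopped when the
# update falls below tol; under the contraction condition this converges for any y0.
def relax(W, b, y0, tol=1e-10, max_iter=1000):
    y = np.asarray(y0, dtype=float)
    for k in range(max_iter):
        y_next = sigmoid(W @ y + b)
        if np.max(np.abs(y_next - y)) < tol:
            return y_next, k + 1
        y = y_next
    return y, max_iter

rng = np.random.default_rng(0)
n = 4
W = rng.standard_normal((n, n))
W *= 3.9 / np.linalg.norm(W, np.inf)   # enforce ||W||_inf < 4, i.e. (1/4) * ||W||_inf < 1
b = rng.standard_normal(n)

# Two different initial states relax to the same fixed point, illustrating GAS.
y_a, it_a = relax(W, b, np.zeros(n))
y_b, it_b = relax(W, b, 10.0 * rng.standard_normal(n))
print("fixed point:", np.round(y_a, 6))
print("max difference between the two limits:", np.max(np.abs(y_a - y_b)))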
Keywords :
asymptotic stability; convergence of numerical methods; iterative methods; optimisation; recurrent neural nets; contraction mapping theorem; convergence; existence; fixed point iteration; fully connected recurrent neural networks; global asymptotic stability; linear state dependence; nonlinear relaxation process; optimisation; robustness; upper bounds; Asymptotic stability; Constraint optimization; Convergence; Dynamic programming; Linear systems; Neural networks; Neurons; Nonlinear equations; Recurrent neural networks; Robustness;
Conference_Title :
Acoustics, Speech, and Signal Processing, 2000. ICASSP '00. Proceedings. 2000 IEEE International Conference on
Print_ISBN :
0-7803-6293-4
DOI :
10.1109/ICASSP.2000.860132