DocumentCode
3486833
Title
Stability analysis of discrete-time recurrently connected neural network
Author
Chen, Tianping ; Lu, Wen Lian
Author_Institution
Lab. of Nonlinear Math. Sci., Fudan Univ., Shanghai, China
Volume
1
fYear
2002
fDate
18-22 Nov. 2002
Firstpage
372
Abstract
In this paper, we discuss the dynamics of discrete-time recurrently asymmetrically connected neural networks (DTRACNN). We propose an effective approach to studying the global stability of these networks and give sufficient conditions under which the DTRACNN are exponentially stable. We also give a bound on the step size that guarantees convergence of the iteration. As a consequence, we derive the exponential stability of continuous-time recurrently asymmetrically connected neural networks (CTRACNN), i.e., the corresponding systems governed by differential equations.
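The record does not reproduce the paper's model equations or its exact step-size condition, so the following is only a minimal sketch, assuming the common setup in which the discrete-time network arises as an Euler discretization of a continuous-time recurrent network dx/dt = -Dx + Wg(x) + I. All parameter values, the heuristic step-size bound, and the function name iterate_dtracnn are illustrative assumptions, not the authors' construction.

```python
# Sketch (assumed model, not the paper's exact one): iterate
# x_{k+1} = x_k + h * (-D x_k + W g(x_k) + I) with a bounded activation g.
import numpy as np

def iterate_dtracnn(W, D, I, x0, h, steps=200):
    """Run the discrete-time iteration and return the trajectory."""
    g = np.tanh                      # assumed bounded, Lipschitz activation
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + h * (-D @ x + W @ g(x) + I)
        traj.append(x.copy())
    return np.array(traj)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 3
    W = 0.4 * rng.standard_normal((n, n))    # asymmetric connection matrix
    D = np.diag(np.ones(n))                  # positive decay rates
    I = rng.standard_normal(n)               # constant external input
    # Heuristic step-size choice (an assumption, not the paper's bound):
    # keep h * (d_i + sum_j |w_ij|) well below 1 for every row i.
    h = 0.9 / (np.diag(D) + np.abs(W).sum(axis=1)).max()
    traj = iterate_dtracnn(W, D, I, x0=rng.standard_normal(n), h=h)
    print("step size h =", h)
    print("last two iterates close:",
          np.linalg.norm(traj[-1] - traj[-2]) < 1e-6)
```

With a sufficiently small step size the iterates settle to a fixed point; the paper's contribution is the rigorous sufficient conditions and step-size bound under which this convergence is globally exponential.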
Keywords
asymptotic stability; convergence of numerical methods; differential equations; discrete time systems; iterative methods; neural nets; discrete-time recurrently connected neural network; exponential stability; global exponential convergence; iterative method; sufficient conditions; Chaos; Control systems; Convergence; Differential equations; Large-scale systems; Lyapunov method; Neural networks; Recurrent neural networks; Stability analysis; Sufficient conditions
fLanguage
English
Publisher
ieee
Conference_Titel
Neural Information Processing, 2002. ICONIP '02. Proceedings of the 9th International Conference on
Print_ISBN
981-04-7524-1
Type
conf
DOI
10.1109/ICONIP.2002.1202196
Filename
1202196
Link To Document