Title :
Global asymptotic stability and global exponential stability of continuous-time recurrent neural networks
Author :
Hu, Sanqing ; Wang, Jun
Author_Institution :
Dept. of Autom. & Comput.-Aided Eng., Chinese Univ. of Hong Kong, Shatin, China
Date :
1 May 2002
Abstract :
This paper presents new results on the global asymptotic stability (GAS) and global exponential stability (GES) of a general class of continuous-time recurrent neural networks with Lipschitz continuous and monotone nondecreasing activation functions. We first give three sufficient conditions for the GAS of such neural networks. These testable sufficient conditions differ from and improve upon existing ones. We then extend an existing GAS result to a GES result, and extend existing GES results to more general cases with less restrictive connection weight matrices and/or partially Lipschitz activation functions.
Keywords :
absolute stability; asymptotic stability; continuous time systems; recurrent neural nets; Lipschitz continuous activation functions; connection weight matrices; continuous-time recurrent neural networks; global asymptotic stability; global exponential stability; monotone nondecreasing activation functions; partially Lipschitz activation functions; sufficient conditions; testable sufficient conditions; Asymptotic stability; Automation; Councils; Linear matrix inequalities; Neural networks; Recurrent neural networks; Stability analysis; Sufficient conditions; Symmetric matrices; Testing;
Journal_Title :
IEEE Transactions on Automatic Control
DOI :
10.1109/TAC.2002.1000277
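Illustrative example :
The abstract concerns the standard continuous-time recurrent neural network model dx/dt = -Dx + Wg(x) + u with a Lipschitz continuous, monotone nondecreasing activation g. The Python sketch below simulates such a network and checks a classical norm-type sufficient condition for global exponential stability. The matrices D and W, the input u, the tanh activation, and the particular test used are illustrative assumptions only; they are not the specific conditions established in the paper.

# Minimal sketch (illustrative, not the paper's conditions): simulate the
# standard continuous-time recurrent neural network
#     dx/dt = -D x + W g(x) + u,
# with a Lipschitz continuous, monotone nondecreasing activation g (tanh here,
# Lipschitz constant L = 1), and check a classical norm-type sufficient
# condition for global exponential stability: L * ||W||_2 < min_i d_i.
# All numerical values below are assumptions chosen for illustration.

import numpy as np

def simulate_rnn(D, W, u, x0, dt=1e-3, steps=20000):
    """Forward-Euler integration of dx/dt = -D x + W g(x) + u with g = tanh."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-D @ x + W @ np.tanh(x) + u)
    return x

def norm_type_ges_condition(D, W, L=1.0):
    """Classical sufficient condition (illustrative): L * ||W||_2 < min_i d_i."""
    return L * np.linalg.norm(W, 2) < np.min(np.diag(D))

if __name__ == "__main__":
    D = np.diag([2.0, 3.0])                  # positive decay rates
    W = np.array([[0.5, -0.4], [0.3, 0.6]])  # connection weight matrix
    u = np.array([0.1, -0.2])                # constant external input

    print("norm-type GES condition holds:", norm_type_ges_condition(D, W))
    # When the sufficient condition holds, trajectories from different
    # initial states converge to the same unique equilibrium.
    print(simulate_rnn(D, W, u, x0=[5.0, -5.0]))
    print(simulate_rnn(D, W, u, x0=[-3.0, 4.0]))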