Abstract:
A neural network (NN) is said to be convergent (or completely stable) when each trajectory tends to an equilibrium point (a stationary state). A stronger property is that of absolute stability, which means that convergence holds for any choice of the neural network parameters and any choice of the nonlinear functions within specified and well-characterized sets. In particular, absolute stability requires that the NN be convergent even when, for some parameter values, it possesses nonisolated equilibrium points (e.g., a manifold of equilibria). This property, which is well suited to solving several classes of signal processing tasks in real time, cannot in general be established via the classical LaSalle approach, owing to the latter's inherent limitations in studying convergence when the NN has nonisolated equilibrium points. A method to address absolute stability is developed, based on proving that the total length of the NN trajectories is finite. A fundamental result on absolute stability is given under the hypothesis that the NN possesses a Lyapunov function and that the nonlinearities involved (neuron activations, inhibitions, etc.) are modeled by analytic functions. At the core of the proof of finiteness of trajectory length is the use of some basic inequalities for analytic functions due to Łojasiewicz. The result is applicable to a large class of neural networks, which includes the networks proposed by Vidyasagar, the Hopfield neural networks, and the standard cellular NNs introduced by Chua and Yang.
Keywords:
absolute stability; analytic neural networks; cellular neural networks; convergence; finite trajectory length; Hopfield neural networks; LaSalle approach; Łojasiewicz inequality; Lyapunov function; Lyapunov methods; neural networks; nonisolated equilibrium points; signal processing; stability analysis; stationary state; trajectory convergence