Abstract:
Recurrent neural networks, belonging to the general class of nonlinear networks or systems with feedback, can exhibit a wide range of dynamical behaviors and have potential for application in a number of areas. The issues central to recurrent neural networks are the architecture, neuro-dynamics, stability, and training or learning rules. Existing architectures for neuro-dynamical systems have been derived through an understanding and interpretation of biological neuro-systems. In this paper, we introduce a new, simple paradigm that yields a much broader class of neuro-dynamical systems. It is based on an “engineering” or “building block” approach combined with what one may term “reverse engineering”. The concept of passivity is used to obtain the nonlinear electrical elements that serve as the building blocks of complex dynamical systems. The dynamical equations corresponding to such systems can serve as the neuro-dynamics for various applications. The philosophy, though conceptually simple, leads to a powerful paradigm for designing recurrent neural networks that are guaranteed to be stable, and also to a self-organizing approach to learning. In this paper, we provide details of some of the building blocks, a mathematical analysis of networks built with this approach, the application to neuro-dynamics, and some initial results.
Keywords:
feedback; neural chips; neural net architecture; nonlinear network synthesis; recurrent neural nets; reverse engineering; architecture; biological neuro-systems; complex dynamical systems; dynamical behaviors; learning rules; neuro-dynamics; nonlinear electrical elements; passive nonlinear electrical building blocks; passivity; recurrent neural network architectures; self-organizing learning; stability; training; application software; artificial neural networks; computer architecture; feedforward neural networks; neural networks; neurofeedback; nonlinear equations
Conference_Titel:
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)