Title :
A fast algorithm for solving large scale nonlinear optimization problems using RNN
Author :
Xu, Xiaoliang ; Tang, Huajin ; Shi, Xiaoxin
Author_Institution :
Inst. of Software & Intell. Technol., Hangzhou Dianzi Univ., Hangzhou
Abstract :
This paper presents a discrete-time recurrent neural network (RNN) model for solving nonlinear differentiable constrained optimization problems, which include as special cases convex optimization over constraint sets and variational inequality problems. Qualitative analysis results on the regularity and completeness of the proposed network are obtained. It is shown that all trajectories starting from any initial point in R^n converge to the equilibrium set of the recurrent system. The proposed model is notably simpler than existing neural network solvers. Simulations on a class of large-scale linear complementarity problems illustrate the fast convergence and other features of the proposed RNN model.
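The record does not spell out the authors' update rule. As a rough, hedged illustration of the kind of discrete-time projection iteration commonly used for linear complementarity problems (an assumed generic scheme, not the paper's specific RNN model), consider the following sketch; the step size alpha, the test matrix, and the stopping tolerance are all hypothetical choices:

```python
import numpy as np

def lcp_projection_iteration(M, q, alpha=0.05, max_iter=5000, tol=1e-8):
    """Generic discrete-time projection iteration for the LCP:
    find x >= 0 with M x + q >= 0 and x^T (M x + q) = 0.
    Illustrative sketch only; not taken from the paper."""
    x = np.zeros(q.shape[0])                     # initial state of the recurrent system
    for _ in range(max_iter):
        # One discrete-time step: project a gradient-like update onto the nonnegative orthant.
        x_next = np.maximum(0.0, x - alpha * (M @ x + q))
        if np.linalg.norm(x_next - x) < tol:     # equilibrium of the iteration reached
            return x_next
        x = x_next
    return x

# Usage example on a small symmetric positive definite LCP (hypothetical test data).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 50))
    M = A @ A.T + 50 * np.eye(50)                # well-conditioned SPD matrix
    q = rng.standard_normal(50)
    x = lcp_projection_iteration(M, q)
    print("complementarity residual:", abs(x @ (M @ x + q)))
```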
Keywords :
large-scale systems; optimisation; recurrent neural nets; discrete-time recurrent neural network; large-scale linear complementarity problems; large-scale nonlinear optimization problems; nonlinear differentiable constrained optimization; variational inequality problem; Constraint optimization; Convergence; Electronic mail; Intelligent networks; Large-scale systems; Mathematics; Neural networks; Paper technology; Recurrent neural networks; Software algorithms; Discrete-time recurrent neural networks; convergence; nonlinear optimization; quadratic optimization
Conference_Titel :
2008 IEEE Conference on Cybernetics and Intelligent Systems
Conference_Location :
Chengdu
Print_ISBN :
978-1-4244-1673-8
Electronic_ISBN :
978-1-4244-1674-5
DOI :
10.1109/ICCIS.2008.4670798