Title :
Improving convergence and solution quality of Hopfield-type neural networks with augmented Lagrange multipliers
Author_Institution :
Sch. of Electr. & Electron. Eng., Nanyang Technol. Inst., Singapore
Date :
11/1/1996
Abstract :
Hopfield-type networks convert a combinatorial optimization problem into a constrained real-valued optimization problem and solve the latter using the penalty method. There is a dilemma with such networks: when tuned to produce good-quality solutions, they can fail to converge to valid solutions; and when tuned to converge, they tend to give low-quality solutions. This paper proposes a new method, called the augmented Lagrange-Hopfield (ALH) method, that improves Hopfield-type neural networks in both convergence and solution quality for combinatorial optimization. It uses the augmented Lagrange method, which combines the Lagrange and penalty methods, to resolve the dilemma effectively. Experimental results on the travelling salesman problem (TSP) show the superiority of the ALH method over existing Hopfield-type neural networks in both convergence and solution quality. For ten-city TSPs, ALH finds the known optimal tour with a 100% success rate over 1000 runs with different random initializations. For larger problems, it also finds markedly better solutions than the compared methods while always converging to valid tours.
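The abstract describes the core idea of the augmented Lagrange approach: a penalty term enforces feasibility while multiplier updates remove the need for the large, solution-degrading penalty weights that plain penalty (Hopfield-style) formulations require. The following is a minimal illustrative sketch of that generic scheme, not the paper's actual ALH network dynamics; the toy objective f, constraint h, penalty weight mu, step size lr, and iteration counts are all assumptions chosen only to show the inner descent on the variables and the outer ascent on the multiplier.

# Generic augmented-Lagrangian sketch (illustrative only): minimize f(x)
# subject to h(x) = 0 by descending L_A(x, lam) = f(x) + lam*h(x) + (mu/2)*h(x)^2
# in x, then updating the multiplier lam <- lam + mu*h(x).
import numpy as np

def f(x):                       # toy objective (stands in for the tour-length term)
    return x[0]**2 + 2.0 * x[1]**2

def grad_f(x):
    return np.array([2.0 * x[0], 4.0 * x[1]])

def h(x):                       # equality constraint h(x) = 0 (stands in for validity constraints)
    return x[0] + x[1] - 1.0

def grad_h(x):
    return np.array([1.0, 1.0])

def augmented_lagrange(x0, mu=10.0, lr=0.01, outer=50, inner=200):
    x, lam = np.array(x0, dtype=float), 0.0
    for _ in range(outer):
        for _ in range(inner):
            # gradient of the augmented Lagrangian with respect to x
            g = grad_f(x) + (lam + mu * h(x)) * grad_h(x)
            x -= lr * g
        lam += mu * h(x)        # multiplier (Lagrange) ascent step
    return x, lam

x_opt, lam_opt = augmented_lagrange([0.0, 0.0])
print(x_opt, lam_opt)           # converges near x = (2/3, 1/3), the constrained minimizer

Because the multiplier absorbs the constraint force over the outer iterations, mu can stay moderate, which is the mechanism the paper exploits to avoid the convergence/quality trade-off of pure penalty terms.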
Keywords :
Hopfield neural nets; convergence of numerical methods; optimisation; performance evaluation; travelling salesman problems; Hopfield-type neural networks; augmented Lagrange multipliers; augmented Lagrange-Hopfield method; combinatorial optimization; convergence; performance evaluation; solution quality; travelling salesman problem; Annealing; Application specific integrated circuits; Constraint optimization; Convergence; Hopfield neural networks; Lagrangian functions; Neural networks; Neurons; Optimization methods; Traveling salesman problems;
Journal_Title :
IEEE Transactions on Neural Networks