Title :
Lagrangian method for satisfiability problems of propositional calculus
Author :
Nagamatu, Masahiro ; Yanaru, Torao
Author_Institution :
Dept. of Electr., Electron. & Comput. Eng., Kyushu Inst. of Technol., Fukuoka, Japan
Abstract :
Hopfield-type neural networks for solving difficult combinatorial optimization problems have used gradient descent algorithms to solve constrained optimization problems via penalty functions. However, it is well known that convergence to local minima is inevitable in these approaches. To overcome this, Lagrange programming neural networks have been proposed; they differ from gradient descent algorithms in that their dynamical differential equations contain anti-descent terms. We analyze the stability and convergence properties of the Lagrangian method when it is applied to satisfiability problems of propositional calculus.
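The anti-descent idea in the abstract can be pictured as saddle-point dynamics: the variables follow gradient descent on a Lagrangian whose multipliers weight each clause's degree of dissatisfaction, while the multipliers themselves follow gradient ascent, so persistently unsatisfied clauses gain weight until the trajectory escapes a local minimum. Below is a minimal NumPy sketch of this scheme, an illustration rather than the authors' exact formulation; the clause encoding, step size, and initialization are assumptions:

```python
import numpy as np

def clause_unsat(x, clause):
    """Degree of dissatisfaction of one clause; 0 when satisfied.

    A clause is a tuple of nonzero ints: +i means variable i, -i its
    negation (1-indexed). x holds continuous truth values in [0, 1].
    """
    p = 1.0
    for lit in clause:
        v = x[abs(lit) - 1]
        p *= (1.0 - v) if lit > 0 else v
    return p

def rest_product(x, clause, skip):
    """clause_unsat with the factor contributed by literal `skip` removed."""
    p = 1.0
    for lit in clause:
        if lit == skip:
            continue
        v = x[abs(lit) - 1]
        p *= (1.0 - v) if lit > 0 else v
    return p

def lagrangian_sat(clauses, n, steps=20000, dt=0.05, seed=0):
    """Saddle-point dynamics: descent on x, ascent (anti-descent) on lam."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.2, 0.8, n)    # continuous truth values, random start
    lam = np.ones(len(clauses))     # one Lagrange multiplier per clause
    for _ in range(steps):
        assign = x > 0.5
        if all(clause_unsat(assign.astype(float), c) == 0.0 for c in clauses):
            return assign           # rounded assignment satisfies every clause
        dx = np.zeros(n)
        for r, cl in enumerate(clauses):
            for lit in cl:
                j = abs(lit) - 1
                rest = rest_product(x, cl, lit)
                # d(unsat)/dx_j: -rest for a positive literal, +rest for a negative one
                dx[j] += lam[r] * (-rest if lit > 0 else rest)
        x = np.clip(x - dt * dx, 0.0, 1.0)   # gradient descent on the variables
        # gradient *ascent* on the multipliers: unsatisfied clauses grow heavier
        lam += dt * np.array([clause_unsat(x, c) for c in clauses])
    return x > 0.5
```

Because the multipliers only increase while their clauses stay unsatisfied, no penalty weights need to be tuned by hand; the dynamics raise them automatically.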
Keywords :
algorithm theory; calculus; computability; convergence of numerical methods; differential equations; neural nets; numerical stability; optimisation; Hopfield type neural networks; Lagrange programming neural networks; Lagrangian method; anti-descent terms; combinatorial optimization problems; constrained optimization problems; convergence property; dynamical differential equations; gradient descent algorithms; penalty functions; propositional calculus; satisfiability problems; stability; Constraint optimization; Lagrangian functions; Stability analysis;
Conference_Titel :
Proceedings of the Second New Zealand International Two-Stream Conference on Artificial Neural Networks and Expert Systems (ANNES '95)
Conference_Location :
Dunedin, New Zealand
Print_ISBN :
0-8186-7174-2
DOI :
10.1109/ANNES.1995.499442