Title :
Discrete-time convergence theory and updating rules for neural networks with energy functions
Author_Institution :
Sch. of Comput. & Math., Deakin Univ., Clayton, Vic., Australia
Date :
3/1/1997
Abstract :
We present convergence theorems for neural networks with arbitrary energy functions and discrete-time dynamics, for both discrete and continuous neuronal input-output functions. We discuss systematically how the neuronal updating rule should be extracted once an energy function has been constructed for a given application, so that descent and minimization of the energy function are guaranteed as the network updates. We explain why the existing theory may lead to inaccurate results and oscillatory behaviors in the convergence process. We also point out the reason for, and the side effects of, using hysteresis neurons to suppress these oscillatory behaviors.
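For readers unfamiliar with the setting the abstract refers to, the sketch below shows the standard discrete-time Hopfield case that this kind of theory generalizes: an update rule read off from a quadratic energy function, with the energy checked for monotone descent at every step. This is a minimal illustrative example, not the paper's theorems; the weight matrix W, bias vector b, and the symmetry/zero-diagonal assumptions are ours, chosen because they are the textbook conditions under which asynchronous sign updates never increase the energy.

```python
# Minimal sketch (assumed standard Hopfield setting, not the article's method):
# asynchronous discrete-time updates derived from a quadratic energy, with a
# check that the energy never increases along the trajectory.
import numpy as np

def energy(s, W, b):
    """Hopfield-type energy E(s) = -1/2 s^T W s - b^T s for s in {-1, +1}^n."""
    return -0.5 * s @ W @ s - b @ s

def async_update(s, W, b, rng):
    """Update one randomly chosen neuron with the sign rule extracted from E."""
    i = rng.integers(len(s))
    h = W[i] @ s + b[i]              # local field; equals -dE/ds_i here
    s_new = s.copy()
    s_new[i] = 1 if h >= 0 else -1   # coordinate-wise descent step
    return s_new

rng = np.random.default_rng(0)
n = 8
W = rng.standard_normal((n, n))
W = (W + W.T) / 2                    # symmetric weights: needed for descent
np.fill_diagonal(W, 0.0)             # zero self-coupling: rules out the usual
                                     # two-cycle oscillations in this simple case
b = rng.standard_normal(n)

s = rng.choice([-1, 1], size=n)
for _ in range(200):
    s_next = async_update(s, W, b, rng)
    # Energy is non-increasing under this update rule (up to rounding error).
    assert energy(s_next, W, b) <= energy(s, W, b) + 1e-12
    s = s_next
print("final energy:", energy(s, W, b))
```

If the update rule were chosen without regard to the energy (for example, synchronous updates with a nonzero diagonal), the same check can fail, which is the kind of oscillatory behavior the abstract refers to.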
Keywords :
convergence of numerical methods; dynamics; neural nets; optimisation; discrete-time convergence; discrete-time dynamics; energy functions; hysteresis neurons; minimization; network updates; neural networks; neuronal updating rules; oscillatory behaviors; Australia; Computer networks; Convergence; Cost function; Hopfield neural networks; Hysteresis; Mathematics; Neural networks; Neurons; Traveling salesman problems;
Journal_Title :
IEEE Transactions on Neural Networks