Title :
Paralleled hardware annealing for optimal solutions on electronic neural networks
Author :
Lee, Bang W. ; Sheu, Bing J.
Author_Institution :
Dept. of Electr. Eng., Univ. of Southern California, Los Angeles, CA, USA
Date :
7/1/1993
Abstract :
Three basic neural network schemes have been extensively studied by researchers: iterative networks, backpropagation networks, and self-organizing networks. Simulated annealing is a probabilistic hill-climbing technique that accepts, with a nonzero but gradually decreasing probability, deterioration in the cost function of the optimization problem. Hardware annealing, which combines the simulated annealing technique with continuous-time electronic neural networks by changing the voltage gain of the neurons, is discussed. The initial and final voltage gains for applying hardware annealing to Hopfield data-conversion networks are presented. In hardware annealing, the voltage gain of the output neurons is increased from an initial low value to a final high value in a continuous fashion, which helps the network reach the optimal solution of an optimization problem in one annealing cycle. Experimental results on the transfer function and transient response of electronic neural networks achieving the global minimum are also presented.
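The gain-ramp idea described in the abstract can be illustrated with a minimal simulation sketch. The snippet below is not the authors' circuit or data: it is a hypothetical NumPy toy in which a continuous-time Hopfield network (leaky-integrator dynamics with tanh neurons) has its neuron voltage gain swept continuously from a low initial value to a high final value, so the soft early dynamics can steer the state toward the deeper energy minimum before the outputs harden. The weights, biases, ramp shape, and step counts are all illustrative assumptions.

```python
import numpy as np

def hopfield_anneal(W, b, g_init=0.1, g_final=50.0, steps=5000, dt=0.01):
    """Euler simulation of a continuous-time Hopfield network whose
    neuron voltage gain is ramped from g_init to g_final in one
    continuous sweep (the hardware-annealing idea, sketched)."""
    n = W.shape[0]
    u = np.zeros(n)                               # internal neuron voltages
    for g in np.geomspace(g_init, g_final, steps):
        v = np.tanh(g * u)                        # sigmoid outputs at gain g
        u += dt * (-u + W @ v + b)                # leaky-integrator dynamics
    return np.tanh(g_final * u)                   # near-binary final outputs

def energy(v, W, b):
    # Hopfield energy function the network minimizes in the high-gain limit
    return -0.5 * v @ W @ v - b @ v

# Illustrative 1-of-2 choice (hypothetical weights, not from the paper):
# two mutually inhibiting neurons, with the bias favoring neuron 0, so the
# global minimum is v = [+1, -1] and a shallower local minimum is [-1, +1].
W = np.array([[0.0, -2.0], [-2.0, 0.0]])
b = np.array([1.0, 0.5])
v = hopfield_anneal(W, b)
print(np.round(v), energy(v, W, b))  # the gain ramp settles in the deeper minimum
```

With a fixed high gain and an unlucky start the network can latch onto the shallower minimum; ramping the gain in one continuous sweep, as the abstract describes, lets the low-gain dynamics find the basin of the global minimum first.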
Keywords :
Hopfield neural nets; mathematics computing; optimisation; parallel architectures; probability; transfer functions; transient response; Hopfield data-conversion networks; annealing cycle; backpropagation networks; cost function; electronic neural networks; iterative networks; optimization; parallel hardware; probabilistic hill-climbing technique; probability; self-organizing networks; simulated annealing; transfer function; transient response; voltage gain; Backpropagation; Cost function; Neural network hardware; Neural networks; Neurons; Self-organizing networks; Simulated annealing; Transfer functions; Transient response; Voltage;
Journal_Title :
IEEE Transactions on Neural Networks