DocumentCode :
2694098
Title :
A stochastic training technique for feed-forward neural networks
Author :
Day, Shawn P. ; Camporese, Daniel S.
fYear :
1990
fDate :
17-21 June 1990
Firstpage :
607
Abstract :
A stochastic training technique called stochastic tunneling is presented. The technique trains networks of neurons with step transfer functions; such networks cannot be trained with backpropagation because the step function is not differentiable. The connection weights between neurons are all of unit magnitude and may be either excitatory or inhibitory. This simple connection scheme makes the networks well suited to VLSI implementation, and they can perform binary vector pattern-association tasks. Simulated annealing for training feed-forward networks is also investigated and compared to stochastic tunneling. Simulations show that stochastic tunneling requires a number of training epochs comparable to quenching when the error surface has no local minima; unlike quenching, however, stochastic tunneling can escape local minima in the error function.
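The abstract does not detail the tunneling procedure itself, so the following is only a minimal illustrative sketch: a network with step transfer functions and unit-magnitude (+1/-1) weights, trained by randomly flipping weight signs and, with a small assumed probability `escape_p` (a hypothetical stand-in for the tunneling move), keeping a worsening flip so the search can leave local minima.

```python
import random

def step(x):
    # Hard threshold: not differentiable, so backpropagation does not apply.
    return 1 if x >= 0 else 0

def forward(weights, inputs):
    # Single-layer network: every weight is +1 (excitatory) or -1 (inhibitory);
    # each row of `weights` drives one step-function output neuron.
    return [step(sum(w * x for w, x in zip(row, inputs))) for row in weights]

def error(weights, patterns):
    # Hamming error over a binary vector pattern-association task.
    return sum(o != t
               for inp, target in patterns
               for o, t in zip(forward(weights, inp), target))

def train(patterns, n_in, n_out, epochs=2000, escape_p=0.02, seed=0):
    # Stochastic search: flip the sign of one randomly chosen weight and
    # keep the flip if the error does not increase; with probability
    # `escape_p`, keep a worsening flip anyway (an assumed escape move,
    # not the paper's exact tunneling rule).
    rng = random.Random(seed)
    w = [[rng.choice((-1, 1)) for _ in range(n_in)] for _ in range(n_out)]
    cur = error(w, patterns)
    for _ in range(epochs):
        if cur == 0:
            break
        i, j = rng.randrange(n_out), rng.randrange(n_in)
        w[i][j] = -w[i][j]
        e = error(w, patterns)
        if e <= cur or rng.random() < escape_p:
            cur = e
        else:
            w[i][j] = -w[i][j]  # revert the flip
    return w, cur
```

For example, a one-output network can learn to associate [1, 0] with 1 and [0, 1] with 0, which requires an excitatory first weight and an inhibitory second weight.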
Keywords :
learning systems; neural nets; simulated annealing; stochastic processes; binary vector pattern association; connection weights; error function; excitatory; feed-forward neural networks; inhibitory; local minima; quenching; simulated annealing; step transfer functions; stochastic tunneling; training epochs;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137637
Filename :
5726597