DocumentCode :
328273
Title :
Self-organization in stochastic neural networks
Author :
Deco, G. ; Parra, L.
Author_Institution :
Corp. Res. & Dev., Siemens AG, Munich, Germany
Volume :
1
fYear :
1993
fDate :
25-29 Oct. 1993
Firstpage :
479
Abstract :
Maximization of the mutual information between the stochastic output neurons and the clamped inputs is used as an unsupervised criterion for training a Boltzmann machine. The resulting learning rule contains two terms corresponding to Hebbian and anti-Hebbian learning. The two terms are weighted by the amount of information transmitted in the learning synapse, giving an information-theoretic interpretation to the proportionality constant in Hebb's biological rule. The anti-Hebbian term ensures convergence of the weights. Simulations on the encoder problem demonstrate the optimal performance of this method.
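For illustration only, the sketch below (not from the paper) shows how a Hebbian/anti-Hebbian update of this general flavor could be prototyped in Python on an encoder-style task. The network layout, the crude "free-running" resampling used for the anti-Hebbian phase, and the placeholder `info_weight` standing in for the transmitted information per synapse are simplifying assumptions, not the authors' actual rule.
```python
# Hedged sketch, not the authors' derivation: a stochastic binary layer with
# clamped inputs and sigmoidal output units. The update combines a Hebbian
# term (input-output correlation with clamped data) and an anti-Hebbian term
# (correlation with model-generated outputs), loosely in the spirit of
# Boltzmann-machine learning. `info_weight` is a hypothetical stand-in for
# the paper's per-synapse information weighting.
import numpy as np

rng = np.random.default_rng(0)

def sample_outputs(W, x, rng):
    """Sample binary outputs from sigmoidal firing probabilities."""
    p = 1.0 / (1.0 + np.exp(-x @ W))            # P(y_j = 1 | x)
    return (rng.random(p.shape) < p).astype(float), p

def learn(X, n_out=2, lr=0.1, epochs=200):
    n_in = X.shape[1]
    W = 0.01 * rng.standard_normal((n_in, n_out))
    for _ in range(epochs):
        y, p = sample_outputs(W, X, rng)
        # Hebbian term: correlation of clamped inputs with stochastic outputs.
        hebb = X.T @ y / len(X)
        # Anti-Hebbian term: correlation with outputs resampled from the
        # model's marginal output statistics (a crude free-running phase).
        y_free = (rng.random(y.shape) < p.mean(axis=0)).astype(float)
        anti = X.T @ y_free / len(X)
        # Hypothetical per-output weighting standing in for the "amount of
        # transmitted information in the learning synapse".
        info_weight = np.abs(p.mean(axis=0) - 0.5) * 2.0
        W += lr * info_weight * (hebb - anti)
    return W

# Toy 4-unit one-hot patterns, in the spirit of the encoder problem.
X = np.eye(4)
W = learn(X)
print(np.round(W, 2))
```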
Keywords :
Boltzmann machines; Hebbian learning; information theory; neural nets; unsupervised learning; Boltzmann machine; Hebbian learning; anti-Hebbian learning; information theory; proportionality constant; self-organization; stochastic neural networks; unsupervised learning; Equations; Information theory; Intelligent networks; Mutual information; Neural networks; Neurons; Research and development; Stochastic processes; Unsupervised learning;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
Type :
conf
DOI :
10.1109/IJCNN.1993.713958
Filename :
713958