DocumentCode :
2695467
Title :
Stochastic competitive learning
Author :
Kosko, Bart
fYear :
1990
fDate :
17-21 June 1990
Firstpage :
215
Abstract :
The probabilistic foundations of competitive learning systems are developed. Continuous and discrete formulations of unsupervised, supervised, and differential competitive learning systems are studied. These systems estimate an unknown probability density function from random pattern samples and behave as adaptive vector quantizers. Synaptic vectors in feedforward competitive neural networks quantize the pattern space and converge to pattern class centroids or local probability maxima. The stochastic calculus and a Lyapunov argument prove that competitive synaptic vectors converge to centroids exponentially quickly. Convergence does not depend on a specific dynamical model of how neuronal activations change.
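The abstract's central mechanism, competitive learning as adaptive vector quantization, can be illustrated with a minimal sketch. This is not the paper's formulation, only an assumed winner-take-all rule in which the synaptic vector nearest each random pattern sample moves toward that sample and thereby drifts to its pattern-class centroid; function names, the learning rate, and the synthetic clusters are all illustrative choices.

```python
import random

def competitive_learning(samples, num_units, lr=0.05, epochs=20, seed=0):
    """Unsupervised competitive learning sketch: only the winning
    synaptic vector moves toward each sample, so each unit drifts
    toward the centroid of the patterns it wins."""
    rng = random.Random(seed)
    # Initialize synaptic vectors from random pattern samples.
    weights = [list(s) for s in rng.sample(samples, num_units)]
    dim = len(samples[0])
    for _ in range(epochs):
        for x in samples:
            # Winner = synaptic vector nearest the input pattern.
            win = min(range(num_units),
                      key=lambda j: sum((x[d] - weights[j][d]) ** 2
                                        for d in range(dim)))
            # Move only the winner toward the sample (winner-take-all).
            for d in range(dim):
                weights[win][d] += lr * (x[d] - weights[win][d])
    return weights

# Two well-separated Gaussian clusters: each synaptic vector should
# settle near one cluster centroid, quantizing the pattern space.
rng = random.Random(1)
cluster_a = [(rng.gauss(0.0, 0.1), rng.gauss(0.0, 0.1)) for _ in range(100)]
cluster_b = [(rng.gauss(5.0, 0.1), rng.gauss(5.0, 0.1)) for _ in range(100)]
w = competitive_learning(cluster_a + cluster_b, num_units=2)
```

Under these assumptions the two learned vectors end up near (0, 0) and (5, 5), the centroids of the two pattern classes, echoing the abstract's claim that synaptic vectors converge to pattern class centroids.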
Keywords :
learning systems; neural nets; probability; stochastic processes; Lyapunov argument; competitive learning systems; competitive synaptic vectors; differential competitive learning; discrete formulations; feedforward competitive neural networks; local probability maxima; neuronal activations; pattern class centroids; pattern space; probabilistic foundations; stochastic calculus; stochastic competitive learning; unknown probability density function; vector quantizers
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137718
Filename :
5726677