Title :
On competitive learning
Author_Institution :
Sch. of Comput. & Math., Deakin Univ., Geelong, Vic., Australia
Date :
9/1/1997 12:00:00 AM
Abstract :
We derive learning rates such that all training patterns are statistically equally important and the learning outcome is independent of the order in which training patterns are presented, provided the competitive neurons win the same sets of training patterns regardless of the order of presentation. We show that, under these schemes, the learning rules in the two weight-normalization approaches, the length-constraint and the sum-constraint, yield practically the same results, provided the competitive neurons win the same sets of training patterns under both constraints. These theoretical results are illustrated with computer simulations.
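The order-independence described in the abstract can be illustrated with a standard count-based learning-rate schedule: if each winning neuron uses a rate of 1/n (n = its number of wins so far), its weight vector becomes the running mean of the patterns it has won, which is the same regardless of presentation order whenever the winner assignments coincide. This is a minimal sketch of that well-known scheme, not the paper's exact derivation; the function name and data are illustrative.

```python
import numpy as np

def competitive_learn(patterns, weights):
    """One pass of winner-take-all learning with 1/n learning rates.

    With the count-based rate eta = 1/n_i, a neuron's weight after k wins
    equals the mean of the k patterns it won, so the result does not
    depend on presentation order when the winner sets are the same.
    """
    weights = weights.astype(float).copy()
    wins = np.zeros(len(weights))  # win count per neuron
    for x in patterns:
        i = np.argmin(np.linalg.norm(weights - x, axis=1))  # nearest-weight winner
        wins[i] += 1
        eta = 1.0 / wins[i]                 # count-based learning rate
        weights[i] += eta * (x - weights[i])  # step toward the pattern
    return weights

# Two well-separated clusters, so the winner sets are order-independent.
pats = np.array([[0., 1.], [1., 0.], [0., -1.], [-1., 0.],
                 [10., 11.], [11., 10.], [10., 9.], [9., 10.]])
w0 = np.array([[0., 0.], [10., 10.]])

wa = competitive_learn(pats, w0)        # forward presentation order
wb = competitive_learn(pats[::-1], w0)  # reversed presentation order
# wa and wb agree: each weight is the mean of the cluster that neuron wins.
```

Here the first win (eta = 1) overwrites the initial weight, so the initial values do not bias the final means; only the set of patterns each neuron wins matters.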
Keywords :
constraint handling; learning (artificial intelligence); neural nets; competitive learning; competitive neurons; learning rules; length-constraint; neural networks; sum-constraint; training patterns; weight normalization; Analytical models; Computer simulation; Convergence; Eigenvalues and eigenfunctions; Filters; Fluctuations; Gaussian distribution; Least squares approximation; Neural networks; Signal processing algorithms;
Journal_Title :
IEEE Transactions on Neural Networks