• DocumentCode
    1553522
  • Title
    On competitive learning
  • Author
    Wang, Lipo
  • Author_Institution
    Sch. of Comput. & Math., Deakin Univ., Geelong, Vic., Australia
  • Volume
    8
  • Issue
    5
  • fYear
    1997
  • fDate
    9/1/1997
  • Firstpage
    1214
  • Lastpage
    1217
  • Abstract
    We derive learning rates such that all training patterns are equally important statistically and the learning outcome is independent of the order in which training patterns are presented, provided the competitive neurons win the same sets of training patterns regardless of the order of presentation. We show that under these schemes, the learning rules in the two different weight normalization approaches, the length-constraint and the sum-constraint, yield practically the same results, provided the competitive neurons win the same sets of training patterns under both constraints. These theoretical results are illustrated with computer simulations (see also the illustrative sketch following this record).
  • Keywords
    constraint handling; learning (artificial intelligence); neural nets; competitive learning; competitive neurons; learning rules; length-constraint; neural networks; sum-constraint; training patterns; weight normalization; Analytical models; Computer simulation; Convergence; Eigenvalues and eigenfunctions; Filters; Fluctuations; Gaussian distribution; Least squares approximation; Neural networks; Signal processing algorithms;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/72.623224
  • Filename
    623224
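
As a rough illustration of the result stated in the abstract (a sketch, not the paper's own simulations), the Python code below runs winner-take-all competitive learning with a per-neuron learning rate of 1/k on a neuron's k-th win. Under that schedule each weight vector is the running mean of the patterns its neuron has won, so every won pattern carries equal statistical weight and the final weights do not depend on presentation order, provided both orders produce the same win sets, which is the condition stated in the abstract. The function name, random seed, initial weights, and two-cluster synthetic data are illustrative assumptions.

import numpy as np

def competitive_learning(patterns, weights):
    """One pass of winner-take-all learning with learning rate 1/k on a neuron's k-th win."""
    w = weights.astype(float)
    wins = np.zeros(len(w), dtype=int)                # win count per neuron
    for x in patterns:
        j = np.argmin(np.linalg.norm(w - x, axis=1))  # nearest-weight (winning) neuron
        wins[j] += 1
        w[j] += (x - w[j]) / wins[j]                  # eta_k = 1/k -> running mean of won patterns
    return w

rng = np.random.default_rng(0)
# Two well-separated clusters, so both presentation orders yield the same win sets.
cluster_a = rng.normal(loc=[0.0, 0.0], scale=0.1, size=(50, 2))
cluster_b = rng.normal(loc=[5.0, 5.0], scale=0.1, size=(50, 2))
patterns = np.vstack([cluster_a, cluster_b])

init = np.array([[0.5, 0.5], [4.5, 4.5]])             # one initial weight near each cluster
w_forward = competitive_learning(patterns, init)
w_shuffled = competitive_learning(rng.permutation(patterns), init)

print(np.allclose(w_forward, w_shuffled))             # True: order-independent outcome
print(w_forward)                                      # approximately the two cluster means

Under these assumptions both passes end at the two cluster means, so the comparison holds up to floating-point rounding; with a fixed learning rate (e.g. 0.1) the update would weight recently presented patterns more heavily and the final weights would depend on the presentation order.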