  • DocumentCode
    348661
  • Title
    Effective neural network training with a different learning rate for each weight
  • Author
    Magoulas, G.D.; Plagianakos, Vassilis P.; Vrahatis, M.N.
  • Author_Institution
    Dept. of Informatics, Univ. of Athens, Greece
  • Volume
    1
  • fYear
    1999
  • fDate
    1999
  • Firstpage
    591
  • Abstract
    Batch training algorithms with a different learning rate for each weight are investigated. The adaptive learning rate algorithms of this class that apply inexact one-dimensional subminimization are analyzed and their global convergence is studied. Simulations are conducted to evaluate the convergence behavior of two training algorithms of this class and to compare them with several popular training methods. (An illustrative per-weight update sketch follows this record.)
  • Keywords
    adaptive systems; convergence; learning (artificial intelligence); minimisation; neural nets; adaptive learning rate algorithms; batch training algorithms; different learning rate; error function minimization; global convergence; inexact one-dimensional subminimization; neural network training; simulations; weighting; Artificial neural networks; Convergence; Neural networks; Nonlinear equations; Tires
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Electronics, Circuits and Systems, 1999. Proceedings of ICECS '99. The 6th IEEE International Conference on
  • Conference_Location
    Pafos
  • Print_ISBN
    0-7803-5682-9
  • Type
    conf
  • DOI
    10.1109/ICECS.1999.812354
  • Filename
    812354
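
The abstract above describes the general class of batch training methods that keep a separate adaptive learning rate for every weight. As a rough Python sketch of that class (not the authors' specific algorithm, which relies on inexact one-dimensional subminimization), the code below grows each weight's step size while the sign of its full-batch gradient stays stable across epochs and shrinks it when the sign flips; all function names and constants are illustrative assumptions.

    import numpy as np

    # Illustrative sketch only (not the algorithm analyzed in the paper):
    # full-batch gradient descent in which every weight keeps its own
    # learning rate, grown while the gradient sign is stable across epochs
    # and shrunk when it flips. Names and constants are assumptions.
    def train_batch(weights, grad_fn, epochs=100, lr_init=0.01, up=1.2, down=0.5):
        lrs = np.full_like(weights, lr_init)   # one learning rate per weight
        prev_grad = np.zeros_like(weights)
        for _ in range(epochs):
            grad = grad_fn(weights)            # gradient of the batch error function
            prod = grad * prev_grad            # sign agreement with previous epoch
            lrs = np.where(prod > 0, lrs * up,
                           np.where(prod < 0, lrs * down, lrs))
            weights = weights - lrs * grad     # per-weight update
            prev_grad = grad
        return weights

    # Example: minimize the quadratic error E(w) = 0.5 * ||w - t||^2
    t = np.array([1.0, -2.0, 3.0])
    w = train_batch(np.zeros(3), lambda w: w - t)

This sign-based rule is only one member of the per-weight adaptive learning rate family; the paper studies members of this family that perform inexact one-dimensional subminimization along each weight direction and proves their global convergence.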