  • DocumentCode
    285226
  • Title
    Accelerated learning in multilayer networks
  • Author
    Atiya, A.; Parlos, A.; Muthusami, J.; Fernandez, B.; Tsai, W.
  • Author_Institution
    Dynamica Inc., Houston, TX, USA
  • Volume
    3
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    925
  • Abstract
    An accelerated learning algorithm, adaptive backpropagation (ABP), is proposed for the supervised training of multilayer networks. The algorithm is based on the principle of forced dynamics for the error functional. It does not require any information from previous updates, while it requires knowledge of exactly the same error terms used in standard backpropagation. Numerical simulation results indicate that there are certain advantages to using ABP. The method is consistently about an order of magnitude faster than standard backpropagation, and also faster than accelerated algorithms such as quickprop. There is no added tuning parameter other than the learning rate, to which ABP appears to be less sensitive. However, drawbacks remain: jumpy behavior in the vicinity of local minima and the inability to eventually reach the global minimum.
  • Keywords
    backpropagation; feedforward neural nets; adaptive backpropagation; forced dynamics; jumpy behavior; learning rate; local minima; multilayer networks; quickprop; supervised training; Acceleration; Backpropagation algorithms; Control systems; Convergence; Force control; Force feedback; Intelligent networks; Linear feedback control systems; Nonhomogeneous media; Nonlinear control systems;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.227081
  • Filename
    227081
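
  A minimal sketch in Python/NumPy of the forced-dynamics idea described in the abstract. The discretization used here (scaling the standard backpropagation gradient by E / ||grad E||^2 so that the error roughly follows dE/dt = -eta*E), the XOR toy data, and the network sizes are illustrative assumptions, not the paper's exact formulation; as in the abstract, the learning rate eta is the only tuning parameter.

    # Hedged sketch of a forced-dynamics ("ABP"-style) update on a tiny network.
    # The step scaling E / ||grad E||^2 is an assumed discretization of the
    # forced-decay condition dE/dt = -eta*E, not the authors' exact rule.
    import numpy as np

    rng = np.random.default_rng(0)

    # XOR toy problem (illustrative data, not from the paper).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # Two-layer network: 2 -> 4 -> 1, sigmoid activations (assumed sizes).
    W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    eta = 0.5  # the single tuning parameter (learning rate)

    for epoch in range(2000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        err = y - T
        E = 0.5 * np.sum(err ** 2)          # error functional

        # Backward pass: exactly the error terms standard backprop uses.
        d2 = err * y * (1 - y)              # output-layer delta
        d1 = (d2 @ W2.T) * h * (1 - h)      # hidden-layer delta
        gW2, gb2 = h.T @ d2, d2.sum(0)
        gW1, gb1 = X.T @ d1, d1.sum(0)

        # Forced dynamics: choose the step so that dE/dt is about -eta*E,
        # i.e. scale the gradient by E / ||grad E||^2 (assumed form).
        gnorm2 = (gW1**2).sum() + (gb1**2).sum() + (gW2**2).sum() + (gb2**2).sum()
        step = eta * E / (gnorm2 + 1e-12)

        W1 -= step * gW1; b1 -= step * gb1
        W2 -= step * gW2; b2 -= step * gb2

    print("final error:", E)

  Note that the update needs no history from previous iterations and reuses the same backpropagated deltas as standard gradient descent, which is the property the abstract emphasizes.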