• DocumentCode
    322667
  • Title
    Modified EBP algorithm with instant training of the hidden layer
  • Author
    Wilamowski, Bogdan M.
  • Author_Institution
    Dept. of Electr. Eng., Wyoming Univ., Laramie, WY, USA
  • Volume
    3
  • fYear
    1997
  • fDate
    9-14 Nov 1997
  • Firstpage
    1097
  • Abstract
    Several algorithms for training feedforward neural networks, including the steepest descent EBP (error backpropagation) and Levenberg-Marquardt methods, are compared. Various techniques to improve the convergence of EBP are also reviewed. A very fast training algorithm with instant training of the hidden layer is introduced. For easy problems it has a convergence rate similar to that of the Levenberg-Marquardt (LM) method. The algorithm sustains this fast convergence rate even in cases where the LM algorithm fails and the EBP algorithm converges at a practically unacceptable rate.
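    For orientation, the sketch below shows the two generic weight-update rules named in the abstract as baselines: steepest-descent EBP and Levenberg-Marquardt. This is only a minimal illustration of those standard methods, not the paper's modified algorithm with instant training of the hidden layer; the helper names (predict, residuals, etc.) and the finite-difference Jacobian are assumptions made for this example.

```python
# Illustrative sketch of the two baseline update rules (steepest-descent EBP
# and Levenberg-Marquardt); NOT the paper's instant-hidden-layer algorithm.
import numpy as np

def residuals(w, X, y, predict):
    """Error vector e = predict(w, X) - y for weight vector w (predict is assumed)."""
    return predict(w, X) - y

def numerical_jacobian(w, X, y, predict, eps=1e-6):
    """Finite-difference Jacobian of the residuals with respect to the weights."""
    e0 = residuals(w, X, y, predict)
    J = np.zeros((e0.size, w.size))
    for k in range(w.size):
        w_k = w.copy()
        w_k[k] += eps
        J[:, k] = (residuals(w_k, X, y, predict) - e0) / eps
    return J

def ebp_step(w, X, y, predict, lr=0.1):
    """Steepest-descent EBP step: w <- w - lr * J^T e, the gradient of 0.5*||e||^2."""
    e = residuals(w, X, y, predict)
    J = numerical_jacobian(w, X, y, predict)
    return w - lr * (J.T @ e)

def lm_step(w, X, y, predict, mu=1e-2):
    """Levenberg-Marquardt step: w <- w - (J^T J + mu*I)^{-1} J^T e."""
    e = residuals(w, X, y, predict)
    J = numerical_jacobian(w, X, y, predict)
    H = J.T @ J + mu * np.eye(w.size)
    return w - np.linalg.solve(H, J.T @ e)
```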
  • Keywords
    backpropagation; feedforward neural nets; Levenberg-Marquardt algorithm; convergence rate; feedforward neural networks; hidden layer; instant training; steepest descent error backpropagation; Backpropagation algorithms; Convergence; Feedforward neural networks; Jacobian matrices; Least squares methods; Neural networks; Neurons; Newton method; Stability; Stochastic processes
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Industrial Electronics, Control and Instrumentation, 1997. IECON 97. 23rd International Conference on
  • Conference_Location
    New Orleans, LA
  • Print_ISBN
    0-7803-3932-0
  • Type
    conf
  • DOI
    10.1109/IECON.1997.668437
  • Filename
    668437