• DocumentCode
    288387
  • Title
    Two adaptive stepsize rules for gradient descent and their application to the training of feedforward artificial neural networks
  • Author
    Mohandes, Mohamed ; Codrington, Craig W. ; Gelfand, Saul B.
  • Author_Institution
    Sch. of Electr. Eng., Purdue Univ., West Lafayette, IN, USA
  • Volume
    1
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    555
  • Abstract
    Gradient descent, in the form of the well-known backpropagation algorithm, is frequently used to train feedforward neural networks, i.e., to find the weights which minimize some error measure ε. Generally, the stepsize is fixed and represents a compromise between stability and speed of convergence. In this paper, we derive two methods for adapting the stepsize and apply them to train neural networks on parity problems of various sizes. (An illustrative stepsize-adaptation sketch follows this record.)
  • Keywords
    backpropagation; convergence of numerical methods; feedforward neural nets; adaptive stepsize rules; convergence; error measure; feedforward neural networks; gradient descent; learning; stability; Artificial neural networks; Backpropagation algorithms; Convergence; Feedforward neural networks; Joining processes; Neural networks; Neurons; Newton method; Stability; Time measurement
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374225
  • Filename
    374225
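  • Illustrative_Sketch
    For context only, and not the two stepsize rules derived in the paper: the sketch below shows generic gradient-descent (backpropagation) training with a simple adaptive-stepsize heuristic (the "bold driver" rule, chosen here purely for illustration) on the 2-bit parity (XOR) problem. The network size, initial stepsize, and adaptation constants are all assumptions, not values from the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      # 2-bit parity (XOR) training set
      X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
      T = np.array([[0.], [1.], [1.], [0.]])

      # 2-2-1 feedforward network with sigmoid units (sizes are an arbitrary choice)
      W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
      W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

      def forward(X):
          H = sigmoid(X @ W1 + b1)          # hidden activations
          return H, sigmoid(H @ W2 + b2)    # hidden, output

      eta = 0.5            # initial stepsize (assumed value)
      prev_err = np.inf

      for epoch in range(5000):
          H, Y = forward(X)
          err = 0.5 * np.sum((Y - T) ** 2)  # error measure (epsilon in the abstract)

          # "Bold driver" heuristic (an assumed stand-in for the paper's rules):
          # grow the stepsize while the error decreases, shrink it when it increases.
          eta = eta * 1.05 if err < prev_err else eta * 0.5
          prev_err = err

          # Backpropagation of the squared-error gradient
          dY = (Y - T) * Y * (1.0 - Y)
          dH = (dY @ W2.T) * H * (1.0 - H)
          W2 -= eta * (H.T @ dY); b2 -= eta * dY.sum(axis=0)
          W1 -= eta * (X.T @ dH); b1 -= eta * dH.sum(axis=0)

      H, Y = forward(X)
      print("final error:", 0.5 * np.sum((Y - T) ** 2), "final stepsize:", eta)
      print("network outputs:", Y.ravel().round(3))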