• DocumentCode
    1842633
  • Title
    Efficient algorithm for training neural networks with one hidden layer
  • Author
    Wilamowski, Bogdan M.; Chen, Yixin; Malinowski, Aleksander

  • Author_Institution
    Dept. of EE, Wyoming Univ., Laramie, WY, USA
  • Volume
    3
  • fYear
    1999
  • fDate
    1999
  • Firstpage
    1725
  • Abstract
    An efficient second-order algorithm for training feedforward neural networks is presented. The algorithm has a convergence rate similar to that of the Levenberg-Marquardt (LM) method, while being less computationally intensive and requiring less memory. This is especially important for large neural networks, where the LM algorithm becomes impractical. The algorithm was verified with several examples.
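To see why standard LM becomes impractical for large networks, as the abstract notes, here is a minimal sketch of one classic Levenberg-Marquardt update step (this illustrates the baseline method, not the paper's modified algorithm; all names and the toy fitting problem are illustrative assumptions):

```python
import numpy as np

def lm_step(J, e, mu=0.01):
    """One classic Levenberg-Marquardt parameter update (illustrative sketch).

    J  : Jacobian of residuals w.r.t. parameters, shape (n_samples, n_params)
    e  : residual vector, shape (n_samples,)
    mu : damping factor (large mu -> gradient descent, small mu -> Gauss-Newton)

    Note: H = J^T J is n_params x n_params, so memory and the cost of the
    linear solve grow quadratically/cubically with network size -- the
    bottleneck the abstract refers to.
    """
    n_params = J.shape[1]
    H = J.T @ J + mu * np.eye(n_params)   # damped approximate Hessian
    g = J.T @ e                           # gradient of 0.5 * ||e||^2
    return -np.linalg.solve(H, g)         # parameter increment

# Toy usage: fit a one-parameter linear model y = w*x with true w = 2.
x = np.linspace(0.0, 1.0, 10)
w = 0.0
for _ in range(20):
    e = w * x - 2.0 * x          # residuals against the target outputs
    J = x[:, None]               # d(residual)/dw for each sample
    w += lm_step(J, e)[0]
print(round(w, 4))  # converges to 2.0
```

The damping term `mu * I` interpolates between gradient descent and Gauss-Newton; in a full LM implementation `mu` is adapted between iterations rather than held fixed as in this sketch.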
  • Keywords
    computational complexity; convergence; feedforward neural nets; learning (artificial intelligence); multilayer perceptrons; convergence rate; efficient second-order algorithm; feedforward neural network training; hidden neural layer; modified Levenberg-Marquardt method; Backpropagation algorithms; Convergence; Equations; Feedforward neural networks; Jacobian matrices; Neural networks; Neurons; Performance analysis; Stability; Stochastic processes
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 1999. IJCNN '99. International Joint Conference on
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-5529-6
  • Type
    conf
  • DOI
    10.1109/IJCNN.1999.832636
  • Filename
    832636