• DocumentCode
    857499
  • Title
    Convergence of gradient method with momentum for two-layer feedforward neural networks
  • Author
    Zhang, Naimin ; Wu, Wei ; Zheng, Gaofeng
  • Author_Institution
    Math. & Inf. Sci. Coll., Wenzhou Univ., China
  • Volume
    17
  • Issue
    2
  • fYear
    2006
  • fDate
    3/1/2006 12:00:00 AM
  • Firstpage
    522
  • Lastpage
    525
  • Abstract
    A gradient method with momentum for two-layer feedforward neural networks is considered. The learning rate is set to be a constant, and the momentum factor is an adaptive variable. Both weak and strong convergence results are proved, together with convergence rates for the error function and for the weights. Compared with existing convergence results, ours are more general, since the error function is not required to be quadratic.
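    The method described in the abstract, gradient descent with momentum, a constant learning rate, and an adaptive momentum factor, can be sketched for a toy two-layer network as below. The paper's exact adaptive rule for the momentum factor is not reproduced here; scaling it by the current gradient norm is an illustrative heuristic only, and all names and constants are assumptions.

    ```python
    import numpy as np

    # Toy two-layer feedforward net: x -> tanh(W1 x) -> w2 . h (scalar output).
    # Update rule: delta_w = -eta * grad E(w) + tau_k * previous delta_w,
    # with constant learning rate eta and adaptive momentum factor tau_k.
    # NOTE: tau_k proportional to the gradient norm is an illustrative choice,
    # not the authors' formula.

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 3))                 # 20 samples, 3 inputs
    y = np.sin(X @ np.array([1.0, -0.5, 0.3]))   # smooth scalar targets

    W1 = rng.normal(scale=0.5, size=(4, 3))      # hidden-layer weights (4 units)
    w2 = rng.normal(scale=0.5, size=4)           # output-layer weights

    eta = 0.05                                   # constant learning rate
    dW1 = np.zeros_like(W1)                      # previous updates (momentum terms)
    dw2 = np.zeros_like(w2)

    def forward(W1, w2, X):
        H = np.tanh(X @ W1.T)                    # hidden activations
        return H, H @ w2                         # network output

    def error(W1, w2):
        _, out = forward(W1, w2, X)
        return 0.5 * np.mean((out - y) ** 2)     # mean squared error

    e0 = error(W1, w2)                           # initial error, for comparison
    for k in range(200):
        H, out = forward(W1, w2, X)
        r = (out - y) / len(y)                   # scaled residuals
        g2 = H.T @ r                             # dE/dw2
        dh = np.outer(r, w2) * (1.0 - H ** 2)    # backprop through tanh
        g1 = dh.T @ X                            # dE/dW1
        gnorm = np.sqrt((g1 ** 2).sum() + (g2 ** 2).sum())
        tau = min(0.9, gnorm)                    # adaptive momentum factor (illustrative)
        dW1 = -eta * g1 + tau * dW1              # momentum update for W1
        dw2 = -eta * g2 + tau * dw2              # momentum update for w2
        W1 += dW1
        w2 += dw2
    ```

    With the error function driving the gradients, the training error decreases from its initial value, which is the monotonicity property underlying the weak convergence result.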
  • Keywords
    convergence; feedforward neural nets; gradient methods; method of moments; adaptive variable; error function; gradient method convergence; learning rate; momentum factor; two-layer feedforward neural networks; Defense industry; Feedforward neural networks; Information science; Mathematics; Minimization methods; Multi-layer neural network; Neural networks; feedforward neural network; gradient method; momentum; Algorithms; Artificial Intelligence; Computer Simulation; Models, Theoretical; Neural Networks (Computer); Numerical Analysis, Computer-Assisted; Pattern Recognition, Automated
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/TNN.2005.863460
  • Filename
    1603637