• DocumentCode
    1647847
  • Title

    An integrated algorithm of magnified gradient function and weight evolution for solving local minima problem

  • Author

    Ng, S.C.; Leung, S.H.; Luk, A.

  • Volume
    1
  • fYear
    2002
  • Firstpage
    767
  • Lastpage
    772
  • Abstract
    This paper presents the integration of the magnified gradient function and weight evolution algorithms to solve the local minima problem. The combination of the two algorithms yields a significant improvement in convergence rate and global search capability compared with common fast learning algorithms such as standard backpropagation, Quickprop, resilient propagation, SARPROP, and genetic algorithms.
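
    The abstract only names the two components, so the following is a minimal illustrative sketch of the general idea rather than the authors' exact method: a NumPy backpropagation loop in which the sigmoid derivative term is raised to the power 1/S (a stand-in for the magnified gradient function) and stalled training triggers a random perturbation of hidden-layer weights (a crude stand-in for weight evolution). The constants S, LR, STALL_WINDOW, and PERTURB_SCALE, and the XOR benchmark, are assumptions for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # XOR: a classic small benchmark known for local-minimum difficulties.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    S = 2.0             # magnification exponent (assumed; the idea needs S >= 1)
    LR = 0.5            # learning rate (assumed)
    STALL_WINDOW = 200  # epochs without improvement before perturbing (assumed)
    PERTURB_SCALE = 0.3 # perturbation magnitude (assumed)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer of 3 units.
    W1 = rng.normal(scale=0.5, size=(2, 3)); b1 = np.zeros(3)
    W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)

    best_err, since_best = np.inf, 0
    for epoch in range(20000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        err = 0.5 * np.sum((T - y) ** 2)

        # Magnified gradient: raise the sigmoid derivative term to 1/S,
        # which enlarges small derivatives (y*(1-y) <= 0.25) and so keeps
        # the error signal from vanishing in flat regions.
        dy = (y * (1.0 - y)) ** (1.0 / S)
        delta_out = (y - T) * dy
        dh = (h * (1.0 - h)) ** (1.0 / S)
        delta_hid = (delta_out @ W2.T) * dh

        # Gradient-descent weight update.
        W2 -= LR * h.T @ delta_out; b2 -= LR * delta_out.sum(axis=0)
        W1 -= LR * X.T @ delta_hid; b1 -= LR * delta_hid.sum(axis=0)

        # Weight-evolution stand-in: if the error has not improved for a
        # while, randomly perturb a subset of the hidden-layer weights to
        # try to escape the current minimum.
        if err < best_err - 1e-6:
            best_err, since_best = err, 0
        else:
            since_best += 1
            if since_best >= STALL_WINDOW:
                mask = rng.random(W1.shape) < 0.5
                W1 += mask * rng.normal(scale=PERTURB_SCALE, size=W1.shape)
                since_best = 0

        if err < 1e-3:
            print(f"converged at epoch {epoch}, error {err:.5f}")
            break
    ```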
  • Keywords
    backpropagation; convergence of numerical methods; feedforward neural nets; genetic algorithms; gradient methods; mathematics computing; Quickprop; SARPROP; convergence; feedforward neural networks; global search; learning algorithms; local minima problem; magnified gradient function; resilient propagation; weight evolution algorithms; Australia; Computational modeling; Convergence; Equations; Feedforward neural networks; Investments; Neural networks; Optimization methods; Simulated annealing
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
  • Conference_Location
    Honolulu, HI
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7278-6
  • Type

    conf

  • DOI
    10.1109/IJCNN.2002.1005570
  • Filename
    1005570