• DocumentCode
    3212432
  • Title
    A new super-memory gradient method for unconstrained optimization
  • Author
    Tang, Jingyong ; Dong, Li

  • Author_Institution
    Coll. of Math. & Inf. Sci., Xinyang Normal Univ., Xinyang, China
  • Volume
    1
  • fYear
    2010
  • fDate
    13-14 Sept. 2010
  • Firstpage
    93
  • Lastpage
    96
  • Abstract
In this paper, we propose a new super-memory gradient method for unconstrained optimization problems. Global convergence and a linear convergence rate are proved under mild conditions. The method uses current and previous iterative information to generate a new search direction and uses a Wolfe line search to determine the step-size at each iteration. It has a simple structure and avoids the computation and storage of matrices, which makes it suitable for solving large-scale optimization problems. Numerical experiments show that the new algorithm is effective in practical computation.
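The abstract does not give the authors' exact update rule, but the general shape of a (super-)memory gradient iteration can be sketched as follows: the search direction combines the negative gradient with the last few search directions, and a Wolfe line search picks the step-size. The mixing weights (`eta ** i` damping, normalization by the previous direction's norm) below are illustrative assumptions, not the formula from the paper; the Wolfe line search is a standard bisection scheme for the weak Wolfe conditions.

```python
# Illustrative sketch of a memory-gradient iteration. The direction is
#   d_k = -g_k + sum_{i=1..m} w_i * d_{k-i},
# where the weights w_i are a hypothetical damped choice that keeps d_k a
# descent direction; the paper's specific coefficients are not reproduced here.
import math


def wolfe_step(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection-style line search for the weak Wolfe conditions."""
    fx = f(x)
    slope = sum(gi * di for gi, di in zip(grad(x), d))  # directional derivative
    lo, hi = 0.0, float('inf')
    for _ in range(max_iter):
        xn = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xn) > fx + c1 * alpha * slope:
            hi = alpha                                   # Armijo fails: shrink
            alpha = 0.5 * (lo + hi)
        elif sum(gi * di for gi, di in zip(grad(xn), d)) < c2 * slope:
            lo = alpha                                   # curvature fails: grow
            alpha = 2.0 * alpha if hi == float('inf') else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha


def memory_gradient(f, grad, x0, m=2, eta=0.5, tol=1e-6, max_iter=500):
    """Minimize f with a memory-gradient direction built from the last m steps."""
    x = list(x0)
    history = []  # previous search directions
    for _ in range(max_iter):
        g = grad(x)
        gnorm = math.sqrt(sum(gi * gi for gi in g))
        if gnorm < tol:
            break
        d = [-gi for gi in g]
        # Mix in previous directions with damped, norm-scaled weights
        # (an assumed choice; with eta < 0.5 the mix stays a descent direction).
        for i, dprev in enumerate(reversed(history[-m:]), start=1):
            dn = math.sqrt(sum(di * di for di in dprev)) + 1e-16
            w = (eta ** i) * gnorm / dn
            d = [di + w * dpi for di, dpi in zip(d, dprev)]
        if sum(gi * di for gi, di in zip(g, d)) >= 0:
            d = [-gi for gi in g]  # safeguard: fall back to steepest descent
        a = wolfe_step(f, grad, x, d)
        x = [xi + a * di for xi, di in zip(x, d)]
        history.append(d)
    return x
```

As claimed in the abstract, this structure stores only a handful of vectors (the last `m` directions) and never forms or stores a matrix, which is what makes such methods attractive for large-scale problems. For example, on the quadratic `f(x) = x1**2 + 10*x2**2` the iteration converges to the minimizer at the origin.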
  • Keywords
convergence; gradient methods; matrix algebra; optimisation; Wolfe line search; global convergence; iterative information; linear convergence; search direction; super memory gradient method; unconstrained optimization; Convergence; Gradient methods; Iterative methods; Minimization; Search methods; Wolfe search rule; convergence; super-memory gradient method; unconstrained optimization;
  • fLanguage
    English
  • Publisher
IEEE
  • Conference_Titel
2010 Second International Conference on Computational Intelligence and Natural Computing (CINC)
  • Conference_Location
    Wuhan
  • Print_ISBN
    978-1-4244-7705-0
  • Type
    conf
  • DOI
    10.1109/CINC.2010.5643886
  • Filename
    5643886