DocumentCode :
2343314
Title :
Two Classes of Conjugate Gradient Methods for Large-Scale Unconstrained Optimization
Author :
Cao, Ming-yuan ; Yang, Yue-ting
Author_Institution :
Dept. of Math., Beihua Univ., Jilin, China
fYear :
2011
fDate :
15-19 April 2011
Firstpage :
37
Lastpage :
40
Abstract :
Two classes of new nonlinear conjugate gradient methods are proposed to avoid the drawbacks of the Fletcher-Reeves (FR) and Conjugate Descent (CD) methods. Using induction and proof by contradiction, we show that the methods possess the sufficient descent property independently of any line search, and that they converge globally under the Wolfe line search. Numerical results on 10 classical unconstrained optimization problems indicate that the proposed methods outperform comparable methods in iteration counts and in the numbers of function and gradient evaluations, confirming their effectiveness.
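The record does not include the paper's two new beta formulas, so as a hedged illustration of the general scheme the abstract describes, here is a minimal nonlinear conjugate gradient sketch using the classical Fletcher-Reeves (FR) beta as a stand-in, with a simple Armijo backtracking line search in place of the full Wolfe conditions; the function names and tolerances are illustrative assumptions, not the authors' method:

```python
import math

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Nonlinear conjugate gradient sketch (NOT the paper's method).

    Uses the classical Fletcher-Reeves beta and a backtracking (Armijo)
    line search; the paper instead proposes two new beta formulas and
    analyzes convergence under the Wolfe line search."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]          # initial direction: steepest descent
    for _ in range(max_iter):
        gnorm2 = sum(gi * gi for gi in g)
        if math.sqrt(gnorm2) < tol:
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:             # safeguard: restart if d is not a descent direction
            d = [-gi for gi in g]
            slope = -gnorm2
        # Backtracking line search: halve alpha until the Armijo condition holds
        alpha, fx = 1.0, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = sum(gi * gi for gi in g_new) / gnorm2   # Fletcher-Reeves formula
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Usage: minimize a simple convex quadratic with minimizer (1, -2)
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: [2 * (x[0] - 1), 20 * (x[1] + 2)]
x_star = nonlinear_cg(f, grad, [0.0, 0.0])
```

Swapping in a different beta formula (such as the paper's proposed ones) only changes the single `beta = ...` line; the surrounding loop structure is common to this family of methods.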
Keywords :
conjugate gradient methods; optimisation; Wolfe line search; function calls; gradient calls; iteration counts; nonlinear conjugate gradient methods; unconstrained optimization problems; Convergence; Gradient methods; Operations research; Programming; conjugate gradient method; global convergence; line search; unconstrained optimization;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2011 Fourth International Joint Conference on Computational Sciences and Optimization (CSO)
Conference_Location :
Yunnan
Print_ISBN :
978-1-4244-9712-6
Electronic_ISBN :
978-0-7695-4335-2
Type :
conf
DOI :
10.1109/CSO.2011.290
Filename :
5957606