Title of article :
A class of gradient unconstrained minimization algorithms with adaptive stepsize
Author/Authors :
Vrahatis, M.N. and Androulakis, G.S. and Lambrinos, J.N. and Magoulas, G.D.
Issue Information :
Journal issue, serial year 2000
Pages :
20
From page :
367
To page :
386
Abstract :
In this paper the development, convergence theory and numerical testing of a class of gradient unconstrained minimization algorithms with adaptive stepsize are presented. The proposed class comprises four algorithms: the first two incorporate techniques for the adaptation of a common stepsize for all coordinate directions, while the other two allow an individual adaptive stepsize along each coordinate direction. All the algorithms are computationally efficient and possess interesting convergence properties, utilizing estimates of the Lipschitz constant that are obtained without additional function or gradient evaluations. The algorithms have been implemented and tested on some well-known test cases as well as on real-life artificial neural network applications, and the results have been very satisfactory.
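The key idea summarized in the abstract, adapting the stepsize from a Lipschitz-constant estimate that reuses already-computed quantities, can be sketched as follows. This is an illustrative assumption based on the standard estimate L_k = ||g_k - g_{k-1}|| / ||x_k - x_{k-1}||, not the paper's exact scheme; the function name and parameters are hypothetical.

```python
import numpy as np

def adaptive_gradient_descent(grad, x0, eta0=1e-3, tol=1e-8, max_iter=10000):
    """Gradient descent with a stepsize adapted from a running estimate
    of the local Lipschitz constant of the gradient.

    The estimate L_k = ||g_k - g_{k-1}|| / ||x_k - x_{k-1}|| reuses the
    gradients already computed for the iteration, so no additional
    function or gradient evaluations are required."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    eta = eta0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break                      # gradient small enough: stop
        x_new = x - eta * g            # common stepsize for all coordinates
        g_new = grad(x_new)
        diff_x = np.linalg.norm(x_new - x)
        diff_g = np.linalg.norm(g_new - g)
        if diff_g > 0:
            L = diff_g / diff_x        # local Lipschitz estimate, "for free"
            eta = 1.0 / L              # next stepsize from the estimate
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b, so the minimizer solves A x = b.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])
x_star = adaptive_gradient_descent(lambda x: A @ x - b, np.zeros(2))
```

The paper's last two algorithms adapt an individual stepsize per coordinate direction instead of the single scalar `eta` used here.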
Keywords :
Gradient method , Lipschitz constant , Line search strategies , Armijo's method , Globally convergent method , Training algorithm , Unconstrained optimization , Steepest Descent , Artificial neural network
Journal title :
Journal of Computational and Applied Mathematics
Serial Year :
2000
Record number :
1550731