Author/Authors :
Salihu, Nasiru — Department of Mathematics, School of Physical Sciences, Modibbo Adama University of Technology, Yola, Nigeria; Yusuf Waziri, Mohammed — Department of Mathematics, School of Physical Sciences, Modibbo Adama University of Technology, Yola, Nigeria; Sani Halilu, Abubakar — Department of Mathematics and Computer Science, Sule Lamido University, Kafin Hausa, Nigeria; Remilekun Odekunle, Mathew — Department of Mathematics, School of Physical Sciences, Modibbo Adama University of Technology, Yola, Nigeria
Abstract :
There exists a large variety of conjugate gradient algorithms. In order to take advantage of the attractive features of the Liu and Storey (LS) and Conjugate Descent (CD) conjugate gradient methods, we suggest a hybridization of these methods in which the update parameter is computed as a convex combination of the LS and CD parameters, with the combination weight obtained from the secant equation. The algorithm generates descent directions, and when the iterates jam, the direction satisfies the sufficient descent condition. We report numerical results demonstrating the efficiency of our method: the hybrid computational scheme outperforms, or is comparable with, known conjugate gradient algorithms. We also show that our method converges globally under the strong Wolfe line search conditions.
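The hybrid scheme described above can be illustrated with a minimal sketch. The paper derives the convex-combination weight from the secant equation; here that weight `theta` is a fixed user parameter as a placeholder, the strong Wolfe line search is replaced by simple Armijo backtracking, and a steepest-descent restart is added as a safeguard, so this is only a rough illustration of the hybrid LS/CD idea, not the authors' method.

```python
import numpy as np

def hybrid_ls_cd_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=500):
    """Sketch of a hybrid LS/CD conjugate gradient iteration.

    beta_k = (1 - theta) * beta_LS + theta * beta_CD, theta in [0, 1].
    In the paper theta comes from the secant equation; here it is fixed.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking (simple stand-in for a strong Wolfe search).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference y_k
        denom = -d.dot(g)                   # shared denominator of LS and CD
        beta_ls = g_new.dot(y) / denom      # Liu-Storey parameter
        beta_cd = g_new.dot(g_new) / denom  # Conjugate Descent parameter
        beta = (1.0 - theta) * beta_ls + theta * beta_cd
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:               # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = hybrid_ls_cd_cg(f, grad, np.zeros(2))
```

On a strongly convex quadratic the safeguarded iteration drives the gradient to zero, so `x_star` approximately solves `A x = b`.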
Keywords :
large-scale optimization, unconstrained optimization, conjugate gradient algorithm, secant equation, global convergence