AN IMPROVED AND FAST CONJUGATE GRADIENT COEFFICIENT FOR LARGE-SCALE OPTIMIZATION WITH REDUCED COMPUTATION TIME

Ibrahim Abdullahi¹,*, Usman Sani¹, Baba Galadima Agaie¹, and P.L. Ndukum²


Abstract

Nonlinear conjugate gradient methods (CGMs) are widely used for solving unconstrained optimization problems and are among the earliest known techniques for large-scale problems of this kind. In this paper, we propose a modified conjugate gradient coefficient β_k. The new method possesses the sufficient descent property under the Wolfe-Powell line search condition and is globally convergent. Numerical results are obtained with the strong Wolfe-Powell line search for the purpose of comparison. We use performance profiles to assess the strength of the proposed method against several existing CGMs on a set of test problems, and the results show that the proposed method is effective compared with these CGMs.

Keywords Unconstrained optimization; Conjugate gradient method; Global convergence; Conjugate gradient coefficient.
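For context, the sketch below shows a generic nonlinear CG iteration in Python: x_{k+1} = x_k + α_k d_k with d_{k+1} = -g_{k+1} + β_k d_k, where α_k is obtained from a strong Wolfe line search (here SciPy's implementation). The abstract does not state the proposed coefficient, so the classical Fletcher-Reeves formula is used purely as a placeholder for β_k; the function name nonlinear_cg and all parameter values are illustrative assumptions, not the authors' method.

import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, max_iter=1000, tol=1e-6):
    """Generic nonlinear CG skeleton. The beta_k used below is the
    Fletcher-Reeves coefficient, a placeholder only; it is NOT the
    modified coefficient proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:       # gradient small enough: stop
            break
        # Step length from SciPy's (strong) Wolfe line search along d
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                 # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)  # placeholder Fletcher-Reeves beta_k
        d = -g_new + beta * d             # update the conjugate direction
        x, g = x_new, g_new
    return x

# Usage example: minimize the Rosenbrock test function
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    x_star = nonlinear_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
    print(x_star)                         # approximately [1., 1.]

Any modified coefficient such as the one proposed in the paper would simply replace the beta line above, with convergence then depending on the descent and line search conditions discussed in the abstract.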
