Two Modified Hager and Zhang's Conjugate Gradient Algorithms For Solving Large-Scale Optimization Problems
DOI: https://doi.org/10.24297/ijct.v11i5.1145

Keywords: conjugate gradient, sufficient descent, global convergence

Abstract
At present, the conjugate gradient (CG) method of Hager and Zhang (Hager and Zhang, SIAM Journal on Optimization, 16 (2005)) is regarded as one of the most effective CG methods for optimization problems. To study this method further, we extend Hager and Zhang's CG method and present two modified CG formulas that use not only gradient values but also function values.
Moreover, the sufficient descent condition holds without any line search. Global convergence is established for nonconvex functions under suitable conditions. Numerical results show that the proposed methods are competitive with the standard conjugate gradient method.
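For readers unfamiliar with the baseline being modified, the following sketch illustrates the original Hager-Zhang CG direction update (the CG_DESCENT formula from the cited 2005 paper). The paper's two modified formulas, which additionally incorporate function values, are not reproduced here; the backtracking Armijo line search is a simple stand-in for the Wolfe line search used in CG_DESCENT.

```python
import numpy as np

def hz_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Minimize f using nonlinear CG with the Hager-Zhang beta.

    Sketch of the baseline Hager-Zhang (2005) update only;
    the two modified formulas proposed in this paper are not shown.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (stand-in for a Wolfe search).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g            # gradient difference y_k
        dy = d.dot(y)            # d_k^T y_k
        if abs(dy) < 1e-12:      # safeguard: restart with steepest descent
            beta = 0.0
        else:
            # Hager-Zhang beta: ((y - 2 d ||y||^2 / d^T y)^T g_new) / (d^T y)
            beta = (y - 2.0 * d * y.dot(y) / dy).dot(g_new) / dy
            # Lower truncation from Hager & Zhang (2005), eta = 0.01,
            # which guarantees convergence for general functions.
            eta_k = -1.0 / (np.linalg.norm(d) * min(0.01, np.linalg.norm(g)))
            beta = max(beta, eta_k)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, on the quadratic f(x) = (x1^2 + 10 x2^2)/2, the iteration drives the gradient norm below the tolerance and returns a point near the minimizer at the origin.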