Two Modified Hager and Zhang's Conjugate Gradient Algorithms For Solving Large-Scale Optimization Problems

Authors

  • Gonglin Yuan, Guangxi University
  • Yong Li, Baise University

DOI:

https://doi.org/10.24297/ijct.v11i5.1145

Keywords:

conjugate gradient, sufficient descent, global convergence

Abstract

At present, the conjugate gradient (CG) method of Hager and Zhang (Hager and Zhang, SIAM Journal on Optimization, 16(2005)) is regarded as one of the most effective CG methods for optimization problems. To study the CG method further, we extend Hager and Zhang's CG method and present two modified CG formulas, which use not only gradient information but also function value information. Moreover, the sufficient descent condition holds without any line search. Global convergence is established for nonconvex functions under suitable conditions. Numerical results show that the proposed methods are competitive with the standard conjugate gradient method.
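The paper's two modified formulas are not reproduced on this page, but the baseline they build on can be illustrated. The following is a minimal sketch of a CG iteration using the Hager and Zhang direction update, applied to a small convex quadratic f(x) = 0.5 xᵀAx - bᵀx with an exact line search (the matrix A, vector b, iteration count, and stopping tolerance are illustrative choices, not values from the paper):

```python
# Sketch of a CG iteration with the Hager-Zhang beta:
#   beta = (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y),
# where y = g_new - g. Applied to f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b, with the exact quadratic step size.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, x, y):  # returns a*x + y, elementwise
    return [a * xi + yi for xi, yi in zip(x, y)]

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite (illustrative)
b = [1.0, 2.0]

def grad(x):                    # gradient of the quadratic: A x - b
    return [dot(row, x) - bi for row, bi in zip(A, b)]

x = [0.0, 0.0]
g = grad(x)
d = [-gi for gi in g]           # first direction: steepest descent

for _ in range(50):
    # exact line search for a quadratic: alpha = -g^T d / (d^T A d)
    Ad = [dot(row, d) for row in A]
    alpha = -dot(g, d) / dot(d, Ad)
    x = axpy(alpha, d, x)
    g_new = grad(x)
    if dot(g_new, g_new) < 1e-20:   # gradient (nearly) zero: stop
        break
    y = [gn - gi for gn, gi in zip(g_new, g)]
    dy = dot(d, y)
    # Hager-Zhang beta: (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y)
    w = axpy(-2.0 * dot(y, y) / dy, d, y)
    beta = dot(w, g_new) / dy
    g = g_new
    d = axpy(beta, d, [-gi for gi in g])  # d = -g + beta * d

print(x)  # converges to the solution of A x = b, i.e. [1/11, 7/11]
```

For this two-dimensional quadratic with an exact line search the iteration converges in two steps; on general nonconvex functions one would replace the exact step with a Wolfe-type line search, which is where the sufficient descent property of the direction becomes important.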

Published

2013-10-10

How to Cite

Yuan, G., & Li, Y. (2013). Two Modified Hager and Zhang’s Conjugate Gradient Algorithms For Solving Large-Scale Optimization Problems. International Journal of Computers & Technology, 11(5), 2586–2600. https://doi.org/10.24297/ijct.v11i5.1145

Issue

Section

Research Articles