Two new conjugate gradient methods based on modified secant equations

Abstract

Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of the proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng and Chen, and by Zhang and Xu; the other is based on the modified BFGS update proposed by Yuan. An interesting feature of our methods is that they take into account both gradient and function values. Under proper conditions, we show that one of the proposed methods is globally convergent for general functions and that the other is globally convergent for uniformly convex functions. To enhance the performance of the line search procedure, we also propose a new approach for computing the initial steplength used to start the procedure. We compare implementations of our methods with the efficient conjugate gradient methods proposed by Dai and Liao, and by Hestenes and Stiefel. Numerical test results show the efficiency of our proposed methods.
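For context, the Dai–Liao framework the abstract builds on computes the search direction as d_{k+1} = -g_{k+1} + β_k d_k, with β_k chosen from a conjugacy condition involving both y_k = g_{k+1} - g_k and s_k = x_{k+1} - x_k. The sketch below shows a generic Dai–Liao-type iteration; the parameter `t`, the backtracking Armijo line search, and the test function are illustrative assumptions, not the paper's actual method or its modified secant equations.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Sketch of a Dai-Liao-type nonlinear CG method (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking Armijo line search (assumed; the paper uses a
        # line search with a specially computed initial steplength).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        s = alpha * d            # s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g            # y_k = g_{k+1} - g_k
        denom = d.dot(y)
        if abs(denom) > 1e-12:
            # Dai-Liao beta: (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k)
            beta = (g_new.dot(y) - t * g_new.dot(s)) / denom
        else:
            beta = 0.0           # fall back to steepest descent
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic with minimizer (-1, 0):
f = lambda x: 0.5 * x.dot(x) + x[0]
grad = lambda x: x + np.array([1.0, 0.0])
x_star = dai_liao_cg(f, grad, np.array([3.0, -2.0]))
```

The paper's methods replace y_k in this update with quantities from modified secant equations that also use function values, which is the feature the abstract highlights.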

Keywords: Unconstrained optimization, Modified secant equation, Conjugacy condition, Conjugate gradient method, Global convergence

Article history: Received 23 January 2008; Revised 30 January 2010; Available online 11 February 2010.

DOI: https://doi.org/10.1016/j.cam.2010.01.052