Some modified conjugate gradient methods for unconstrained optimization


Abstract

Conjugate gradient methods are highly useful for solving large-scale optimization problems because they do not require the storage of any matrices. Motivated by the construction of the conjugate gradient parameter in some existing conjugate gradient methods, we propose four modified conjugate gradient methods, named NVLS, NVPRP*, NVHS* and NVLS*, respectively. We prove that these methods possess the sufficient descent property under the strong Wolfe line search and are globally convergent when the parameter in the line search conditions is restricted to a suitable interval. Preliminary numerical results show that the NVPRP*, NVHS* and NVLS* methods are more efficient than many existing conjugate gradient methods on a large number of test problems from the CUTEr collection.
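The abstract refers to the generic nonlinear conjugate gradient framework: iterate x_{k+1} = x_k + alpha_k d_k, with d_k = -g_k + beta_k d_{k-1}, where beta_k is the conjugate gradient parameter and alpha_k comes from a line search. The sketch below is not the paper's NVLS/NVPRP*/NVHS*/NVLS* methods; it illustrates the common structure using the classical PRP parameter with the PRP+ safeguard, applied to a quadratic where the exact line search step has a closed form (standing in for the strong Wolfe search used in practice). All function and variable names are illustrative.

```python
import numpy as np

def cg_prp(A, b, x0, tol=1e-10, max_iter=100):
    """Nonlinear CG with the Polak-Ribiere-Polyak (PRP) parameter,
    applied to the quadratic f(x) = 0.5 x^T A x - b^T x (A symmetric
    positive definite).  On a quadratic the exact minimizer along a
    search direction has a closed form, so no Wolfe search is needed."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b            # gradient of the quadratic at x
    d = -g                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)      # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        # PRP parameter; the max(., 0) truncation is the usual PRP+
        # safeguard that helps guarantee global convergence
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d            # new search direction
        g = g_new
    return x

# Usage: minimize the quadratic, i.e. solve the SPD system A x = b
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = cg_prp(A, b, np.zeros(2))
```

With an exact line search on a quadratic, the PRP parameter coincides with the Fletcher-Reeves one and the iteration terminates in at most n steps; the modifications studied in the paper are aimed at retaining descent and convergence under the inexact strong Wolfe search.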

Keywords: 90C30, 90C25, Unconstrained optimization, Conjugate gradient method, Line search, Sufficient descent property, Global convergence

Article history: Received 8 August 2015, Revised 26 March 2016, Available online 11 April 2016, Version of Record 26 April 2016.

Article link: https://doi.org/10.1016/j.cam.2016.04.004