On three-term conjugate gradient algorithms for unconstrained optimization
Abstract
This paper develops a class of three-term conjugate gradient algorithms. The search direction of the algorithms in this class has three terms and is computed as a modification of the classical conjugate gradient direction so as to satisfy both the descent and the conjugacy conditions. An example of a three-term conjugate gradient algorithm from this class, obtained as a modification of the classical and well-known Hestenes-Stiefel algorithm or of the CG_DESCENT algorithm of Hager and Zhang, and satisfying both the descent and the conjugacy conditions, is presented. These properties hold independently of the line search. The algorithm can also be viewed as a modification of the memoryless BFGS quasi-Newton method. The new approximation of the minimum is obtained by a general Wolfe line search together with a by-now standard acceleration technique developed by Andrei. The proposed three-term conjugate gradient algorithm substantially outperforms the well-known Hestenes-Stiefel conjugate gradient algorithm, as well as the more elaborate CG_DESCENT algorithm. Five applications from the MINPACK-2 test problem collection, each with 10^6 variables, show that the suggested three-term conjugate gradient algorithm outperforms CG_DESCENT.
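The abstract does not state the exact update formula. As an illustration only, the sketch below implements a representative three-term Hestenes-Stiefel-type direction from this literature, d_{k+1} = -g_{k+1} + beta_k*s_k - theta_k*y_k with beta_k = g_{k+1}^T y_k / (y_k^T s_k) and theta_k = g_{k+1}^T s_k / (y_k^T s_k); this choice (an assumption, not necessarily the paper's formula) yields g_{k+1}^T d_{k+1} = -||g_{k+1}||^2, i.e. the descent property holds independently of the line search. A simple Armijo backtracking search stands in for the general Wolfe line search used in the paper.

```python
# Illustrative sketch of a three-term HS-type conjugate gradient method.
# The coefficient formulas are a common choice from the literature, not
# necessarily the exact formulas of this paper.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ttcg_direction(g_new, s, y):
    """Three-term direction d = -g + beta*s - theta*y.

    Falls back to steepest descent when y^T s is near zero.
    By construction g_new^T d = -||g_new||^2 (descent for any line search).
    """
    ys = dot(y, s)
    if abs(ys) < 1e-12:
        return [-gi for gi in g_new]
    beta = dot(g_new, y) / ys
    theta = dot(g_new, s) / ys
    return [-g + beta * si - theta * yi for g, si, yi in zip(g_new, s, y)]

# Toy test problem: f(x) = x0^2 + 10*x1^2 (stand-in for the large-scale tests).
def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 20.0 * x[1]]

x = [3.0, -2.0]
g = grad(x)
d = [-gi for gi in g]  # first iteration: steepest descent
for k in range(500):
    # Armijo backtracking (a simplification of the Wolfe search in the paper)
    alpha, fx, gTd = 1.0, f(x), dot(g, d)
    while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * gTd:
        alpha *= 0.5
    x_new = [xi + alpha * di for xi, di in zip(x, d)]
    g_new = grad(x_new)
    if dot(g_new, g_new) ** 0.5 < 1e-8:  # gradient-norm stopping test
        x, g = x_new, g_new
        break
    s = [a - b for a, b in zip(x_new, x)]  # step s_k = x_{k+1} - x_k
    y = [a - b for a, b in zip(g_new, g)]  # gradient change y_k
    d = ttcg_direction(g_new, s, y)
    x, g = x_new, g_new
```

Because the descent property is built into the direction itself, the backtracking loop always terminates; the paper's acceleration technique (a step-length scaling due to Andrei) is omitted here for brevity.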
Keywords: Unconstrained optimization, Three-term conjugate gradient, Descent condition, Conjugacy condition, Numerical comparisons
Article history: Available online 23 January 2013.
DOI: https://doi.org/10.1016/j.amc.2012.11.097