Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization

Abstract:

In this paper we introduce a new concept of approximate optimal stepsize for gradient methods, use it to explain the strong numerical performance of the Barzilai–Borwein (BB) method, and present several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. By revising some modified BFGS update formulae, we construct new quadratic approximation models from which several approximate optimal stepsizes are derived. Remarkably, these approximate optimal stepsizes lie in intervals that contain the two well-known BB stepsizes. We therefore truncate these approximate optimal stepsizes by the two BB stepsizes and take the resulting stepsizes as the new stepsizes for gradient methods. Moreover, for the nonconvex case, we design a new approximation model to generate an approximate optimal stepsize for gradient methods. We establish the convergence of the proposed methods under weak conditions. Numerical results show that the proposed methods are very promising.
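To make the role of the two BB stepsizes concrete, here is a minimal sketch of a BB-type gradient method in Python. The function names and the simple safeguarding are illustrative assumptions, not the paper's algorithm: the paper derives its stepsizes from quadratic approximation models and truncates them by the two BB stepsizes, whereas this sketch simply clips a candidate stepsize into the interval spanned by BB1 and BB2.

```python
import numpy as np

def bb_gradient_method(grad, x0, alpha0=1.0, tol=1e-8, max_iter=1000):
    """Gradient method with a BB-type stepsize (illustrative sketch).

    grad : callable returning the gradient of the objective at x.
    The candidate stepsize is clipped into [min(BB1, BB2), max(BB1, BB2)],
    loosely mimicking the truncation idea described in the abstract.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:  # curvature condition holds; update the stepsize
            bb1 = (s @ s) / sy          # "long" BB stepsize
            bb2 = sy / (y @ y)          # "short" BB stepsize
            # candidate stepsize (here simply BB1), truncated into the
            # interval bounded by the two BB stepsizes
            alpha = np.clip(bb1, min(bb1, bb2), max(bb1, bb2))
        x, g = x_new, g_new
    return x

# Usage: minimize f(x) = 0.5 x^T A x - b^T x on a small convex quadratic,
# so the exact minimizer solves A x = b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
sol = bb_gradient_method(lambda x: A @ x - b, np.zeros(3))
```

For a strictly convex quadratic the BB method is known to converge despite its nonmonotone behavior, which is one reason its "nice numerical effect" invited the interpretation the paper develops.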

Keywords: Approximate optimal stepsize, Barzilai–Borwein (BB) method, Quadratic model, Gradient method, BFGS update formula

Article history: Received 28 December 2016, Revised 22 July 2017, Accepted 31 July 2017, Available online 10 August 2017, Version of Record 31 August 2017.

Article URL: https://doi.org/10.1016/j.cam.2017.07.035