Shifted limited-memory variable metric methods for large-scale unconstrained optimization

Authors:

Highlights:

Abstract

A new family of numerically efficient full-memory variable metric (quasi-Newton) methods for unconstrained minimization is given, which offers a simple way to derive related limited-memory methods. Global convergence of the methods can be established for convex, sufficiently smooth functions. Numerical comparisons with standard methods are encouraging.
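For orientation, the sketch below shows how a generic limited-memory quasi-Newton step is computed via the classical L-BFGS two-loop recursion, which stores only a few correction pairs instead of a full Hessian approximation. This is a standard illustration of the limited-memory idea, not the shifted variable metric updates proposed in the paper; the function name and the choice of initial scaling are assumptions for the example.

```python
import numpy as np

def two_loop_direction(grad, s_list, y_list, gamma=1.0):
    """Generic L-BFGS two-loop recursion (illustrative only, not the
    paper's shifted update). Returns the search direction -H*grad built
    implicitly from stored correction pairs s_i = x_{i+1} - x_i and
    y_i = g_{i+1} - g_i; gamma scales the initial matrix H0 = gamma*I
    (a common choice is gamma = s^T y / y^T y for the newest pair)."""
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest correction pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Apply the initial Hessian approximation H0 = gamma * I
    r = gamma * q
    # Second loop: oldest correction pair to newest
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r
```

In such schemes only the most recent few pairs (s_i, y_i) are kept, so memory and per-iteration cost grow linearly with the problem dimension, which is what makes limited-memory variants attractive for large-scale problems.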

Keywords: Unconstrained minimization, Variable metric methods, Limited-memory methods, Global convergence, Numerical results

Article history: Received 15 June 2004; Revised 26 January 2005; Available online 7 April 2005.

DOI: https://doi.org/10.1016/j.cam.2005.02.010