A modified Polak–Ribière–Polyak conjugate gradient algorithm for nonsmooth convex programs
Abstract
The conjugate gradient (CG) method is one of the most popular methods for solving smooth unconstrained optimization problems because of its simplicity and low memory requirement. So far, however, the use of CG methods has been largely restricted to smooth problems. The purpose of this paper is to present efficient conjugate gradient-type methods for nonsmooth optimization. Using the Moreau–Yosida regularization (smoothing) approach and a nonmonotone line search technique, we propose a modified Polak–Ribière–Polyak (PRP) CG algorithm for nonsmooth unconstrained convex minimization. The algorithm possesses three desirable properties: (i) the search direction satisfies the sufficient descent property and automatically belongs to a trust region; (ii) the search direction makes use of not only gradient information but also function value information; and (iii) the algorithm inherits an important property of the well-known PRP method, namely the tendency to turn towards the steepest descent direction when a small step is generated away from the solution, which prevents a sequence of tiny steps. Under standard conditions, we show that the algorithm converges globally to an optimal solution. Numerical experiments show that the algorithm is effective and suitable for solving large-scale nonsmooth unconstrained convex optimization problems.
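For background, the abstract refers to two standard constructions; the paper's modified direction differs from the classical formulas, so the following is only the standard Moreau–Yosida regularization of a convex function f (with parameter λ > 0) and the classical PRP parameter, not the authors' modification:

$$F_{\lambda}(x) \;=\; \min_{y \in \mathbb{R}^n} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\}, \qquad \beta_k^{\mathrm{PRP}} \;=\; \frac{g_k^{\top}\,(g_k - g_{k-1})}{\|g_{k-1}\|^2},$$

where $F_{\lambda}$ is a smooth convex function whose minimizers coincide with those of $f$, and $g_k$ denotes the gradient of the (regularized) objective at the k-th iterate.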
Keywords: 90C52, 65K05, 90C30, Nonsmooth convex optimization, Conjugate gradient method, Nonmonotone line search, Global convergence
Article history: Received 1 August 2011, Revised 3 April 2013, Available online 26 April 2013.
DOI: https://doi.org/10.1016/j.cam.2013.04.032