Precompact convergence of the nonconvex Primal–Dual Hybrid Gradient algorithm

Abstract

The Primal–Dual Hybrid Gradient (PDHG) algorithm is a powerful method that has been used widely in recent years for solving saddle-point optimization problems. The classical setting considers convex functions and is well studied in the literature. In this paper, we analyze the convergence of an alternative formulation of the PDHG algorithm in the nonconvex case under a precompactness assumption. The proofs are based on Kurdyka–Łojasiewicz functions, which cover a wide range of problems. A simple numerical experiment illustrates the convergence properties.
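To make the method concrete, the sketch below runs the standard (convex-case) PDHG iteration on a hypothetical toy problem; the problem, all variable names, and the parameter choices are illustrative assumptions, not taken from the paper. LASSO is written as the saddle-point problem min_x max_y ⟨Ax, y⟩ − g*(y) + λ‖x‖₁, where g(z) = ½‖z − b‖² and hence g*(y) = ½‖y‖² + ⟨y, b⟩, and the iteration alternates a proximal dual ascent step, a proximal primal descent step, and an extrapolation step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance (not from the paper): recover a sparse
# vector from noiseless linear measurements via LASSO.
A = rng.standard_normal((30, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true
lam = 0.01                       # l1 regularization weight

L = np.linalg.norm(A, 2)         # operator norm ||A||
tau = sigma = 0.9 / L            # step sizes with sigma * tau * L**2 < 1
theta = 1.0                      # extrapolation parameter

x = np.zeros(20)
x_bar = x.copy()
y = np.zeros(30)
for _ in range(3000):
    # dual ascent: y <- prox_{sigma g*}(y + sigma * A @ x_bar),
    # with prox_{sigma g*}(v) = (v - sigma * b) / (1 + sigma)
    y = (y + sigma * (A @ x_bar) - sigma * b) / (1.0 + sigma)
    # primal descent: x <- prox_{tau * lam * ||.||_1}(x - tau * A.T @ y),
    # i.e. soft-thresholding with threshold tau * lam
    v = x - tau * (A.T @ y)
    x_new = np.sign(v) * np.maximum(np.abs(v) - tau * lam, 0.0)
    # extrapolation on the primal variable
    x_bar = x_new + theta * (x_new - x)
    x = x_new

print(np.linalg.norm(x - x_true))
```

With a small λ and noiseless data, the iterates approach the sparse ground truth; the step-size condition στ‖A‖² < 1 is the standard sufficient condition for convergence in the convex setting.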

Keywords: Primal–Dual Hybrid Gradient algorithms, Kurdyka–Łojasiewicz functions, Nonconvex optimization, Convergence analysis

Article history: Received 14 February 2017, Accepted 31 July 2017, Available online 18 August 2017, Version of Record 6 September 2017.

DOI: https://doi.org/10.1016/j.cam.2017.07.037