An algorithm twisted from generalized ADMM for multi-block separable convex minimization models

Abstract

The alternating direction method of multipliers (ADMM) has been one of the most powerful and successful methods for solving a two-block linearly constrained convex minimization model whose objective function is the sum of two functions without coupled variables. When ADMM is directly extended to a multi-block convex minimization model whose objective function is the sum of more than two functions, its numerical efficiency is inherited in a large number of applications, but its convergence is no longer guaranteed. This observation has motivated the development of efficient algorithms that not only preserve the numerical advantages of the direct extension of ADMM but also guarantee convergence. One approach is to slightly correct the output of the direct extension of ADMM via a simple correction step; another is to employ a simple proximal term to solve each subproblem of the direct extension inexactly. In this paper, in order to solve the multi-block separable convex minimization model efficiently, we present a method that combines these two approaches: we first solve each subproblem with a simple proximal term, and then correct the output via a simple correction step. Theoretically, we derive global convergence results for this method and establish a worst-case O(1/k) iteration complexity. Numerically, the efficiency of this method is demonstrated on the problem of recovering the low-rank and sparse components of matrices from incomplete and noisy observations.
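The "predict, then correct" scheme described above can be sketched on a toy problem. The snippet below is a minimal illustration, not the paper's exact algorithm: it applies a Gauss-Seidel sweep of proximally regularized ADMM subproblems to a three-block quadratic program with a single linear coupling constraint (chosen so every subproblem has a closed form), then takes a damped correction step from the old iterate toward the predictor. The parameter names (`beta` for the penalty, `tau` for the proximal weight, `alpha` for the correction step length) are illustrative assumptions.

```python
import numpy as np

def prox_admm_with_correction(c, b, beta=1.0, tau=1.0, alpha=0.9, iters=500):
    """Sketch of a 3-block proximal ADMM with a simple correction step.

    Toy model:  min  0.5 * sum_i ||x_i - c_i||^2
                s.t. x_1 + x_2 + x_3 = b,
    where each proximal subproblem is quadratic and solvable in closed form.
    """
    n = b.size
    x = [np.zeros(n) for _ in range(3)]
    lam = np.zeros(n)  # multiplier for the coupling constraint
    for _ in range(iters):
        x_old = [xi.copy() for xi in x]
        x_new = [xi.copy() for xi in x]
        # Prediction: Gauss-Seidel sweep over the blocks, each subproblem
        # augmented with a proximal term (tau/2)*||x_i - x_i_old||^2.
        for i in range(3):
            others = sum(x_new[j] for j in range(3) if j != i)
            # argmin 0.5||xi - ci||^2 - lam.xi
            #        + (beta/2)||xi + others - b||^2
            #        + (tau/2)||xi - xi_old||^2   (closed form)
            x_new[i] = (c[i] + lam + beta * (b - others)
                        + tau * x_old[i]) / (1.0 + beta + tau)
        lam_new = lam - beta * (sum(x_new) - b)
        # Correction: damped step from the old point toward the predictor.
        x = [xo + alpha * (xn - xo) for xo, xn in zip(x_old, x_new)]
        lam = lam + alpha * (lam_new - lam)
    return x, lam
```

For example, with `c = [1, 2, 3]` and `b = 3` the KKT conditions give the solution `x = (0, 1, 2)` with multiplier `-1`, which the iteration approaches while driving the constraint residual `||x_1 + x_2 + x_3 - b||` to zero.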

Keywords: Separable convex optimization, ADMM, Proximal ADMM, Augmented Lagrangian method

Article history: Received 28 October 2015, Revised 22 January 2016, Available online 2 March 2016, Version of Record 29 August 2016.

DOI: https://doi.org/10.1016/j.cam.2016.02.001