Optimization Online


On the Direct Extension of ADMM for Multi-block Separable Convex Programming and Beyond: From Variational Inequality Perspective

Bingsheng He (hebma***at***nju.edu.cn)
Xiaoming Yuan (xmyuan***at***hkbu.edu.hk)

Abstract: When the alternating direction method of multipliers (ADMM) is extended directly to a multi-block separable convex minimization model whose objective is the sum of more than two functions without coupled variables, it was recently shown that convergence is not guaranteed. This fact motivates the development of efficient algorithms that preserve the numerical advantages of the direct extension of ADMM while guaranteeing convergence. One way to do so is to correct the output of the direct extension of ADMM slightly via a simple correction step, one that is completely free of step-size computation and whose step size is bounded away from zero. In this paper, we propose a prototype algorithm in this framework. This prototype algorithm includes the augmented Lagrangian method (ALM), the original ADMM and the direct extension of ADMM as special cases. We give a unified and easily checkable condition that ensures the convergence of this prototype algorithm. The convergence failure of the direct extension of ADMM can then be explained simply by its failure to satisfy this condition, whereas both ALM and ADMM do satisfy it. The analysis is conducted in the variational inequality context. Based on this prototype algorithm, we propose a class of specific ADMM-based algorithms that are easy to implement. Theoretically, we show the contraction property, prove global convergence and establish a worst-case convergence rate measured by iteration complexity for the prototype algorithm. Numerically, we demonstrate the efficiency of this class of algorithms on an image decomposition problem. In particular, an important feature of this new class of algorithms is that they can easily outperform the direct extension of ADMM even in cases where the latter is convergent, which is rare among existing methods of the same kind.
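To make the scheme under discussion concrete, the following is a minimal sketch (not the paper's corrected prototype algorithm) of the *direct extension* of ADMM to a three-block problem, min f1(x1)+f2(x2)+f3(x3) s.t. A1 x1 + A2 x2 + A3 x3 = b, with the blocks updated in Gauss-Seidel order followed by a multiplier update. All data are toy assumptions, with fi(xi) = 0.5||xi - ci||^2 so that each subproblem has a closed-form solution; as the abstract notes, this direct extension is not guaranteed to converge in general.

```python
import numpy as np

# Toy three-block instance (all data are illustrative assumptions):
#   min 0.5||x1-c1||^2 + 0.5||x2-c2||^2 + 0.5||x3-c3||^2
#   s.t. A1 x1 + A2 x2 + A3 x3 = b
rng = np.random.default_rng(0)
n, m = 4, 4
A = [rng.standard_normal((m, n)) for _ in range(3)]
c = [rng.standard_normal(n) for _ in range(3)]
b = rng.standard_normal(m)
beta = 1.0                      # penalty parameter of the augmented Lagrangian

x = [np.zeros(n) for _ in range(3)]
lam = np.zeros(m)               # Lagrange multiplier

def solve_block(i, others):
    # argmin_x 0.5||x - c_i||^2 + (beta/2)||A_i x + others - b - lam/beta||^2,
    # which reduces to the linear system (I + beta A_i^T A_i) x = rhs.
    Ai = A[i]
    H = np.eye(n) + beta * Ai.T @ Ai
    rhs = c[i] - beta * Ai.T @ (others - b - lam / beta)
    return np.linalg.solve(H, rhs)

for k in range(500):
    # Gauss-Seidel sweep over the three blocks (the "direct extension")
    x[0] = solve_block(0, A[1] @ x[1] + A[2] @ x[2])
    x[1] = solve_block(1, A[0] @ x[0] + A[2] @ x[2])
    x[2] = solve_block(2, A[0] @ x[0] + A[1] @ x[1])
    resid = sum(A[i] @ x[i] for i in range(3)) - b
    lam = lam - beta * resid    # multiplier update

print(np.linalg.norm(resid))   # constraint violation after 500 sweeps
```

The paper's prototype algorithm would append a simple correction step to each such sweep; that step is what restores guaranteed convergence, and is omitted here.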

Keywords: Convex optimization, Alternating direction method of multipliers, Variational inequalities, Splitting methods, Contraction, Convergence rate

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )


Download: [PDF]

Entry Submitted: 03/28/2014
Entry Accepted: 03/28/2014
Entry Last Modified: 04/05/2014

