Optimization Online


Faster Alternating Direction Method of Multipliers with a Worst-case O(1/n^2) Convergence Rate

Wenyi Tian(twymath***at***gmail.com)
Xiaoming Yuan(xmyuan***at***hkbu.edu.hk)

Abstract: The alternating direction method of multipliers (ADMM) is widely used for various convex programming models with separable structures arising in many scientific computing areas. The ADMM's worst-case O(1/n) convergence rate, measured by iteration complexity, has been established in the literature when its penalty parameter is a constant, where n is the iteration counter. Research on the ADMM's worst-case O(1/n^2) convergence rate, however, is still in its infancy. In this paper, we apply a rule recently proposed by Chambolle and Pock to iteratively update the penalty parameter and show that the ADMM with this adaptive penalty parameter has a worst-case O(1/n^2) convergence rate. Our assumptions on the model are mild: they do not require strong convexity of the objective function and are satisfied by some representative applications. We test the LASSO model and numerically verify the significant acceleration achieved by the faster ADMM with a worst-case O(1/n^2) convergence rate. Moreover, the faster ADMM is more user-friendly than the ADMM with a constant penalty parameter, in that it can pursue solutions of very high accuracy and is not sensitive to the initial value of the penalty parameter.
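To make the idea concrete, the following is a minimal sketch of ADMM applied to the LASSO model min 0.5||Ax-b||^2 + lam||z||_1 subject to x = z, with the penalty parameter increased each iteration by a Chambolle-Pock-style factor. The specific update theta = 1/sqrt(1 + 2*gamma/rho) and the parameter gamma are illustrative assumptions here, not necessarily the exact rule analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, kappa):
    # Elementwise soft-thresholding: the proximal operator of kappa * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, gamma=0.1, n_iter=200, adaptive=True):
    """ADMM for min 0.5*||A x - b||^2 + lam*||z||_1  s.t.  x = z.

    When adaptive=True, the penalty rho is increased each iteration by
    1/theta with theta = 1/sqrt(1 + 2*gamma/rho). This schedule is an
    assumed stand-in for the adaptive rule discussed in the abstract.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)              # scaled dual variable (multiplier / rho)
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(n_iter):
        # x-update: minimize the augmented Lagrangian in x (a linear solve).
        x = np.linalg.solve(AtA + rho * np.eye(n), Atb + rho * (z - u))
        # z-update: proximal step on the l1 term.
        z = soft_threshold(x + u, lam / rho)
        # Dual ascent on the constraint x = z.
        u = u + x - z
        if adaptive:
            theta = 1.0 / np.sqrt(1.0 + 2.0 * gamma / rho)
            rho_new = rho / theta        # rho increases monotonically
            u = u * (rho / rho_new)      # rescale scaled dual for new rho
            rho = rho_new
    return z
```

In this sketch the only per-iteration change relative to constant-penalty ADMM is the three-line adaptive block, and the scaled dual variable must be rescaled whenever rho changes so that the underlying multiplier rho*u is preserved.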

Keywords: Convex programming, Alternating direction method of multipliers, Convergence rate, Acceleration, First order methods

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )


Download: [PDF]

Entry Submitted: 07/17/2016
Entry Accepted: 07/17/2016
Entry Last Modified: 07/17/2016
