Optimization Online

Convergence analysis of an accelerated stochastic ADMM with larger stepsizes

Jianchao Bai (bjc1987***at***163.com)

Abstract: In this paper, we develop an accelerated stochastic Alternating Direction Method of Multipliers (ADMM) for solving structured convex optimization problems whose objective is the sum of a smooth term and a nonsmooth term. The proposed algorithm combines the ideas of ADMM and accelerated stochastic gradient methods, and the dual variables are updated symmetrically with stepsizes taken from a region larger than the conventional $(0, \frac{\sqrt{5}+1}{2})$. To simplify the computation of the nonsmooth subproblem, we add a nonnegative proximal term that reduces the subproblem to a proximal mapping. By a variational analysis, we show that the proposed stochastic algorithm converges in expectation with a worst-case $\mathcal{O}(1/T)$ convergence rate, where $T$ denotes the number of outer iterations. Convergence analysis of a special stochastic augmented Lagrangian method and 3-block extensions of the proposed algorithm are also investigated. Numerical experiments on a big-data problem in machine learning verify that the proposed method is very promising.
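For context, a minimal sketch of the generic symmetric-dual-update template the abstract refers to, written under assumed notation (the linearized stochastic $x$-update, penalty $\beta$, stepsize parameters $(r,s)$, stepsize $\eta_k$, and proximal weight matrix $P$ are illustrative, not the paper's exact scheme), for $\min_{x,y}\{f(x)+g(y): Ax+By=b\}$ with $f$ smooth (an expectation or finite sum) and $g$ nonsmooth:

\begin{align*}
x^{k+1} &\approx \arg\min_x \Big\{ \langle \nabla_x F(x^k,\xi_k),\, x\rangle + \tfrac{\beta}{2}\big\|Ax+By^k-b-\lambda^k/\beta\big\|^2 + \tfrac{1}{2\eta_k}\|x-x^k\|^2 \Big\}, \\
\lambda^{k+\frac{1}{2}} &= \lambda^k - r\beta\,(Ax^{k+1}+By^k-b), \\
y^{k+1} &= \arg\min_y \Big\{ g(y) + \tfrac{\beta}{2}\big\|Ax^{k+1}+By-b-\lambda^{k+\frac{1}{2}}/\beta\big\|^2 + \tfrac{1}{2}\|y-y^k\|_P^2 \Big\}, \\
\lambda^{k+1} &= \lambda^{k+\frac{1}{2}} - s\beta\,(Ax^{k+1}+By^{k+1}-b).
\end{align*}

In such a template, the dual variable $\lambda$ is updated twice per iteration (symmetrically) with stepsizes $(r,s)$, and the nonnegative proximal term $\tfrac{1}{2}\|y-y^k\|_P^2$ with $P\succeq 0$ is what reduces the nonsmooth $y$-subproblem to a proximal mapping of $g$.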

Keywords: convex optimization, stochastic ADMM, symmetric update, larger stepsize, proximal term, complexity

Category 1: Convex and Nonsmooth Optimization

Category 2: Global Optimization (Stochastic Approaches)

Citation:

Download: [PDF]

Entry Submitted: 01/01/2020
Entry Accepted: 01/01/2020
Entry Last Modified: 02/12/2020
