Convergence analysis of an accelerated stochastic ADMM with larger stepsizes
Jianchao Bai (bjc1987@163.com)

Abstract: In this paper, we develop an accelerated stochastic Alternating Direction Method of Multipliers (ADMM) for solving structured convex optimization problems whose objective is the sum of a smooth term and a nonsmooth term. The proposed algorithm combines the ideas of ADMM and accelerated stochastic gradient methods, and the dual variables are updated symmetrically with a stepsize region larger than the so-called $(0, \frac{\sqrt{5}+1}{2})$. To simplify the computation of the nonsmooth subproblem, we add a nonnegative proximal term that transforms the subproblem into a proximal mapping. By a variational analysis, we show that the proposed stochastic algorithm converges in expectation with a worst-case $\mathcal{O}(1/T)$ convergence rate, where $T$ denotes the number of outer iterations. Convergence analyses of a special stochastic augmented Lagrangian method and of 3-block extensions of the proposed algorithm are also presented. Numerical experiments on a big-data problem in machine learning verify that the proposed method is very promising.

Keywords: convex optimization, stochastic ADMM, symmetric update, larger stepsize, proximal term, complexity

Category 1: Convex and Nonsmooth Optimization
Category 2: Global Optimization (Stochastic Approaches)

Download: [PDF]
Entry Submitted: 01/01/2020
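To illustrate the ingredients the abstract names (a stochastic gradient step for the smooth term, a proximal mapping for the nonsmooth term, and a symmetric two-stage dual update with stepsizes r and s), here is a minimal sketch on a toy 1-D lasso problem, min_x 0.5*sum_i (a_i*x - b_i)^2 + lam*|x|, split as f(x) + g(z) subject to x = z. The stepsizes, proximal parameter, and sampling scheme are illustrative assumptions, not the paper's exact algorithm or parameters.

```python
import random

def soft(v, t):
    """Proximal mapping of t*|.| (soft-thresholding)."""
    return max(abs(v) - t, 0.0) * (1.0 if v >= 0 else -1.0)

def stochastic_admm(data, lam=0.1, beta=1.0, r=0.9, s=0.9, eta=0.05, T=2000, seed=0):
    """Sketch of a stochastic ADMM with symmetric dual updates (scaled form).

    Hypothetical parameters: r, s are the two dual stepsizes, beta the
    penalty, eta the proximal stepsize of the linearized x-update.
    """
    rng = random.Random(seed)
    n = len(data)
    x = z = u = 0.0  # u is the scaled dual variable
    for _ in range(T):
        a, b = data[rng.randrange(n)]       # sample one component of f
        g = n * a * (a * x - b)             # unbiased stochastic gradient of f
        # linearized x-update with nonnegative proximal term (1/(2*eta))*(x - x_k)^2
        x = (x / eta + beta * (z - u) - g) / (1.0 / eta + beta)
        u = u + r * (x - z)                 # first (intermediate) dual update
        z = soft(x + u, lam / beta)         # z-subproblem reduces to a proximal mapping
        u = u + s * (x - z)                 # second dual update (symmetric)
    return x, z

data = [(1.0, 2.0), (2.0, 3.9), (0.5, 1.1)]
x, z = stochastic_admm(data)
```

For this data the (deterministic) lasso minimizer is roughly 1.95, and the iterates settle near it with a small constraint residual |x - z|; in practice a decaying eta would be used to control the stochastic-gradient noise.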