  


A Flexible ADMM Algorithm for Big Data Applications
Daniel Robinson (daniel.p.robinson@jhu.edu)

Abstract: We present a flexible Alternating Direction Method of Multipliers (FADMM) algorithm for solving optimization problems whose strongly convex objective function is separable into $n \geq 2$ blocks, subject to (nonseparable) linear equality constraints. The FADMM algorithm uses a \emph{Gauss-Seidel} scheme to update the blocks of variables, and a regularization term is added to each of the subproblems arising within FADMM. We prove, under common assumptions, that FADMM is globally convergent. We also present a special case of FADMM that is \emph{partially parallelizable}, which makes it attractive in a big data setting. In particular, we partition the data into groups, so that each group consists of multiple blocks of variables. By applying FADMM to this partitioning of the data, and using a specific regularization matrix, we obtain a hybrid ADMM (HADMM) algorithm: the grouped data is updated in a Gauss-Seidel fashion, while the blocks within each group are updated in a Jacobi manner. Convergence of HADMM follows directly from the convergence properties of FADMM. Moreover, a special case of HADMM can be applied to objective functions that are merely convex, rather than strongly convex. We present numerical experiments to demonstrate the practical advantages of this algorithm.

Keywords: Alternating Direction Method of Multipliers; convex optimization; Gauss-Seidel; Jacobi; regularization; separable function

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Entry Submitted: 03/23/2015
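To make the scheme described above concrete, the following is a minimal sketch of a Gauss-Seidel ADMM with regularized subproblems for a strongly convex block-separable quadratic. All problem data ($A_i$, $a_i$, $b$), the parameter names (`rho`, `tau`), and the quadratic choice $f_i(x_i) = \tfrac{1}{2}\|x_i - a_i\|^2$ are illustrative assumptions for this sketch, not the paper's formulation or its specific regularization matrix.

```python
import numpy as np

def fadmm_sketch(A, a, b, rho=1.0, tau=0.5, iters=500):
    """Gauss-Seidel ADMM sketch for
        minimize  sum_i (1/2)||x_i - a_i||^2
        subject to sum_i A_i x_i = b,
    with a proximal regularization term (tau/2)||x_i - x_i^k||^2
    added to each block subproblem."""
    n = len(A)
    x = [np.zeros_like(ai) for ai in a]
    y = np.zeros_like(b)  # dual multiplier for the coupling constraint
    for _ in range(iters):
        for i in range(n):
            # Gauss-Seidel: blocks j < i already hold their new values.
            r = sum(A[j] @ x[j] for j in range(n) if j != i)
            # The regularized block subproblem is an unconstrained
            # strongly convex quadratic; its optimality condition is
            #   (I + tau I + rho A_i^T A_i) x_i
            #       = a_i + tau x_i^k + rho A_i^T (b - r - y/rho).
            m = A[i].shape[1]
            H = (1.0 + tau) * np.eye(m) + rho * A[i].T @ A[i]
            g = a[i] + tau * x[i] + rho * A[i].T @ (b - r - y / rho)
            x[i] = np.linalg.solve(H, g)
        # Dual ascent step on the coupling constraint residual.
        y = y + rho * (sum(A[i] @ x[i] for i in range(n)) - b)
    return x, y
```

A Jacobi variant (as in the HADMM grouping) would compute every block's residual `r` from the previous iterate before any block is updated, which is what makes the within-group updates parallelizable.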