  


Blockwise Alternating Direction Method of Multipliers with Gaussian Back Substitution for Multiple-Block Convex Programming
Xiaoling Fu (fuxlnju@hotmail.com)

Abstract: We consider the linearly constrained convex minimization model with a separable objective function that is the sum of m functions without coupled variables, and discuss how to design an efficient algorithm based on the fundamental technique of splitting the augmented Lagrangian method (ALM). Our focus is the specific big-data scenario where m is huge. As a pretreatment of the original data, the m functions in the objective and the corresponding m variables are regrouped into t subgroups, where t is a manageable number (usually t >= 3 but much smaller than m). Some existing splitting methods in the literature are applicable to the regrouped model with t blocks of functions and variables. We concentrate on the alternating direction method of multipliers with Gaussian back substitution (ADMM-GBS), whose efficiency and scalability have been well verified in the literature. Applying the ADMM-GBS to the t-block regrouped model yields what we call the blockwise ADMM-GBS. To alleviate the difficulty of the resulting ADMM-GBS subproblems, each of which may still require minimizing more than one function with coupled variables, we suggest decomposing these subproblems further, while proximally regularizing the decomposed subproblems to ensure convergence. With this further decomposition, each resulting subproblem requires handling only one function in the original objective plus a simple quadratic term; it thus may be very easy to solve in many concrete applications where the functions in the objective have specific properties. Moreover, these decomposed subproblems can be solved in parallel, making it possible to handle big data with highly capable computing infrastructures. Consequently, a splitting version of the blockwise ADMM-GBS is proposed for this particular big-data scenario.
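To convey the flavor of the proximally regularized, parallelizable subproblem updates described above, the following is a minimal numerical sketch on a toy separable model. The quadratic choice of the functions f_i, the penalty and proximal parameters, and the simple Jacobian-type splitting used here are illustrative assumptions, not the paper's exact blockwise ADMM-GBS scheme.

```python
import numpy as np

# Toy separable model: minimize sum_i 1/2 ||x_i - c_i||^2
# subject to sum_i x_i = b.
# Each subproblem below handles one f_i plus a simple quadratic
# (proximal) term, uses only the previous iterate, and hence all m
# subproblems could be solved in parallel.
rng = np.random.default_rng(0)
m, n = 4, 3                      # number of functions/variables, dimension
c = rng.standard_normal((m, n))  # data defining f_i(x_i) = 1/2||x_i - c_i||^2
b = rng.standard_normal(n)       # right-hand side of the coupling constraint

beta = 1.0                       # augmented Lagrangian penalty (assumed value)
tau = m * beta                   # proximal parameter, chosen conservatively
x = np.zeros((m, n))             # primal blocks
lam = np.zeros(n)                # Lagrange multiplier

for _ in range(300):
    s = x.sum(axis=0)
    # r_i = sum_{j != i} x_j - b + lam/beta, formed for all i at once.
    r = s[None, :] - x - b[None, :] + lam[None, :] / beta
    # Closed-form minimizer of
    #   f_i(x_i) + (beta/2)||x_i + r_i||^2 + (tau/2)||x_i - x_i^k||^2.
    x = (c - beta * r + tau * x) / (1.0 + beta + tau)
    lam = lam + beta * (x.sum(axis=0) - b)

# Closed-form solution for comparison: x_i* = c_i - (sum_j c_j - b)/m.
x_star = c - (c.sum(axis=0) - b)[None, :] / m
print(np.abs(x - x_star).max() < 1e-6)  # prints True
```

Because every block update depends only on the previous iterate, the inner loop body maps directly onto the centralized-distributed setting: the m proximal subproblems go to worker nodes, and a central station aggregates the sum and updates the multiplier.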
The implementation of this new algorithm is suitable for a centralized-distributed computing system, in which the decomposed subproblems of each block are computed in parallel by a distributed-computing infrastructure while the blocks are updated by a centralized-computing station. For the new algorithm, we prove its convergence and establish a worst-case convergence rate measured by the iteration complexity. Two refined versions of this new algorithm, with iteratively calculated step sizes and with linearized subproblems, are also proposed.

Keywords: Convex programming, Alternating direction method of multipliers, Big data, Distributed computing, Centralized computing, Splitting methods, Convergence rate

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Entry Submitted: 09/16/2014