  


D-ADMM: A Communication-Efficient Distributed Algorithm for Separable Optimization
João Mota (joaomota@cmu.edu)

Abstract: We propose a distributed algorithm, named D-ADMM, for solving separable optimization problems in networks of interconnected nodes or agents. In a separable optimization problem, the cost function is the sum of all the agents' private cost functions, and the constraint set is the intersection of all the agents' private constraint sets. The private cost function and constraint set of each node are known only to that node, both before and during the execution of the algorithm. The application of our algorithm is illustrated with problems from signal processing and control, namely average consensus, compressed sensing, and support vector machines. Communication is well known to be the most energy- and time-demanding operation in distributed environments. Algorithms that communicate less can therefore extend the lifetime of networks, e.g., sensor networks, or execute faster, e.g., on supercomputing platforms. Through simulations on several network types and problems, we show that our algorithm requires fewer communications than state-of-the-art algorithms.

Keywords: Distributed algorithms, alternating direction method of multipliers, consensus, compressed sensing, machine learning, sensor networks

Category 1: Network Optimization
Category 2: Convex and Nonsmooth Optimization (Convex Optimization)

Citation: Submitted to IEEE Trans. Sig. Proc., February 13, 2012

Entry Submitted: 02/13/2012
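To make the separable structure concrete, the sketch below solves the average-consensus example mentioned in the abstract with the standard global-consensus form of ADMM (this is a generic illustration, not the paper's D-ADMM, which is fully decentralized). Each agent i holds a private quadratic cost f_i(x) = ½(x − a_i)², and the consensus solution of the sum is the average of the a_i.

```python
# Global-consensus ADMM for average consensus (illustrative sketch only;
# the paper's D-ADMM is a decentralized variant and differs from this).
# Each agent i privately holds f_i(x) = 0.5 * (x - a_i)^2; minimizing the
# sum over a common x yields x* = mean(a).

def consensus_admm(a, rho=1.0, iters=100):
    n = len(a)
    x = [0.0] * n   # local copies of the decision variable
    y = [0.0] * n   # dual variables, one per agent
    z = 0.0         # global consensus variable
    for _ in range(iters):
        # x-update: argmin_x 0.5*(x - a_i)^2 + y_i*(x - z) + (rho/2)*(x - z)^2
        x = [(a[i] - y[i] + rho * z) / (1.0 + rho) for i in range(n)]
        # z-update: average of (x_i + y_i / rho)
        z = sum(x[i] + y[i] / rho for i in range(n)) / n
        # dual ascent step
        y = [y[i] + rho * (x[i] - z) for i in range(n)]
    return z

values = [1.0, 4.0, 7.0, 12.0]
print(consensus_admm(values))  # converges to the average of values, 6.0
```

For this quadratic cost the iteration contracts linearly, so a modest number of iterations suffices; the point of D-ADMM, as the abstract notes, is to reach such consensus with fewer inter-node communications than schemes like this one.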