Optimization Online
Fast Multiple Splitting Algorithms for Convex Optimization

Donald Goldfarb (goldfarb***at***columbia.edu)
Shiqian Ma (sm2756***at***columbia.edu)

Abstract: In this paper we present two classes of general $K$-splitting algorithms for solving finite-dimensional convex optimization problems. Under the assumption that the function being minimized has a Lipschitz continuous gradient, we prove that the number of iterations needed by the first class of algorithms to obtain an $\epsilon$-optimal solution is $O(1/\epsilon)$. The algorithms in the second class are accelerated versions of those in the first class: the complexity bound is improved to $O(1/\sqrt{\epsilon})$ while the computational effort required at each iteration remains almost unchanged. To the best of our knowledge, these are the first complexity results of this type for splitting and alternating direction type methods. Moreover, all the algorithms proposed in this paper are parallelizable, which makes them particularly attractive for solving certain large-scale problems.
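To illustrate the $O(1/\epsilon)$ versus $O(1/\sqrt{\epsilon})$ contrast mentioned in the abstract, the following is a minimal sketch of a standard two-block splitting scheme (proximal gradient) and its Nesterov-accelerated variant for minimizing $F(x) = f(x) + g(x)$, where $f$ is smooth with an $L$-Lipschitz gradient and $g$ has an inexpensive proximal map. This is a generic sketch of related well-known methods, not the paper's $K$-splitting or alternating linearization algorithms; all function and variable names are illustrative.

```python
import numpy as np

def prox_gradient(grad_f, prox_g, L, x0, iters=100):
    """Basic splitting step: linearize f, keep g exact; O(1/eps) iteration bound."""
    x = x0.copy()
    for _ in range(iters):
        x = prox_g(x - grad_f(x) / L, 1.0 / L)
    return x

def accel_prox_gradient(grad_f, prox_g, L, x0, iters=100):
    """Accelerated variant with Nesterov extrapolation; O(1/sqrt(eps)) iteration bound."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = prox_g(y - grad_f(y) / L, 1.0 / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # extrapolation step
        x, t = x_new, t_new
    return x

# Example problem (hypothetical data): LASSO, min 0.5*||Ax - b||^2 + lam*||x||_1
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
    L = np.linalg.norm(A, 2) ** 2                       # Lipschitz constant of grad f
    grad_f = lambda x: A.T @ (A @ x - b)
    prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)  # soft-threshold
    x_star = accel_prox_gradient(grad_f, prox_g, L, np.zeros(100), iters=300)
    print("objective:", 0.5 * np.linalg.norm(A @ x_star - b) ** 2 + lam * np.abs(x_star).sum())
```

Note that the accelerated variant costs essentially one extra vector extrapolation per iteration, mirroring the abstract's claim that the per-iteration effort is almost unchanged.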

Keywords: Convex Optimization, Variable Splitting, Alternating Direction Method, Alternating Linearization Method, Complexity Theory, Decomposition, Smoothing Techniques, Parallel Computing, Proximal Point Algorithm, Optimal Gradient Method

Category 1: Convex and Nonsmooth Optimization

Citation: Technical Report, Department of IEOR, Columbia University, 2009

Download: [PDF]

Entry Submitted: 12/22/2009
Entry Accepted: 12/22/2009
Entry Last Modified: 03/18/2011


