Optimization Online

Fast Alternating Linearization Methods for Minimizing the Sum of Two Convex Functions

Donald Goldfarb (goldfarb***at***columbia.edu)
Shiqian Ma (sm2756***at***columbia.edu)
Katya Scheinberg (katyas***at***lehigh.edu)

Abstract: In this paper we present first-order alternating linearization algorithms, based on an alternating direction augmented Lagrangian approach, for minimizing the sum of two convex functions. Our basic methods require at most $O(1/\epsilon)$ iterations to obtain an $\epsilon$-optimal solution, while our accelerated (i.e., fast) versions require at most $O(1/\sqrt{\epsilon})$ iterations, with little change in the computational effort per iteration. For both types of methods, we present one algorithm that requires both functions to be smooth with Lipschitz continuous gradients and one that requires only one of the functions to be so. The algorithms in this paper are Gauss-Seidel-type methods, in contrast to the Jacobi-type methods proposed by Goldfarb and Ma in [21]. Numerical results are reported that support our theoretical conclusions and demonstrate the practical potential of our algorithms.
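To illustrate the basic scheme described in the abstract, the following Python sketch shows one Gauss-Seidel-style alternating linearization loop for minimizing f(x) + g(x). It is a minimal, hypothetical illustration, not the authors' implementation: the function names, the shared step size 1/L, and the toy quadratic example are assumptions made here for concreteness, under the setting where both functions are smooth with Lipschitz continuous gradients and their proximal mappings are computable.

import numpy as np

def alternating_linearization(grad_f, prox_f, grad_g, prox_g, x0, L, n_iters=100):
    # Sketch of a Gauss-Seidel alternating linearization iteration for
    # min f(x) + g(x): each half-step linearizes one function at the
    # current point and solves a proximal subproblem in the other.
    # L is a common Lipschitz constant for both gradients; using 1/L as
    # the prox parameter is an illustrative choice, not the paper's rule.
    x = x0.copy()
    for _ in range(n_iters):
        # Linearize f at x, keep g exact: a prox-g step at x - grad_f(x)/L.
        y = prox_g(x - grad_f(x) / L, 1.0 / L)
        # Linearize g at the fresh iterate y (this immediate reuse is what
        # makes the scheme Gauss-Seidel rather than Jacobi), keep f exact.
        x = prox_f(y - grad_g(y) / L, 1.0 / L)
    return x

# Toy usage with f(x) = 0.5*||Ax - b||^2 and g(x) = 0.5*mu*||x||^2,
# both smooth with Lipschitz continuous gradients.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
mu = 0.1
L = np.linalg.norm(A, 2) ** 2 + mu  # crude common Lipschitz bound

grad_f = lambda x: A.T @ (A @ x - b)
grad_g = lambda x: mu * x
prox_g = lambda v, t: v / (1.0 + t * mu)
# For this quadratic f, prox_f solves (A^T A + I/t) x = A^T b + v/t exactly.
prox_f = lambda v, t: np.linalg.solve(A.T @ A + np.eye(10) / t, A.T @ b + v / t)

x_sol = alternating_linearization(grad_f, prox_f, grad_g, prox_g,
                                  np.zeros(10), L, n_iters=200)

The accelerated $O(1/\sqrt{\epsilon})$ variants in the paper add a Nesterov-type extrapolation step between iterations; that refinement is omitted from this sketch.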

Keywords: Convex Optimization, Variable Splitting, Alternating Linearization Method, Alternating Direction Method, Augmented Lagrangian Method, Proximal Point Algorithm, Optimal Gradient Method, Gauss-Seidel Method, Peaceman-Rachford Method, Robust Principal Component Analysis

Category 1: Convex and Nonsmooth Optimization

Citation: Technical Report, Department of IEOR, Columbia University, 2009

Download: [PDF]

Entry Submitted: 12/22/2009
Entry Accepted: 12/22/2009
Entry Last Modified: 06/14/2012
