Optimization Online

First-Order Algorithms for Convex Optimization with Nonseparable Objective and Coupled Constraints

Xiang Gao(gaoxx460***at***umn.edu)
Shuzhong Zhang(zhangs***at***umn.edu)

Abstract: In this paper we consider a block-structured convex optimization model in which the block variables are nonseparable in the objective and are further linearly coupled in the constraints. For the 2-block case, we propose a number of first-order algorithms to solve this model. First, the alternating direction method of multipliers (ADMM) is extended, assuming that it is easy to optimize the augmented Lagrangian function with respect to one block of variables at a time while fixing the other block. We prove that an O(1/t) iteration complexity bound holds under suitable conditions, where t is the number of iterations. If the subroutines of the ADMM cannot be implemented, then we propose new alternative algorithms, called the alternating proximal gradient method of multipliers (APGMM), the alternating gradient projection method of multipliers (AGPMM), and hybrids thereof. Under suitable conditions, the O(1/t) iteration complexity bound is shown to hold for all the newly proposed algorithms. Finally, we extend the analysis of the ADMM to the general multi-block case.
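
To make the 2-block scheme concrete, here is a minimal illustrative sketch (not the authors' implementation): it applies the alternating update pattern described in the abstract to a toy convex quadratic whose Hessian couples the two blocks, so the objective is nonseparable, under a linear coupling constraint A1 x1 + A2 x2 = b. The matrices Q, A1, A2, the vector b, the penalty rho, and the closed-form block solves are all assumptions made for this example only.

```python
# Sketch: 2-block ADMM-type iteration for
#   min  0.5 x^T Q x + c^T x   s.t.  A1 x1 + A2 x2 = b,
# where x = (x1, x2) and Q couples the blocks (nonseparable objective).
# Illustrative assumptions only; not the paper's code or exact scheme.
import numpy as np

rng = np.random.default_rng(0)
n1, n2, m = 5, 4, 3

# Random convex quadratic with coupling between the two blocks.
M = rng.standard_normal((n1 + n2, n1 + n2))
Q = M.T @ M + 0.1 * np.eye(n1 + n2)          # positive definite
c = rng.standard_normal(n1 + n2)
Q11, Q12 = Q[:n1, :n1], Q[:n1, n1:]
Q21, Q22 = Q[n1:, :n1], Q[n1:, n1:]
c1, c2 = c[:n1], c[n1:]

# Linear coupling constraint A1 x1 + A2 x2 = b.
A1 = rng.standard_normal((m, n1))
A2 = rng.standard_normal((m, n2))
b = rng.standard_normal(m)

rho = 1.0                                    # penalty parameter (assumed)
x1, x2 = np.zeros(n1), np.zeros(n2)
lam = np.zeros(m)                            # multiplier for the constraint

for t in range(200):
    # x1-update: minimize the augmented Lagrangian over x1 with x2 fixed
    # (closed-form linear solve for this quadratic example).
    rhs1 = -(Q12 @ x2 + c1 + A1.T @ lam + rho * A1.T @ (A2 @ x2 - b))
    x1 = np.linalg.solve(Q11 + rho * A1.T @ A1, rhs1)

    # x2-update: minimize the augmented Lagrangian over x2 with x1 fixed.
    rhs2 = -(Q21 @ x1 + c2 + A2.T @ lam + rho * A2.T @ (A1 @ x1 - b))
    x2 = np.linalg.solve(Q22 + rho * A2.T @ A2, rhs2)

    # Dual update along the residual of the coupling constraint.
    lam = lam + rho * (A1 @ x1 + A2 @ x2 - b)

print("constraint residual:", np.linalg.norm(A1 @ x1 + A2 @ x2 - b))
```

In the paper's proximal-gradient and gradient-projection variants, each block subproblem would be replaced by a single gradient or proximal step on the augmented Lagrangian rather than the exact minimization used in this sketch.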

Keywords: First-Order Algorithms, ADMM, Proximal Gradient Method, Convex Optimization, Iteration Complexity

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization

Citation: Technical Report, Department of Industrial and Systems Engineering, University of Minnesota, Minneapolis, MN 55455, May 2015


Entry Submitted: 05/09/2015
Entry Accepted: 05/10/2015
Entry Last Modified: 05/09/2015
