Optimization Online

An Accelerated Inexact Dampened Augmented Lagrangian Method for Linearly-Constrained Nonconvex Composite Optimization Problems

Weiwei Kong(wwkong92***at***gmail.com)
Renato D.C. Monteiro(monteiro***at***isye.gatech.edu)

Abstract: This paper proposes and analyzes an accelerated inexact dampened augmented Lagrangian (AIDAL) method for solving linearly-constrained nonconvex composite optimization problems. Each iteration of the AIDAL method consists of: (i) inexactly solving a dampened proximal augmented Lagrangian (AL) subproblem by calling an accelerated composite gradient (ACG) subroutine; (ii) applying a dampened and under-relaxed Lagrange multiplier update; and (iii) using a novel test to check whether the penalty parameter of the AL function should be increased. Under several mild assumptions involving the dampening factor and the under-relaxation constant, it is shown that the AIDAL method generates an approximate stationary point of the constrained problem in O(ρ^(-5/2) log ρ^(-1)) iterations of the ACG subroutine, for a given tolerance ρ > 0. Numerical experiments are also given to show the computational efficiency of the proposed method.
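Remark (illustrative sketch, not from the paper): the Python snippet below only mirrors the three-step outer iteration (i)-(iii) described in the abstract, for a problem of the form min f(x) + h(x) subject to Ax = b with f smooth (possibly nonconvex) and h convex and prox-friendly. The names grad_f and prox_h, the dampening factor theta, the under-relaxation constant chi, and the simple feasibility-based doubling rule for the penalty parameter are assumptions made for illustration; a plain proximal-gradient loop stands in for the ACG subroutine, and none of this should be read as the authors' actual method or constants.

```python
# Hypothetical sketch of an AIDAL-style outer loop for
#     min f(x) + h(x)  s.t.  Ax = b,
# following the three steps listed in the abstract.
import numpy as np

def aidal_sketch(grad_f, prox_h, A, b, x0, theta=0.1, chi=0.5,
                 c=1.0, step=1e-2, tol=1e-4, max_outer=100, max_inner=500):
    x, p = x0.copy(), np.zeros(A.shape[0])
    feas_prev = np.linalg.norm(A @ x - b)
    for _ in range(max_outer):
        # (i) approximately minimize the dampened AL subproblem
        #     f(x) + h(x) + (1 - theta) <p, Ax - b> + (c/2) ||Ax - b||^2
        #     (proximal gradient used here as a stand-in for the ACG subroutine)
        for _ in range(max_inner):
            r = A @ x - b
            g = grad_f(x) + A.T @ ((1 - theta) * p + c * r)
            x_new = prox_h(x - step * g, step)
            if np.linalg.norm(x_new - x) <= step * tol:
                x = x_new
                break
            x = x_new
        # (ii) dampened, under-relaxed Lagrange multiplier update
        p = (1 - theta) * p + chi * c * (A @ x - b)
        # (iii) increase the penalty parameter if feasibility did not improve
        #       enough (illustrative test, not the paper's criterion)
        feas = np.linalg.norm(A @ x - b)
        if feas <= tol:
            break
        if feas > 0.5 * feas_prev:
            c *= 2.0
        feas_prev = feas
    return x, p
```

For instance, when h is the indicator of the nonnegative orthant, prox_h reduces to the entrywise projection `lambda z, t: np.maximum(z, 0.0)`.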

Keywords: inexact proximal augmented Lagrangian method, linearly constrained smooth nonconvex composite programs, accelerated first-order methods, iteration-complexity

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Category 2: Nonlinear Optimization (Constrained Nonlinear Optimization)

Citation:

Download: [PDF]

Entry Submitted: 10/21/2021
Entry Accepted: 10/21/2021
Entry Last Modified: 10/21/2021
