Optimization Online


The proximal alternating direction method of multipliers in the nonconvex setting: convergence analysis and rates

Radu Ioan Bot (radu.bot***at***univie.ac.at)
Dang-Khoa Nguyen (dang-khoa.nguyen***at***univie.ac.at)

Abstract: We propose two numerical algorithms for minimizing the sum of a smooth function and the composition of a nonsmooth function with a linear operator in the fully nonconvex setting. The iterative schemes are formulated in the spirit of the proximal and the proximal linearized alternating direction method of multipliers, respectively. The proximal terms are introduced through variable metrics, which facilitates the derivation of proximal splitting algorithms for nonconvex complexly structured optimization problems as particular instances of the general schemes. Convergence of the iterates to a KKT point of the optimization problem is proved under mild conditions on the sequence of variable metrics and under the assumption that a regularization of the associated augmented Lagrangian has the Kurdyka-Lojasiewicz property. If the augmented Lagrangian has the Lojasiewicz property, then convergence rates for both the augmented Lagrangian values and the iterates are derived.
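To make the problem template concrete: the abstract concerns min_x f(x) + g(Ax) with f smooth, g nonsmooth, and A linear. The following is a minimal sketch of a proximal linearized ADMM iteration for this template, not the paper's algorithm (which uses variable-metric proximal terms). The concrete choices f(x) = 0.5*||Mx - b||^2 and g = lam*||.||_1, as well as the step-size rule, are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1: componentwise shrinkage (illustrative choice of g)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_linearized_admm(M, b, A, lam, beta=10.0, tau=None, iters=2000):
    """Sketch of proximal linearized ADMM for min_x 0.5||Mx-b||^2 + lam||Ax||_1.

    Augmented Lagrangian: L(x, z, y) = f(x) + g(z) + <y, Ax - z>
                                       + (beta/2)||Ax - z||^2.
    The x-update linearizes the smooth part of L; the z-update is an
    exact prox step; y is updated by dual ascent.
    """
    n, p = M.shape[1], A.shape[0]
    x, z, y = np.zeros(n), np.zeros(p), np.zeros(p)
    # Step size 1/L, where L bounds the Lipschitz constant of the
    # gradient of x -> f(x) + (beta/2)||Ax - z||^2 (a safe heuristic).
    L = np.linalg.norm(M.T @ M, 2) + beta * np.linalg.norm(A.T @ A, 2)
    tau = tau if tau is not None else 1.0 / L
    for _ in range(iters):
        # x-update: gradient step on the smooth part of the augmented Lagrangian
        grad = M.T @ (M @ x - b) + A.T @ y + beta * A.T @ (A @ x - z)
        x = x - tau * grad
        # z-update: exact prox of g at the shifted point
        z = soft_threshold(A @ x + y / beta, lam / beta)
        # dual ascent on the multiplier
        y = y + beta * (A @ x - z)
    return x, z, y
```

In the paper's schemes the plain gradient step above is replaced by a proximal step in a variable metric, which is what enables recovering other splitting algorithms as special cases.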

Keywords: nonconvex complexly structured optimization problems, alternating direction method of multipliers, proximal splitting algorithms, variable metric, convergence analysis, convergence rates, Kurdyka-Lojasiewicz property, Lojasiewicz exponent

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)


Download: [PDF]

Entry Submitted: 01/06/2018
Entry Accepted: 01/06/2018
Entry Last Modified: 01/06/2018

