Optimization Online

A fundamental proof for the convergence analysis of the alternating direction method of multipliers for weakly convex optimization

Tao Zhang(szw2628***at***sina.com)
Shen Zhengwei(ijwmip***at***ustb.edu.cn)

Abstract: The convergence analysis of alternating direction method of multipliers (ADMM) methods for convex/nonconvex combinational optimization has been well established in the literature. Considering the extensive applications of weakly convex functions in signal processing and machine learning (e.g., the special issue \textit{DC-Theory, Algorithms and Applications}, Mathematical Programming, 169(1):1-336, 2018), in this paper we investigate the convergence of the ADMM algorithm for the strongly and weakly convex combinational optimization (SWCCO) problem. Specifically, we first establish the convergence of the iterative sequence generated by SWCCO-ADMM under a mild regularity condition; we then establish an $o(1/k)$ sublinear convergence rate for the SWCCO-ADMM algorithm under the same condition, and a linear convergence rate by further imposing a gradient Lipschitz continuity condition on the objective function. The techniques used in the convergence analysis are fundamental, and we accomplish global convergence without using the Kurdyka-Łojasiewicz (KL) inequality, which is common but involved in proofs for nonconvex ADMM.
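
A minimal numerical sketch (not the authors' implementation) of the SWCCO setting described above: two-block ADMM applied to a strongly convex least-squares term plus a weakly convex minimax concave penalty (MCP), whose weak-convexity modulus is 1/gamma. The problem instance, parameter values, and all function names below are illustrative assumptions, not part of the paper.

import numpy as np

def mcp_prox(v, lam, gamma, rho):
    # Proximal map of the MCP penalty with step 1/rho; the penalty is
    # (1/gamma)-weakly convex, so rho*gamma > 1 is assumed to keep the
    # z-subproblem strongly convex.
    out = v.copy()
    small = np.abs(v) <= lam / rho
    mid = ~small & (np.abs(v) <= gamma * lam)
    out[small] = 0.0
    out[mid] = np.sign(v[mid]) * (np.abs(v[mid]) - lam / rho) / (1.0 - 1.0 / (rho * gamma))
    return out  # for |v| > gamma*lam the penalty is flat, so the prox is the identity

def admm_swcco(A, b, lam=0.1, gamma=4.0, mu=1.0, rho=2.0, iters=500):
    # min_x 0.5*||Ax - b||^2 + (mu/2)*||x||^2 + MCP(z)  s.t.  x - z = 0
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)   # u is the scaled dual variable
    M = A.T @ A + (mu + rho) * np.eye(n)                # x-step matrix, formed once
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))     # strongly convex x-step
        z = mcp_prox(x + u, lam, gamma, rho)            # weakly convex z-step
        u = u + x - z                                   # dual ascent on x - z = 0
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 100))
    x_true = np.zeros(100); x_true[:5] = 3.0
    b = A @ x_true + 0.01 * rng.standard_normal(60)
    x_hat = admm_swcco(A, b)
    print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-3))

Requiring rho*gamma > 1 keeps the z-subproblem strongly convex despite the weak convexity of the penalty, which mirrors the kind of strong/weak convexity balance the paper's analysis relies on.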

Keywords: nonconvex optimization; weakly convex function; ADMM; sublinear and linear convergence

Category 1: Convex and Nonsmooth Optimization (Generalized Convexity/Monotonicity)

Category 2: Global Optimization

Citation:

Download: [PDF]

Entry Submitted: 08/13/2018
Entry Accepted: 08/13/2018
Entry Last Modified: 08/13/2018
