Optimization Online
Variational analysis perspective on linear convergence of some first-order methods for nonsmooth convex optimization problems

Jane Ye (janeye***at***uvic.ca)
Xiaoming Yuan (xmyuan***at***hku.hk)
Shangzhi Zeng (zengsz***at***connect.hku.hk)
Jin Zhang (zhangj9***at***sustech.edu.cn)

Abstract: We study, from a variational analysis perspective, the linear convergence of several first-order methods, namely the proximal gradient method (PGM), the proximal alternating linearized minimization (PALM) algorithm, and the randomized block coordinate proximal gradient method (R-BCPGM), for minimizing the sum of a smooth convex function and a nonsmooth convex function. We introduce a new analytic framework built on variational-analysis notions such as error bounds, calmness, metric subregularity, and bounded metric subregularity. This perspective yields concrete sufficient conditions for verifying linear convergence, together with applicable approaches for computing linear convergence rates of these first-order methods, for a class of structured convex problems in which the smooth part of the objective is the composition of a smooth strongly convex function with a linear map. We show that these conditions hold automatically, and that the calmness/metric subregularity modulus is practically computable, in important applications such as LASSO, fused LASSO, and group LASSO; consequently, linear convergence of the first-order methods for these applications is guaranteed and the convergence rates can be calculated. The new perspective thus allows us to improve some existing results and to obtain results previously unknown in the literature. In particular, we improve the linear convergence results for the PGM and the PALM on the structured convex problem, with a computable error bound estimate, and we prove that the R-BCPGM converges linearly on the structured convex problem when the nonsmooth part of the objective is the group LASSO regularizer.
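
For readers unfamiliar with the setting, here is a minimal sketch (illustrative only, not taken from the paper) of the proximal gradient method applied to LASSO, min_x 0.5*||Ax - b||^2 + lam*||x||_1. The smooth part is the composition of the strongly convex function 0.5*||. - b||^2 with the linear map A, matching the structured class described in the abstract; the problem data, step-size choice, and function names below are assumptions for illustration.

```python
# Minimal proximal gradient method (PGM / ISTA) sketch for LASSO:
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
# Illustrative assumptions: dimensions, data, and iteration count are arbitrary.
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau*||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, num_iters=500):
    x = np.zeros(A.shape[1])
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the gradient of the smooth part.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)          # gradient of 0.5*||A x - b||^2
        x = soft_threshold(x - step * grad, step * lam)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)   # sparse ground truth
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = proximal_gradient_lasso(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-4))
```

Under the error bound/calmness conditions analyzed in the paper, iterations of this type converge linearly for LASSO; the sketch above only fixes the algorithmic template the abstract refers to.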

Keywords: Metric subregularity, calmness, proximal gradient method, proximal alternating linearized minimization, randomized block coordinate proximal gradient method, linear convergence, variational analysis, machine learning, statistics

Category 1: Convex and Nonsmooth Optimization

Citation: First version: February 2018 / Second version: October 2018 / Third version: October 2019

Entry Submitted: 10/23/2018
Entry Accepted: 10/23/2018
Entry Last Modified: 10/06/2019
