Optimization Online

Error bounds, quadratic growth, and linear convergence of proximal methods

Dmitriy Drusvyatskiy(ddrusv***at***uw.edu)
Adrian S. Lewis(aslewis***at***orie.cornell.edu)

Abstract: We show that the error bound property, postulating that the step lengths of the proximal gradient method linearly bound the distance to the solution set, is equivalent to a standard quadratic growth condition. We exploit this equivalence in an analysis of asymptotic linear convergence of the proximal gradient algorithm for structured problems that lack strong convexity. More generally still, we analyze local linear convergence guarantees of a proximal method for minimizing compositions of convex functions with smooth mappings.
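A minimal sketch (illustrative only, not taken from the paper; the lasso-type instance, step size, and function names below are assumptions) of the quantity the abstract refers to: the proximal gradient method iterates x_{k+1} = prox_{t g}(x_k - t grad f(x_k)), and the error bound property asserts that the step lengths ||x_{k+1} - x_k|| linearly bound the distance from x_k to the solution set.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, t, iters=1000, tol=1e-10):
    # Minimize (1/2)||Ax - b||^2 + lam*||x||_1 with a fixed step size t.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)               # gradient of the smooth part
        x_next = soft_threshold(x - t * grad, t * lam)
        step_len = np.linalg.norm(x_next - x)  # the "step length" in the error bound
        x = x_next
        if step_len < tol:                     # small step => near the solution set under the error bound
            break
    return x

# Hypothetical usage: random instance, step size 1/L with L the Lipschitz
# constant of the gradient of the smooth part (the squared spectral norm of A).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
x_opt = proximal_gradient(A, b, lam=0.1, t=1.0 / np.linalg.norm(A, 2) ** 2)

The instance is deliberately underdetermined (more columns than rows), so the objective is convex but not strongly convex; this is the "structured problems that lack strong convexity" setting in which the abstract's error bound/quadratic growth equivalence still yields asymptotic linear convergence.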

Keywords: proximal point method, error bound, subdifferential, quadratic growth, tilt-stability

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Citation:

Download: [PDF]

Entry Submitted: 02/23/2016
Entry Accepted: 02/23/2016
Entry Last Modified: 02/23/2016
