Optimization Online

General Convergence Rates Follow From Specialized Rates Assuming Growth Bounds

Benjamin Grimmer (bdg79***at***cornell.edu)

Abstract: In the analysis of first-order methods, assuming the existence of a quadratic growth bound (a generalization of strong convexity) often facilitates a much stronger convergence analysis. Hence the analysis is frequently done twice: once for the general case and once for the growth-bounded case. We give a meta-theorem for deriving general convergence rates from rates that assume a growth lower bound. Applying this simple but conceptually powerful tool to the proximal point method, the subgradient method, and the bundle method immediately recovers their known convergence rates for general convex optimization problems from their specialized rates. Future work studying first-order methods can therefore assume growth bounds for the sake of analysis without limiting the generality of the results. Our results can be applied to lift any rate based on a Hölder growth bound. As a consequence, guarantees for minimizing sharp functions imply guarantees both for general functions and for those satisfying quadratic growth.
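For readers unfamiliar with the terminology, the following is a minimal sketch of the growth conditions the abstract refers to, written in standard notation; the symbols mu, p, and X^* are generic conventions here, not taken from the paper, whose exact constants may differ. A function f satisfies Hölder growth when

    f(x) - \min_y f(y) \;\geq\; \mu \,\mathrm{dist}(x, X^*)^{p} \qquad \text{for all } x,

where X^* = \operatorname{argmin} f, \mu > 0, and p \geq 1. The case p = 2 is quadratic growth (a consequence of mu-strong convexity), and p = 1 is sharpness; the abstract's claim is that rates proved under any such bound can be lifted to rates for general convex functions.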

Keywords: First-Order Method, Convergence, Quadratic Growth

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Citation:

Download: [PDF]

Entry Submitted: 05/15/2019
Entry Accepted: 05/16/2019
Entry Last Modified: 05/15/2019
