Optimization Online

Sharpness, Restart and Acceleration.

Vincent Roulet(vincent.roulet***at***inria.fr)
Alexandre d'Aspremont(aspremon***at***ens.fr)

Abstract: The Łojasiewicz inequality shows that sharpness bounds on the minimum of convex optimization problems hold almost generically. Here, we show that sharpness directly controls the performance of restart schemes. The constants quantifying sharpness are of course unobservable, but we show that optimal restart strategies are fairly robust, and that searching for the best scheme only increases complexity by a logarithmic factor compared to the optimal bound. Overall then, restart schemes generically accelerate accelerated methods.
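To make the mechanism concrete, here is a minimal Python sketch of the kind of scheme the abstract refers to: an accelerated gradient method restarted on a fixed schedule, with the restart period chosen by a log-scale grid search since the sharpness constants are unobserved. The quadratic objective, the grid of periods, and the iteration budget are illustrative assumptions, not the paper's algorithm or experiments.

```python
import numpy as np

def accelerated_gradient(grad, x0, L, n_iters):
    """Nesterov/FISTA-style accelerated gradient method for an L-smooth convex f."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iters):
        x_next = y - grad(y) / L                          # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

def fixed_restart(grad, x0, L, period, total_iters):
    """Run the accelerated method in blocks of `period` iterations, resetting momentum each block."""
    x = x0.copy()
    for _ in range(total_iters // period):
        x = accelerated_gradient(grad, x, L, period)
    return x

# Illustrative least-squares objective f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
L = np.linalg.norm(A, 2) ** 2                # Lipschitz constant of the gradient
grad = lambda x: A.T @ (A @ x - b)
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2

# With the sharpness constants unknown, try restart periods on a log-scale grid;
# the abstract's point is that this search only costs a logarithmic factor.
x0 = np.zeros(20)
results = {k: f(fixed_restart(grad, x0, L, k, 512)) for k in [4, 8, 16, 32, 64, 128]}
best_period = min(results, key=results.get)
print("best restart period:", best_period, "objective:", results[best_period])
```

The periodic momentum reset is what lets the wrapped method exploit a sharpness bound when one holds; the grid over periods stands in for the unknown optimal restart frequency.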

Keywords:

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Citation:

Download: [PDF]

Entry Submitted: 05/30/2017
Entry Accepted: 05/30/2017
Entry Last Modified: 05/30/2017
