Optimization Online


A Simple Nearly-Optimal Restart Scheme For Speeding-Up First-Order Methods

James Renegar (renegar***at***cornell.edu)
Benjamin Grimmer (bdg79***at***cornell.edu)

Abstract: We present a simple scheme for restarting first-order methods for convex optimization problems. Restarts are triggered solely by achieving specified decreases in objective value, the specified amounts being the same for all optimization problems. Unlike existing restart schemes, ours makes no attempt to learn parameter values characterizing the structure of an optimization problem, nor does it require any special information that would not be available in practice (unless the first-order method employed in the scheme itself requires such information). As immediate corollaries to the main theorems, we show that when some well-known first-order methods are employed in the scheme, the resulting complexity bounds are nearly optimal for particular, yet quite general, classes of problems.
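To make the triggering rule concrete, here is a minimal Python sketch, not the authors' algorithm: a toy momentum method is restarted, discarding its internal state, each time the objective has decreased by a fixed amount since the last restart. The names MomentumMethod, restarted_run, and the single decrease amount delta are illustrative assumptions; the paper specifies a universal sequence of decrease amounts rather than a single delta, and its full scheme is not reproduced here.

```python
class MomentumMethod:
    """Toy inner first-order method: gradient descent with heavy-ball
    momentum. Restarting means constructing a fresh instance, which
    discards the accumulated momentum."""

    def __init__(self, grad, x, step_size=0.1, beta=0.9):
        self.grad = grad
        self.x = x
        self.step_size = step_size
        self.beta = beta
        self.v = 0.0  # momentum state; a fresh instance starts with none

    def step(self):
        self.v = self.beta * self.v - self.step_size * self.grad(self.x)
        self.x = self.x + self.v
        return self.x


def restarted_run(f, grad, x0, delta, max_iters=1000):
    """Run the inner method, restarting it from the current point whenever
    the objective has decreased by delta since the last restart. Only
    observed objective values are used to trigger restarts; no problem
    parameters are estimated."""
    method = MomentumMethod(grad, x0)
    x, anchor_val = x0, f(x0)
    for _ in range(max_iters):
        x = method.step()
        if f(x) <= anchor_val - delta:
            method = MomentumMethod(grad, x)  # restart: momentum discarded
            anchor_val = f(x)
    return x


# Usage on the 1-D quadratic f(x) = x^2 (gradient 2x):
x_final = restarted_run(lambda x: x * x, lambda x: 2.0 * x, x0=5.0, delta=1.0)
```

Restarting matters here precisely because the inner method carries state: discarding stale momentum after a guaranteed amount of progress is what a decrease-triggered restart accomplishes.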

Keywords: restart, first-order methods, convex optimization

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Category 2: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Citation: arXiv:1803.00151

Download: [PDF]

Entry Submitted: 02/27/2018
Entry Accepted: 02/27/2018
Entry Last Modified: 03/03/2018
