Optimization Online


An optimal first order method based on optimal quadratic averaging

Dmitriy Drusvyatskiy (ddrusv***at***uw.edu)
Maryam Fazel (mfazel***at***uw.edu)
Scott Roy (scottroy***at***uw.edu)

Abstract: In a recent paper, Bubeck, Lee, and Singh introduced a new first order method for minimizing smooth strongly convex functions. Their geometric descent algorithm, largely inspired by the ellipsoid method, enjoys the optimal linear rate of convergence. Motivated by their work, we propose a close variant that iteratively maintains a quadratic global under-estimator of the objective function, whose minimal value approaches the true minimum at an optimal rate. The resulting intuitive scheme comes equipped with a natural stopping criterion and can be numerically accelerated by using accumulated information.
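The abstract's core idea can be illustrated with a simplified sketch (not the paper's exact accelerated method, which attains the optimal rate through more careful step selection). Here, for a mu-strongly convex, L-smooth function, each gradient evaluation yields one global quadratic under-estimator, two such quadratics are combined by optimal averaging, and the gap between the best function value seen and the under-estimator's minimum serves as the stopping criterion. The function names and step choices are illustrative assumptions.

```python
import numpy as np

def quad_avg_descent(f, grad, mu, L, x0, tol=1e-8, max_iter=500):
    """Simplified quadratic-averaging sketch (illustrative, not the paper's
    exact scheme).  Maintains a global lower bound of the form
        Q(x) = v + (mu/2) * ||x - c||^2  <=  f(x)  for all x,
    which mu-strong convexity supplies at each iterate:
        v = f(x) - ||g||^2 / (2*mu),  c = x - g/mu,  g = grad(x)."""
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    v, c = f(x) - g.dot(g) / (2 * mu), x - g / mu
    best = f(x)
    for _ in range(max_iter):
        x = x - g / L                  # gradient step drives the upper sequence
        best = min(best, f(x))
        if best - v <= tol:            # natural stopping criterion: gap <= tol
            break
        g = grad(x)
        vb, cb = f(x) - g.dot(g) / (2 * mu), x - g / mu  # new lower quadratic
        # Optimally average the old and new quadratics: the convex combination
        # lam*Q_old + (1-lam)*Q_new is again of the same form, with
        #   c' = lam*c + (1-lam)*cb
        #   v' = lam*v + (1-lam)*vb + (mu/2)*lam*(1-lam)*||c - cb||^2,
        # and lam in [0, 1] is chosen to maximize the new minimal value v'.
        d2 = np.sum((c - cb) ** 2)
        if d2 > 0:
            lam = np.clip(0.5 + (v - vb) / (mu * d2), 0.0, 1.0)
        else:
            lam = 1.0 if v >= vb else 0.0
        v = lam * v + (1 - lam) * vb + 0.5 * mu * lam * (1 - lam) * d2
        c = lam * c + (1 - lam) * cb
    return best, v

# Usage on a simple strongly convex quadratic (mu=1, L=10):
A = np.diag([1.0, 10.0])
best, v = quad_avg_descent(lambda x: 0.5 * x.dot(A.dot(x)),
                           lambda x: A.dot(x),
                           mu=1.0, L=10.0, x0=np.ones(2))
```

Since the averaged value `v'` is at least `max(v, vb)`, the lower bound is nondecreasing and sandwiched below the true minimum, so `best - v` certifies the suboptimality of the current best iterate.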

Keywords: first order method, accelerated gradient method, convex quadratic

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Citation: 20 pages


Entry Submitted: 04/22/2016
Entry Accepted: 04/22/2016
Entry Last Modified: 04/25/2016
