Optimization Online


Oracle Complexity of Second-Order Methods for Smooth Convex Optimization

Ohad Shamir(ohad.shamir***at***weizmann.ac.il)
Ron Shiff(ron.shiff***at***weizmann.ac.il)

Abstract: Second-order methods, which utilize gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we study the oracle complexity of such methods, or, equivalently, the number of iterations required to optimize a function to a given accuracy. Focusing on smooth and convex functions, we derive (to the best of our knowledge) the first algorithm-independent lower bounds for such methods. These bounds shed light on the limits of attainable performance with second-order methods, and on the cases in which they can or cannot require fewer iterations than gradient-based methods, whose oracle complexity is much better understood.
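To make the oracle model concrete: in the second-order setting, each query at a point returns the function value, gradient, and Hessian there, and oracle complexity counts how many such queries are needed to reach a target accuracy. The following is a minimal one-dimensional sketch of this model, using a classical Newton iteration on an illustrative smooth convex function of our own choosing; it is not the paper's construction, only an example of the query model being analyzed.

```python
import math

def oracle(x):
    """Second-order oracle for the smooth convex f(x) = exp(x) + exp(-x).

    One query returns the triple (f(x), f'(x), f''(x)); oracle
    complexity counts the number of such queries.
    """
    return (math.exp(x) + math.exp(-x),   # f(x)
            math.exp(x) - math.exp(-x),   # f'(x)
            math.exp(x) + math.exp(-x))   # f''(x)

def newton(x0, eps=1e-10, max_calls=50):
    """Plain Newton's method; returns (final point, number of oracle calls)."""
    x, calls = x0, 0
    while calls < max_calls:
        _, g, h = oracle(x)
        calls += 1
        if abs(g) <= eps:
            break
        x -= g / h   # Newton step: move to the minimizer of the local quadratic model
    return x, calls

x_star, n_calls = newton(1.0)
print(x_star, n_calls)   # converges to the minimizer x = 0 in a handful of oracle calls
```

A first-order method sees only (f(x), f'(x)) per query, which is why its attainable iteration counts can differ from the second-order case; the paper's lower bounds quantify that gap.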

Keywords: oracle complexity, second order methods, lower bounds

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Nonlinear Optimization

Citation: Weizmann Institute, May 2017

Download: [PDF]

Entry Submitted: 05/22/2017
Entry Accepted: 05/23/2017
Entry Last Modified: 05/22/2017
