Optimization Online


An Investigation of Newton-Sketch and Subsampled Newton Methods

Albert S. Berahas (albertberahas***at***u.northwestern.edu)
Raghu Bollapragada (raghu.bollapragada***at***u.northwestern.edu)
Jorge Nocedal (j-nocedal***at***northwestern.edu)

Abstract: The concepts of sketching and subsampling have recently received much attention from the optimization and statistics communities. In this paper, we study Newton-Sketch and Subsampled Newton (SSN) methods for the finite-sum optimization problem. We consider practical versions of the two methods in which the Newton equations are solved approximately using the conjugate gradient (CG) method or a stochastic gradient iteration. We establish new complexity results for the SSN-CG method that exploit the spectral properties of CG. Controlled numerical experiments compare the relative strengths of Newton-Sketch and SSN methods and show that for many finite-sum problems, they are far more efficient than SVRG, a popular first-order method.
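To make the flavor of the SSN-CG approach concrete, here is a minimal sketch of one subsampled Newton step for a least-squares finite-sum objective, with the Newton system solved approximately by a few CG iterations. This is an illustration only, not the paper's exact algorithm: the loss, the sample/step-size parameters (`sample_size`, `cg_steps`, `step`), and the data are placeholder assumptions.

```python
import numpy as np

def cg(hess_vec, g, max_iter):
    """Solve H p = g approximately with at most max_iter CG iterations,
    accessing H only through Hessian-vector products."""
    p = np.zeros_like(g)
    r = g.copy()           # residual g - H p (p = 0 initially)
    d = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        hd = hess_vec(d)
        alpha = rs / (d @ hd)
        p += alpha * d
        r -= alpha * hd
        rs_new = r @ r
        if np.sqrt(rs_new) < 1e-10:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

def ssn_cg_step(A, b, w, sample_size, cg_steps, step=1.0, rng=None):
    """One SSN-CG iteration for min_w (1/2n)||A w - b||^2:
    full gradient, subsampled Hessian, inexact CG solve."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = A.shape[0]
    grad = A.T @ (A @ w - b) / n                        # full gradient
    S = rng.choice(n, size=sample_size, replace=False)  # Hessian subsample
    AS = A[S]
    # Hessian-vector products use only the subsample: H_S v = A_S^T A_S v / |S|
    hess_vec = lambda v: AS.T @ (AS @ v) / sample_size
    p = cg(hess_vec, grad, cg_steps)                    # approximate Newton direction
    return w - step * p

# Usage on a small synthetic least-squares problem (illustrative data).
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
b = A @ np.ones(5) + 0.01 * rng.standard_normal(200)
w = np.zeros(5)
for _ in range(5):
    w = ssn_cg_step(A, b, w, sample_size=50, cg_steps=3, rng=rng)
```

The key point, mirrored in the paper's setting, is that the gradient is computed exactly while the Hessian is estimated from a small subsample, and CG only ever needs Hessian-vector products, so the subsampled Hessian is never formed explicitly.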

Keywords: Subsampling, Sketching, Machine Learning, Newton Methods

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Nonlinear Optimization

Category 3: Stochastic Programming

Citation: Northwestern University, May 2017

Download: [PDF]

Entry Submitted: 05/17/2017
Entry Accepted: 05/17/2017
Entry Last Modified: 07/24/2018
