Optimization Online


Exact and Inexact Subsampled Newton Methods for Optimization

Raghu Bollapragada(vijayabollapragada2014***at***u.northwestern.edu)
Richard Byrd(richard***at***cs.colorado.edu)
Jorge Nocedal(j-nocedal***at***northwestern.edu)

Abstract: The paper studies the solution of stochastic optimization problems in which approximations to the gradient and Hessian are obtained through subsampling. We first consider Newton-like methods that employ these approximations and discuss how to coordinate the accuracy in the gradient and Hessian to yield a superlinear rate of convergence in expectation. The second part of the paper analyzes an inexact Newton method that solves linear systems approximately using the conjugate gradient (CG) method, and that samples the Hessian and not the gradient (the gradient is assumed to be exact). We provide a complexity analysis for this method based on the properties of the CG iteration and the quality of the Hessian approximation, and compare it with a method that employs a stochastic gradient iteration instead of the CG method. We report preliminary numerical results that illustrate the performance of inexact subsampled Newton methods on machine learning applications based on logistic regression.

Keywords: Machine Learning, Subsampling, Newton's Method

Category 1: Nonlinear Optimization

Category 2: Stochastic Programming

Category 3: Convex and Nonsmooth Optimization (Convex Optimization)


Download: [PDF]

Entry Submitted: 09/27/2016
Entry Accepted: 09/27/2016
Entry Last Modified: 09/27/2016
