Optimization Online


Subsampled Inexact Newton methods for minimizing large sums of convex functions

S. Bellavia (stefania.bellavia***at***unifi.it)
N. Krejic (natasak***at***uns.ac.rs)
N. Krklec Jerinkic (natasa.krklec***at***dmi.uns.ac.rs)

Abstract: This paper deals with the minimization of large sums of convex functions by Inexact Newton (IN) methods employing subsampled Hessian approximations. The Conjugate Gradient method is used to compute the inexact Newton step, and global convergence is enforced by a nonmonotone line search procedure. The aim is to obtain methods with affordable costs and fast convergence. Assuming strictly convex functions, a set of rules for the forcing parameters and subsample sizes is derived that ensures local linear/superlinear convergence of the proposed methods. The random choice of the Hessian subsample is also investigated, and bounds ensuring a good approximation of the true Hessian with high probability are provided. For such Hessian approximations and suitable forcing terms, convergence in mean square is proved for both finite and infinite sums of functions. Finally, convergence of IN methods is investigated in the case of a sum of convex functions with a strongly convex objective function. Numerical results on well-known binary classification problems are also given. Adaptive strategies for selecting the forcing terms and the Hessian subsample size, stemming from the theoretical analysis, are employed, and the numerical results show that they yield effective IN methods.
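To make the abstract's ingredients concrete, the following is a minimal sketch of one possible subsampled Inexact Newton iteration for a regularized binary classification (logistic loss) problem: the gradient is computed over all terms, the Hessian is approximated on a random subsample, CG solves the Newton system only up to a forcing-term tolerance, and a line search ensures decrease. All names, parameter values, and the simple monotone Armijo backtracking (the paper uses a nonmonotone line search) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def loss(A, b, x, lam):
    """Regularized logistic loss: (1/N) sum log(1+exp(-b_i a_i^T x)) + lam/2 ||x||^2."""
    z = A @ x
    return np.mean(np.log1p(np.exp(-b * z))) + 0.5 * lam * (x @ x)

def cg(Hv, g, tol, max_iter=50):
    """CG for H d = -g, stopped when ||H d + g|| <= tol * ||g|| (inexact Newton forcing term)."""
    d = np.zeros_like(g)
    r = -g.copy()          # residual of H d = -g at d = 0
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = Hv(p)
        alpha = rs / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(g):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

def subsampled_inexact_newton(A, b, lam=1e-2, n_iters=20, sample=0.1, eta=0.5, seed=0):
    """Illustrative sketch: full gradient, subsampled Hessian, CG step, Armijo backtracking."""
    rng = np.random.default_rng(seed)
    N, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        z = A @ x
        s = -b / (1.0 + np.exp(b * z))              # per-term derivative of the logistic loss
        g = A.T @ s / N + lam * x                   # full gradient
        m = max(1, int(sample * N))                 # Hessian subsample size (assumed fixed here)
        idx = rng.choice(N, size=m, replace=False)
        Asub, bsub = A[idx], b[idx]
        def Hv(v):
            zs = Asub @ x
            w = np.exp(bsub * zs) / (1.0 + np.exp(bsub * zs)) ** 2  # sigmoid'(b_i a_i^T x)
            return Asub.T @ (w * (Asub @ v)) / m + lam * v          # subsampled Hessian-vector product
        d = cg(Hv, g, tol=eta)                      # eta: forcing term (fixed; the paper adapts it)
        t, f0 = 1.0, loss(A, b, x, lam)             # simple monotone Armijo backtracking
        while loss(A, b, x + t * d, lam) > f0 + 1e-4 * t * (g @ d):
            t *= 0.5
            if t < 1e-8:
                break
        x = x + t * d
    return x
```

The forcing term `eta` and the subsample fraction `sample` are kept constant in this sketch; the paper's adaptive rules tighten the CG tolerance and grow the Hessian subsample as the iterates approach the solution to obtain superlinear local convergence.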

Keywords: Inexact Newton, subsampled Hessian, superlinear convergence, global convergence, mean square convergence

Category 1: Nonlinear Optimization (Unconstrained Optimization)

Category 2: Convex and Nonsmooth Optimization (Convex Optimization)


Download: [PDF]

Entry Submitted: 01/23/2018
Entry Accepted: 01/23/2018
Entry Last Modified: 05/01/2019
