Optimization Online


A Newton-CG Algorithm with Complexity Guarantees for Smooth Unconstrained Optimization

Clément W. Royer (croyer2***at***wisc.edu)
Michael O'Neill (moneill***at***cs.wisc.edu)
Stephen J. Wright (swright***at***cs.wisc.edu)

Abstract: We consider minimization of a smooth nonconvex objective function using an iterative algorithm based on Newton's method and the linear conjugate gradient method, with explicit detection and use of negative curvature directions for the Hessian of the objective function. The algorithm closely tracks Newton-conjugate gradient procedures developed in the 1980s, but includes enhancements that allow worst-case complexity results to be proved for convergence to points that satisfy approximate first-order and second-order optimality conditions. The complexity results match the best known results in the literature for second-order methods.
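The core inner loop described in the abstract, linear CG applied to the Newton system with an explicit check for negative curvature, can be sketched as follows. This is an illustrative simplification, not the authors' capped-CG procedure: the function name `newton_cg_step`, the curvature tolerance `eps`, and the return flags are all assumptions made for this sketch.

```python
import numpy as np

def newton_cg_step(H, g, eps=1e-6, max_iter=100):
    """Run linear CG on the Newton system H d = -g, but stop and
    return a negative-curvature direction if one is detected.
    Illustrative sketch only; the paper's capped-CG method differs."""
    n = g.shape[0]
    d = np.zeros(n)
    r = g.copy()          # residual of H d + g (d = 0 initially)
    p = -r                # first CG search direction
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv < eps * (p @ p):
            # Curvature along p is negative (or too small): return p,
            # signed so that it is a descent direction for g.
            if p @ g > 0:
                p = -p
            return p, 'negative_curvature'
        alpha = (r @ r) / curv          # exact CG step length
        d = d + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) < eps * np.linalg.norm(g):
            return d, 'newton'          # inexact Newton step found
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p           # standard CG direction update
        r = r_new
    return d, 'max_iter'
```

On a positive definite Hessian this reduces to ordinary CG and returns an (inexact) Newton step; on an indefinite Hessian it returns a direction of negative curvature, which an outer loop would then exploit, e.g. via a line search, as in the enhancements the abstract describes.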

Keywords: smooth nonconvex optimization, Newton's method, conjugate gradient method, optimality conditions, worst-case complexity.

Category 1: Nonlinear Optimization (Unconstrained Optimization)

Citation: Technical Report, March 2018.

Entry Submitted: 03/07/2018
Entry Accepted: 03/07/2018
Entry Last Modified: 11/12/2018
