Optimization Online
An example of slow convergence for Newton's method on a function with globally Lipschitz continuous Hessian

Coralia Cartis(coralia.cartis***at***ed.ac.uk)
Nicholas I.M. Gould(nick.gould***at***stfc.ac.uk)
Philippe L. Toint(philippe.toint***at***unamur.be)

Abstract: An example is presented in which Newton's method for unconstrained minimization, applied to find an $\epsilon$-approximate first-order critical point of a smooth function, takes a multiple of $\epsilon^{-2}$ iterations and function evaluations to terminate — as many as the steepest-descent method requires in its worst case. The novel feature of the proposed example is that the objective function has a globally Lipschitz-continuous Hessian, whereas a previous example published by the same authors ensured this critical property only along the path of iterates, which is impossible to verify a priori.
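The termination rule the abstract refers to — stop at the first iterate $x_k$ with $\|\nabla f(x_k)\| \le \epsilon$ — can be sketched as follows. This is a minimal illustration of plain Newton's method with that stopping test, not the paper's construction; the toy function $f(x) = x^4$ (our choice, not the authors') merely shows that Newton iterations can contract slowly when the Hessian degenerates at the minimizer.

```python
import numpy as np

def newton(grad, hess, x0, eps=1e-6, max_iter=10_000):
    """Plain Newton's method for unconstrained minimization, terminating
    at an eps-approximate first-order critical point, i.e. the first
    iterate x with ||grad f(x)|| <= eps. Returns (x, iteration count)."""
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    for k in range(max_iter):
        g = np.atleast_1d(grad(x))
        if np.linalg.norm(g) <= eps:
            return x, k
        # Full Newton step: solve H(x) s = -g(x), then x <- x + s.
        s = np.linalg.solve(np.atleast_2d(hess(x)), -g)
        x = x + s
    return x, max_iter

# Toy illustration (not the paper's example): for f(x) = x^4 the Hessian
# is singular at the minimizer x* = 0, and the Newton iteration reduces
# to x_{k+1} = (2/3) x_k -- only a linear contraction, so the iteration
# count grows as eps shrinks.
grad = lambda x: 4.0 * x**3
hess = lambda x: np.diag(12.0 * x.ravel()**2)
x, iters = newton(grad, hess, [1.0], eps=1e-8)
```

The paper's point is stronger than this toy: even with a globally Lipschitz-continuous Hessian, the number of iterations needed to reach the stopping test above can grow like $\epsilon^{-2}$.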

Keywords: Nonlinear optimization, evaluation complexity, Newton's method

Category 1: Nonlinear Optimization (Unconstrained Optimization )

Category 2: Nonlinear Optimization (Nonlinear Systems and Least-Squares )

Citation: @techreport{CartGoulToin13b, author = {C. Cartis and N. I. M. Gould and Ph. L. Toint}, title = {An example of slow convergence for {N}ewton's method on a function with globally {L}ipschitz continuous {H}essian}, institution = {naXys}, address = {University of Namur, Namur, Belgium}, number = {naXys-03-2013}, year = {2013}}

Download: [PDF]

Entry Submitted: 05/05/2013
Entry Accepted: 05/05/2013
Entry Last Modified: 05/05/2013
