Optimization Online


Analysis of the BFGS Method with Errors

Yuchen Xie(ycxie***at***u.northwestern.edu)
Richard Byrd(Richard.Byrd***at***colorado.edu)
Jorge Nocedal(j-nocedal***at***northwestern.edu)

Abstract: The classical convergence analysis of quasi-Newton methods assumes that the function and gradients employed at each iteration are exact. In this paper, we consider the case when there are (bounded) errors in both of these computations and establish conditions under which a slight modification of the BFGS algorithm with an Armijo-Wolfe line search converges to a neighborhood of the solution whose size is determined by the size of the errors. One of our results extends the analysis of Byrd and Nocedal (1989), which establishes that, for strongly convex functions, a fraction of the BFGS iterates are good iterates. We present numerical results illustrating the performance of the new BFGS method in the presence of noise.
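The setting in the abstract can be illustrated with a minimal sketch: BFGS with a backtracking Armijo line search applied to a strongly convex quadratic whose function and gradient evaluations carry bounded noise. This is not the authors' exact modification; the names (`f_noisy`, `g_noisy`, `eps_f`, `eps_g`), the relaxed sufficient-decrease test, and the curvature safeguard are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = np.diag(np.arange(1.0, n + 1))   # Hessian of the quadratic; minimizer is x* = 0
eps_f, eps_g = 1e-4, 1e-4            # assumed bounds on function and gradient errors

def f_noisy(x):
    # exact value 0.5 x'Ax plus bounded noise
    return 0.5 * x @ A @ x + eps_f * rng.uniform(-1, 1)

def g_noisy(x):
    # exact gradient Ax plus bounded noise
    return A @ x + eps_g * rng.uniform(-1, 1, size=n)

def bfgs_noisy(x, max_iter=100):
    H = np.eye(n)                    # inverse Hessian approximation
    g = g_noisy(x)
    for _ in range(max_iter):
        p = -H @ g                   # quasi-Newton search direction
        # Armijo backtracking, relaxed by 2*eps_f since exact decrease
        # cannot be verified when function values are noisy
        t, fx = 1.0, f_noisy(x)
        while f_noisy(x + t * p) > fx + 1e-4 * t * (g @ p) + 2 * eps_f and t > 1e-10:
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = g_noisy(x_new)
        y = g_new - g
        # Update H only when s'y > 0, which safeguards positive
        # definiteness of the approximation under noisy gradients
        if s @ y > 1e-12:
            rho = 1.0 / (s @ y)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

x_final = bfgs_noisy(rng.standard_normal(n))
print(np.linalg.norm(x_final))       # settles in a small neighborhood of x* = 0
```

Consistent with the paper's theme, the iterates do not converge to the exact minimizer; they stagnate in a neighborhood whose radius scales with the noise levels `eps_f` and `eps_g`.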

Keywords: BFGS method, convergence analysis

Category 1: Nonlinear Optimization (Unconstrained Optimization)


Download: [PDF]

Entry Submitted: 01/25/2019
Entry Accepted: 01/25/2019
Entry Last Modified: 01/25/2019
