Optimization Online

Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods

Albert S. Berahas (albertberahas***at***u.northwestern.edu)
Richard H. Byrd (richard.byrd***at***colorado.edu)
Jorge Nocedal (j-nocedal***at***northwestern.edu)

Abstract: This paper presents a finite-difference quasi-Newton method for the minimization of noisy functions. The method takes advantage of the scalability and power of BFGS updating, and employs an adaptive procedure for choosing the differencing interval h based on the noise estimation techniques of Hamming (2012) and Moré and Wild (2011). This noise estimation procedure and the selection of h are inexpensive but not always accurate, so to prevent failures the algorithm incorporates a recovery mechanism that takes appropriate action when the line search procedure is unable to produce an acceptable point. A novel convergence analysis is presented that accounts for the effect of a noisy line search procedure. Numerical experiments comparing the method to a function-interpolating trust-region method are presented.
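
The sketch below is a minimal illustration of the pipeline the abstract describes, not the authors' implementation: estimate the noise level ε_f, choose the forward-difference interval h on the order of (ε_f/μ)^(1/2) (with μ a bound on |f''|, following the Moré–Wild rule), and drive a BFGS iteration with the resulting gradients. The helper names (estimate_noise, fd_gradient, noisy_bfgs) are invented for illustration, the noise estimator is a simplified version of the Moré–Wild difference-table idea, and the recovery mechanism is only indicated by a comment.

```python
import numpy as np
from math import factorial

def estimate_noise(f, x, k=3, delta=1e-6):
    """Crude stand-in for the Hamming / More-Wild noise estimator:
    k-th order differences of f along a short segment damp the smooth
    part of f and leave (scaled) noise behind."""
    m = 2 * k + 1
    d = np.array([f(x + i * delta * np.ones_like(x)) for i in range(m)])
    for _ in range(k):
        d = np.diff(d)
    gamma = factorial(k) ** 2 / factorial(2 * k)  # difference-table scaling
    return np.sqrt(gamma * np.mean(d ** 2))

def fd_gradient(f, x, h):
    """Forward-difference gradient; costs n extra function evaluations."""
    f0, g = f(x), np.empty_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f0) / h
    return g

def noisy_bfgs(f, x0, max_iter=100, mu=1.0, c1=1e-4):
    """Finite-difference BFGS sketch with a noise-relaxed Armijo test."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    eps_f = estimate_noise(f, x)
    # Forward-difference interval h ~ (eps_f / mu)^(1/2), mu a bound on |f''|.
    h = 8.0 ** 0.25 * np.sqrt(eps_f / mu)
    H = np.eye(n)                      # inverse Hessian approximation
    g = fd_gradient(f, x, h)
    for _ in range(max_iter):
        p = -H @ g
        alpha, fx = 1.0, f(x)
        # Armijo test relaxed by 2*eps_f so that noise alone cannot
        # reject a genuinely good step.
        while f(x + alpha * p) > fx + c1 * alpha * (g @ p) + 2 * eps_f:
            alpha *= 0.5
            if alpha < 1e-10:          # a recovery phase (re-estimating
                return x               # eps_f and h) would kick in here
        s = alpha * p
        x_new = x + s
        g_new = fd_gradient(f, x_new, h)
        y = g_new - g
        if s @ y > 1e-10:              # curvature safeguard for BFGS update
            rho = 1.0 / (s @ y)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: a quadratic with small additive stochastic noise.
rng = np.random.default_rng(0)
noisy_f = lambda z: 0.5 * float(z @ z) + 1e-5 * rng.standard_normal()
print(noisy_bfgs(noisy_f, np.array([3.0, -4.0])))
```

The 2·ε_f slack in the Armijo test is one simple way to model a noise-tolerant line search: a step that the smooth part of f would accept is not rejected merely because noise perturbed the two function values involved.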

Keywords: derivative-free optimization, nonlinear optimization, stochastic optimization

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Unconstrained Optimization)

Category 3: Nonlinear Optimization (Other)

Citation: Northwestern University, March 2018

Entry Submitted: 03/27/2018
Entry Accepted: 03/27/2018
Entry Last Modified: 01/08/2019
