Optimization Online
Adaptive Finite-Difference Interval Estimation for Noisy Derivative-Free Optimization

Hao-Jun Shi(hjmshi***at***u.northwestern.edu)
Yuchen Xie(ycxie***at***u.northwestern.edu)
Melody Xuan(qxuan***at***u.northwestern.edu)
Jorge Nocedal(j-nocedal***at***northwestern.edu)

Abstract: A common approach for minimizing a smooth nonlinear function is to employ finite-difference approximations to the gradient. While the differencing interval is easy to choose when the function evaluations are exact, when the function is noisy the optimal choice requires information about the noise level and higher-order derivatives of the function, which is often unavailable. Given the noise level of the function, we propose a bisection search that finds, for any finite-difference scheme, an interval balancing the truncation error, which arises from the error in the Taylor series approximation, and the measurement error, which results from noise in the function evaluation. Our procedure produces near-optimal estimates of the finite-difference interval at low cost without knowledge of the higher-order derivatives. We demonstrate its numerical reliability and accuracy on a set of test problems. When combined with L-BFGS, it yields a robust method for minimizing noisy black-box functions, as illustrated on a subset of unconstrained CUTEst problems with synthetic noise.
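
The balancing idea admits a compact illustration. The Python sketch below is a rough rendering of that idea, not the authors' exact procedure: it bisects on the interval h, using the second-order difference f(x+2h) - 2 f(x+h) + f(x) as a cheap curvature proxy. When that difference is large relative to the noise level eps_f, truncation error dominates and h is shrunk; when it is lost in the noise, h is grown. The starting interval h0 and the acceptance ratios r_lo, r_hi are illustrative assumptions.

    import math
    import random

    def fd_interval(f, x, eps_f, h0=1e-2, r_lo=6.0, r_hi=60.0, max_iter=50):
        """Bisection search for a forward-difference interval h (illustrative sketch).

        The second-order difference D(h) = f(x+2h) - 2 f(x+h) + f(x) ~ h^2 f''(x)
        serves as a curvature proxy.  The forward-difference truncation error is
        roughly |D(h)| / (2h) and the measurement error is roughly 2 eps_f / h, so
        the two are balanced (up to constants) when |D(h)| is a moderate multiple
        of eps_f.  h0, r_lo, r_hi are illustrative choices, not values from the paper.
        """
        h, lo, hi = h0, 0.0, math.inf
        for _ in range(max_iter):
            ratio = abs(f(x + 2*h) - 2*f(x + h) + f(x)) / eps_f
            if ratio > r_hi:                   # truncation dominates: shrink h
                hi = h
                h = 0.5 * (lo + hi)
            elif ratio < r_lo:                 # noise dominates: grow h
                lo = h
                h = 2.0 * h if math.isinf(hi) else 0.5 * (lo + hi)
            else:                              # errors roughly balanced: accept h
                break
        return h

    # Usage on a synthetically noisy function with known noise level eps_f = 1e-6.
    f = lambda x: math.exp(x) + 1e-6 * random.uniform(-1.0, 1.0)
    h = fd_interval(f, 0.0, 1e-6)
    grad = (f(0.0 + h) - f(0.0)) / h           # forward difference; f'(0) = 1

Since |D(h)| ~ h^2 |f''(x)|, accepting h when |D(h)| is a constant multiple of eps_f yields h proportional to sqrt(eps_f / |f''(x)|), the classical scaling of the optimal forward-difference interval, without ever estimating f'' explicitly.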

Keywords: derivative-free optimization, noisy optimization, zeroth-order optimization, nonlinear optimization, finite differences

Category 1: Nonlinear Optimization

Download: [PDF]

Entry Submitted: 10/14/2021
Entry Accepted: 10/14/2021
Entry Last Modified: 10/14/2021
