Optimization Online

Improving the Flexibility and Robustness of Model-Based Derivative-Free Optimization Solvers

Coralia Cartis (cartis@maths.ox.ac.uk)
Jan Fiala (jan.fiala@nag.co.uk)
Benjamin Marteau (benjamin.marteau@nag.co.uk)
Lindon Roberts (lindon.roberts@maths.ox.ac.uk)

Abstract: We present DFO-LS, a software package for derivative-free optimization (DFO) of nonlinear least-squares (LS) problems, with optional bound constraints. Inspired by the Gauss-Newton method, DFO-LS constructs simplified linear models for the residuals. DFO-LS allows flexible initialization for expensive problems, so that it can begin making progress after as few as two objective evaluations. Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems using fewer objective evaluations than are needed for a single gradient evaluation. DFO-LS also features improved robustness to noise, offering sample averaging, regression-based model construction, and multiple restart strategies together with an auto-detection mechanism. Our extensive numerical experiments show that restarting the solver when stagnation is detected is a cheap and effective way to achieve robustness, with performance superior to both sampling and regression techniques. We also present Py-BOBYQA, a Python implementation of BOBYQA (Powell, 2009), which likewise implements strategies for robustness to noise. Our numerical experiments show that Py-BOBYQA is comparable to or better than existing general DFO solvers on noisy problems. In our comparisons, we introduce a new adaptive accuracy measure for the data profiles of noisy functions that strikes a balance between measuring improvement in the true and in the noisy objective.
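The sketch below (not taken from the abstract) illustrates how the two solvers might be invoked in Python; it assumes the packages expose the module names dfols and pybobyqa with solve(objfun, x0) entry points, and uses the 2D Rosenbrock problem purely as an illustrative test case.

# Minimal usage sketch, under the assumptions stated above.
import numpy as np
import dfols      # assumed module name for the DFO-LS package
import pybobyqa   # assumed module name for the Py-BOBYQA package

# Residual vector for the 2D Rosenbrock problem:
# f(x) = r(x)^T r(x) with r(x) = [10*(x1 - x0^2), 1 - x0].
def rosenbrock_residuals(x):
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

x0 = np.array([-1.2, 1.0])

# DFO-LS works directly with the residual vector (Gauss-Newton-style models).
soln_ls = dfols.solve(rosenbrock_residuals, x0)
print(soln_ls)

# Py-BOBYQA minimizes a scalar objective, so pass the sum of squared residuals.
def rosenbrock_scalar(x):
    r = rosenbrock_residuals(x)
    return float(np.dot(r, r))

soln_bobyqa = pybobyqa.solve(rosenbrock_scalar, x0)
print(soln_bobyqa)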

Keywords: derivative-free optimization, nonlinear least-squares, trust region methods, stochastic optimization, mathematical software, performance evaluation.

Category 1: Nonlinear Optimization

Citation: Technical Report, Mathematical Institute, Oxford University, 2018.

Download: [PDF]

Entry Submitted: 03/30/2018
Entry Accepted: 03/31/2018
Entry Last Modified: 05/31/2018
