Reducing the Number of Function Evaluations in Mesh Adaptive Direct Search Algorithms
Charles Audet (charles.Audet@gerad.ca)
Abstract: The mesh adaptive direct search (MADS) class of algorithms is designed for nonsmooth optimization, where the objective function and constraints are typically computed by launching a time-consuming computer simulation. Each iteration of a MADS algorithm attempts to improve the current best-known solution by launching the simulation at a finite number of trial points. Common implementations of MADS generate 2n trial points at each iteration, where n is the number of variables in the optimization problem. The objective of the present work is to dynamically reduce that number. We present an algorithmic framework that reduces the number of simulations to exactly n+1, without impacting the theoretical guarantees from the convergence analysis. Numerical experiments are conducted for several different contexts; the results suggest that these strategies allow the new algorithms to reach a better solution with fewer function evaluations.
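To illustrate the difference between the two poll-set sizes mentioned in the abstract, here is a minimal sketch (not the authors' implementation) contrasting a maximal positive basis of 2n coordinate directions with a minimal positive basis of n+1 directions; the function names and the specific choice of minimal basis (the n coordinate directions plus their negated sum) are illustrative assumptions, shown only to make the counting concrete:

```python
import numpy as np

def poll_directions_2n(n):
    """Maximal positive basis: the 2n coordinate directions +/- e_i."""
    identity = np.eye(n, dtype=int)
    return np.vstack([identity, -identity])

def poll_directions_n_plus_1(n):
    """One common minimal positive basis: e_1, ..., e_n plus -sum(e_i).

    Any nonnegative combination of these n+1 vectors can reach every
    direction in R^n, so a poll step needs only n+1 trial points.
    """
    identity = np.eye(n, dtype=int)
    return np.vstack([identity, -np.ones((1, n), dtype=int)])

def trial_points(x, delta, directions):
    """Trial points obtained by stepping from x along each poll direction."""
    return x + delta * directions
```

For n = 3, the first set yields 6 trial points per poll step while the second yields only 4, which is the reduction (2n down to n+1 simulation calls per iteration) that the paper studies.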
Keywords: Mesh Adaptive Direct Search (MADS) algorithms, derivative-free optimization, positive spanning sets, nonsmooth optimization.
Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)
Category 2: Nonlinear Optimization (Constrained Nonlinear Optimization)
Citation: SIAM Journal on Optimization, 24(2), pp. 621–642, 2014.
Entry Submitted: 10/12/2012