Optimization Online

Reducing the Number of Function Evaluations in Mesh Adaptive Direct Search Algorithms

Charles Audet (charles.Audet***at***gerad.ca)
Andrea Ianni (10ianni***at***gmail.com)
S├ębastien Le Digabel (sebastien.Le.Digabel***at***gerad.ca)
Christophe Tribes (christophe.tribes***at***polymtl.ca)

Abstract: The mesh adaptive direct search (MADS) class of algorithms is designed for nonsmooth optimization, where the objective function and constraints are typically computed by launching a time-consuming computer simulation. Each iteration of a MADS algorithm attempts to improve the current best-known solution by launching the simulation at a finite number of trial points. Common implementations of MADS generate 2n trial points at each iteration, where n is the number of variables in the optimization problem. The objective of the present work is to dynamically reduce that number. We present an algorithmic framework that reduces the number of trial points per iteration to exactly n+1, without impacting the theoretical guarantees of the convergence analysis. Numerical experiments are conducted in several different contexts; the results suggest that the proposed reduction strategies allow the new algorithms to reach a better solution with fewer function evaluations.
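
To make the 2n versus n+1 direction counts concrete, the following Python sketch compares an opportunistic poll step over two positive spanning sets: the maximal positive basis {+e_1, ..., +e_n, -e_1, ..., -e_n} with 2n directions, and a minimal positive basis {e_1, ..., e_n, -(e_1 + ... + e_n)} with n+1 directions. All names here are our own, and the toy objective stands in for the simulation; this is an illustration of the direction-count idea only, not the paper's MADS implementation, whose direction sets are mesh-dependent.

import numpy as np

def maximal_basis(n):
    # 2n poll directions: +e_i and -e_i for each coordinate (maximal positive basis).
    identity = np.eye(n)
    return np.vstack([identity, -identity])            # shape (2n, n)

def minimal_basis(n):
    # n+1 poll directions: the n coordinate directions plus their negated sum,
    # a minimal positive basis that still positively spans R^n.
    identity = np.eye(n)
    return np.vstack([identity, -np.ones((1, n))])     # shape (n+1, n)

def opportunistic_poll(f, x, delta, directions):
    # Evaluate trial points x + delta*d in order; stop at the first improvement.
    # Returns the (possibly unchanged) incumbent and the evaluation count.
    f_incumbent = f(x)
    for count, d in enumerate(directions, start=1):
        trial = x + delta * d
        if f(trial) < f_incumbent:
            return trial, count        # poll success after `count` evaluations
    return x, len(directions)          # poll failure: every point was evaluated

if __name__ == "__main__":
    f = lambda x: float(np.sum(x ** 2))   # toy objective standing in for a simulation
    x0 = np.array([1.0, -2.0, 0.5])
    for basis in (maximal_basis, minimal_basis):
        x1, n_eval = opportunistic_poll(f, x0, 0.25, basis(x0.size))
        print(f"{basis.__name__}: {n_eval} evaluations, f = {f(x1):.4f}")

The worst case is an unsuccessful poll, where every trial point must be evaluated: the minimal basis then costs n+1 simulations instead of 2n, which is the per-iteration saving the paper targets.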

Keywords: Mesh Adaptive Direct Search (MADS) algorithms, derivative-free optimization, positive spanning sets, nonsmooth optimization.

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Category 2: Nonlinear Optimization (Constrained Nonlinear Optimization)

Citation: SIAM Journal on Optimization, 24(2), pp. 621-642, 2014.

Entry Submitted: 10/12/2012
Entry Accepted: 10/13/2012
Entry Last Modified: 04/04/2014
