Optimization Online


A view of algorithms for optimization without derivatives

M.J.D. Powell(mjdp***at***cam.ac.uk)

Abstract: Let the least value of a function of many variables be required. If its gradient is available, then one can tell whether search directions are downhill, and first order conditions help to identify the solution. It seems in practice, however, that the vast majority of unconstrained calculations do not employ any derivatives. A view of this situation is given, attention being restricted to methods that are suitable for noisy functions, and that change the variables in ways that are not random. Particular topics include the McKinnon (1998) example of the failure of the Nelder and Mead (1965) simplex method, some convergence properties of pattern search algorithms, and my own recent research on using quadratic models of the objective function. We find that least values of functions of more than 100 variables can be calculated.
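The pattern search algorithms whose convergence properties the abstract mentions can be illustrated by a minimal compass (coordinate) search: poll the objective at plus and minus a step along each coordinate axis, move to any improving point, and halve the step when a full poll fails. This is a generic sketch of the technique, not code from the paper; the test function and starting point are illustrative.

```python
def compass_search(f, x0, step=1.0, tol=1e-8, max_iter=10000):
    """Derivative-free compass search: poll +/- step along each axis,
    halving the step whenever no poll point improves the incumbent."""
    x = list(x0)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        improved = False
        for i in range(n):
            for s in (step, -step):
                y = list(x)
                y[i] += s
                fy = f(y)
                if fy < fx:          # accept the first improving poll point
                    x, fx = y, fy
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5              # refine the mesh when the poll fails
            if step < tol:
                break                # step size below tolerance: stop
    return x, fx

if __name__ == "__main__":
    # Illustrative smooth objective with minimizer (1, -2).
    f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
    x, fx = compass_search(f, [0.0, 0.0])
    print(x, fx)
```

Because the polling directions span the space, such methods converge on smooth functions without any gradient information, which is the setting the abstract restricts attention to.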

Keywords: Pattern searches, Quadratic models, Simplex method, Unconstrained minimization, Without derivatives

Category 1: Nonlinear Optimization (Unconstrained Optimization)

Citation: Report DAMTP 2007/NA03. CMS, University of Cambridge, Cambridge CB3 0WA, UK. April, 2007.

Download: [PDF]

Entry Submitted: 06/12/2007
Entry Accepted: 06/12/2007
Entry Last Modified: 06/12/2007

