Optimization Online


A Pattern Search Filter Method for Nonlinear Programming without Derivatives

Charles Audet (charlesa***at***crt.umontreal.ca)
J.E. Dennis, Jr. (dennis***at***caam.rice.edu)

Abstract: This paper presents and analyzes a pattern search method for general constrained optimization based on filter methods for step acceptance. Roughly, a filter method accepts a step that either improves the objective function value or the value of some function that measures the constraint violation. The new algorithm does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. It reduces trivially to the Torczon GPS (generalized pattern search) algorithm when there are no constraints, and indeed, it is formulated here to reduce to the version of GPS designed to handle finitely many linear constraints if they are treated explicitly. A key feature is that it preserves the useful division into SEARCH and POLL steps. Assuming local smoothness, the algorithm produces a KKT point for a problem related to the original problem.
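The filter acceptance rule described above can be illustrated with a minimal sketch (not the authors' code; the function names and the two-entry example filter are hypothetical). A trial point is summarized by a pair (f, h), where f is the objective value and h is the constraint-violation measure, and it is accepted when no stored filter entry is at least as good in both components:

```python
def dominates(a, b):
    """Entry a = (f_a, h_a) dominates b = (f_b, h_b) if it is at least as
    good in both the objective and the constraint-violation measure."""
    return a[0] <= b[0] and a[1] <= b[1]

def is_acceptable(trial, filter_entries):
    """A trial (f, h) pair is acceptable when no filter entry dominates it."""
    return not any(dominates(entry, trial) for entry in filter_entries)

def add_to_filter(trial, filter_entries):
    """Accept the trial point, discarding any entries it now dominates."""
    return [e for e in filter_entries if not dominates(trial, e)] + [trial]

# Hypothetical filter holding (f, h) pairs from previously accepted points.
filt = [(5.0, 0.0), (3.0, 2.0)]
print(is_acceptable((4.0, 1.0), filt))  # undominated: a new f/h trade-off
print(is_acceptable((6.0, 3.0), filt))  # dominated by (5.0, 0.0)
```

This dominance test captures the "improve the objective or improve feasibility" behavior: a step that worsens the objective can still be accepted if it sufficiently reduces the constraint violation, which is what lets the method dispense with penalty constants and Lagrange multipliers.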

Keywords: Pattern search algorithm, filter algorithm, surrogate-based optimization, derivative-free convergence analysis, constrained optimization, nonlinear programming.

Category 1: Nonlinear Optimization (Constrained Nonlinear Optimization )

Category 2: Convex and Nonsmooth Optimization (Nonsmooth Optimization )

Citation: Technical report 00-09, Department of Computational and Applied Mathematics, Rice University, Houston TX, March 2000.

Download: [Postscript]

Entry Submitted: 03/16/2001
Entry Accepted: 03/16/2001
Entry Last Modified: 03/16/2001
