Optimization Online

A merit function approach for direct search

S. Gratton(serge.gratton***at***enseeiht.fr)
L.N. Vicente(lnv***at***mat.uc.pt)

Abstract: In this paper we propose to equip direct-search methods with a general procedure to minimize an objective function, possibly non-smooth, without using derivatives and subject to constraints on the variables. The constraints considered are most likely nonlinear or non-smooth, and the derivatives of the corresponding functions are also unavailable. The novelty of this contribution lies mostly in how relaxable constraints are handled. Such constraints, which can be relaxed during the course of the optimization, are taken care of by a merit function and, if necessary, by a restoration procedure. Unrelaxable constraints, when present, are treated by an extreme barrier approach. We show that the resulting merit function direct-search algorithm exhibits global convergence properties to first-order stationary points. As in the progressive barrier method [2], we provide a mechanism to indicate the transfer of constraints from the relaxable set to the unrelaxable one.
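
The abstract couples two mechanisms: an extreme barrier that rejects any trial point violating an unrelaxable constraint, and a merit function (objective plus a penalty parameter times the relaxable-constraint violation) that governs acceptance in the poll step. The Python sketch below is only a minimal illustration of how such a combination could look in a basic coordinate-polling loop; it is not the algorithm of the paper (in particular, it omits the restoration procedure, the penalty-parameter updates, and the transfer of constraints between the two sets). The name direct_search_merit, the fixed penalty parameter mu, the coordinate polling directions, and the alpha**2 sufficient-decrease test are illustrative assumptions.

import numpy as np

def direct_search_merit(f, relaxable, unrelaxable, x0,
                        alpha=1.0, mu=10.0, tol=1e-6, max_iter=500):
    """Minimal direct-search sketch (illustrative only).

    f            objective, R^n -> R
    relaxable    functions c_i with c_i(x) <= 0 desired (may be violated en route)
    unrelaxable  functions g_j with g_j(x) <= 0 required at every iterate
    """
    def merit(x):
        # Extreme barrier: any violation of an unrelaxable constraint is rejected.
        if any(g(x) > 0 for g in unrelaxable):
            return np.inf
        # Relaxable constraints enter through a penalty on their violation.
        violation = sum(max(0.0, c(x)) for c in relaxable)
        return f(x) + mu * violation

    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        if alpha < tol:
            break
        improved = False
        # Poll the positive and negative coordinate directions
        # (random polling directions could be used instead).
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            trial = x + alpha * d
            # Accept only on sufficient decrease of the merit function.
            if merit(trial) < merit(x) - alpha**2:
                x, improved = trial, True
                break
        alpha = 2.0 * alpha if improved else 0.5 * alpha
    return x

# Usage: minimize (x1-2)^2 + (x2-2)^2 with relaxable x1 + x2 <= 3
# and unrelaxable bounds x1, x2 >= 0; the minimizer is near (1.5, 1.5).
sol = direct_search_merit(
    f=lambda x: (x[0] - 2) ** 2 + (x[1] - 2) ** 2,
    relaxable=[lambda x: x[0] + x[1] - 3],
    unrelaxable=[lambda x: -x[0], lambda x: -x[1]],
    x0=[0.0, 0.0],
)
print(sol)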

Keywords: Derivative-free optimization, direct-search methods, constraints, merit function, penalty parameter, random directions, non-smooth optimization.

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization

Citation: Preprint 13-08, Dept. Mathematics, Univ. Coimbra.

Download: [PDF]

Entry Submitted: 04/23/2013
Entry Accepted: 04/23/2013
Entry Last Modified: 04/23/2013
