Optimization Online


A unified convergence theory for nonmonotone Direct Search Methods (DSMs), with extensions to DFO with mixed and categorical variables

Ubaldo Garcia Palomares (ubaldo***at***gti.uvigo.es)

Abstract: This paper presents a unified convergence theory for nonmonotone Direct Search Methods (DSMs), which embraces several algorithms that have been proposed for the solution of unconstrained and box-constrained models. This paper shows that these models can be theoretically solved with the same methodology and under the same weak assumptions. All proofs follow a common pattern, and the monotone counterparts are merely special cases. The theory holds regardless of the absence of first derivatives and is extended to models with discrete and/or categorical variables. It is proved that a Clarke stationary point is obtained with probability one, with a finite number of search directions at all iterations. The key to convergence rests upon a judicious choice of a random set of search directions, which we assume to contain a subset of orthogonal directions, because a simple convergence proof can then be given for both smooth and nonsmooth functions.
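The abstract describes polling along a random set of search directions containing an orthogonal subset, together with a nonmonotone acceptance rule. The paper itself is not reproduced here, so the following is only a minimal illustrative sketch of that general idea, not the authors' algorithm: it polls the ± columns of a random orthogonal matrix and accepts a trial point if it improves on the worst of the last few accepted values. All names, the memory length, and the sufficient-decrease constant are illustrative choices.

```python
import numpy as np

def random_orthogonal_directions(n, rng):
    # QR factorization of a Gaussian matrix yields a random orthogonal basis;
    # taking +/- its columns gives a positive spanning set of 2n directions.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return np.hstack([Q, -Q]).T

def nonmonotone_direct_search(f, x, step=1.0, tol=1e-8,
                              max_iter=1000, memory=5, seed=0):
    """Illustrative nonmonotone direct search sketch (not the paper's method).

    A trial point is accepted if it improves on the *worst* of the last
    `memory` accepted values, rather than on the current value as in a
    monotone DSM.
    """
    rng = np.random.default_rng(seed)
    history = [f(x)]
    for _ in range(max_iter):
        if step < tol:
            break
        ref = max(history[-memory:])        # nonmonotone reference value
        improved = False
        for d in random_orthogonal_directions(len(x), rng):
            trial = x + step * d
            ft = f(trial)
            if ft < ref - 1e-4 * step**2:   # sufficient-decrease test
                x, improved = trial, True
                history.append(ft)
                break
        # expand the step on success, contract it after a failed poll
        step = step * 2.0 if improved else step / 2.0
    return x, history[-1]
```

On a smooth convex test function such as `f(x) = ||x - x*||^2`, the iterates stall only when every poll direction fails, which for this sketch forces the iterate within a multiple of the step size of the minimizer.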

Keywords: Nonmonotone. Direct Search Methods. Mixed and Categorical Variables. Derivative Free.

Category 1: Nonlinear Optimization (Bound-constrained Optimization )

Citation: Unpublished. Internal report 2019. Universidade de Vigo, Information Technology Group. Submitted for publication in October 2019.

Download: [PDF]

Entry Submitted: 10/12/2019
Entry Accepted: 10/12/2019
Entry Last Modified: 10/22/2019

