Optimization Online

Global Convergence of General Derivative-Free Trust-Region Algorithms to First and Second Order Critical Points

A. R. Conn (arconn***at***us.ibm.com)
K. Scheinberg (katyas***at***us.ibm.com)
L. N. Vicente (lnv***at***mat.uc.pt)

Abstract: In this paper we prove global convergence to first- and second-order stationary points of a class of derivative-free trust-region methods for unconstrained optimization. These methods are based on the sequential minimization of linear or quadratic models built from evaluating the objective function at sample sets. The derivative-free models are required to satisfy Taylor-type bounds but, apart from that, the analysis is independent of the sampling techniques. A number of new issues are addressed, including global convergence when acceptance of iterates is based on simple decrease of the objective function, trust-region radius maintenance at the criticality step, and global convergence for second-order critical points.
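The class of methods analyzed in the paper is general; purely as an illustration of the kind of iteration involved (and not the authors' algorithm), the sketch below builds a linear model of the objective from sample values taken along the coordinate directions, minimizes it over the trust region, and accepts or rejects the step by comparing actual to predicted reduction. The function name dfo_trust_region, the coordinate-sampling scheme, and all parameter values are assumptions made for this example only.

    import numpy as np

    def dfo_trust_region(f, x0, delta0=1.0, delta_min=1e-8, eta=1e-4, max_iter=200):
        """Minimal sketch of a derivative-free trust-region iteration (illustrative only).

        A linear model m(x + s) = f(x) + g^T s is built from sample values of f,
        then minimized over the ball ||s|| <= delta.  Acceptance uses the usual
        ratio of actual to predicted reduction; the radius is updated accordingly.
        """
        x = np.asarray(x0, dtype=float)
        n = x.size
        delta = delta0
        fx = f(x)
        for _ in range(max_iter):
            if delta < delta_min:
                break
            # Sample f along the coordinate directions to build a linear model;
            # the paper allows far more general sample sets and quadratic models.
            g = np.array([(f(x + delta * e) - fx) / delta for e in np.eye(n)])
            gnorm = np.linalg.norm(g)
            if gnorm == 0.0:
                break
            # Minimizer of the linear model over the trust region of radius delta.
            s = -delta * g / gnorm
            fs = f(x + s)
            pred = delta * gnorm            # predicted reduction of the model
            ared = fx - fs                  # actual reduction of the objective
            if ared / pred >= eta:          # sufficient decrease: accept and expand
                x, fx = x + s, fs
                delta *= 2.0
            else:                           # reject the step and shrink the region
                delta *= 0.5
        return x, fx

    if __name__ == "__main__":
        # Hypothetical usage on the Rosenbrock function.
        rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
        xmin, fmin = dfo_trust_region(rosen, np.array([-1.2, 1.0]))
        print(xmin, fmin)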

Keywords: trust-region methods, derivative-free optimization, nonlinear optimization, global convergence

Category 1: Nonlinear Optimization (Unconstrained Optimization)

Citation: Preprint 06-49, Department of Mathematics, University of Coimbra, Portugal, October 2006

Download: [PDF]

Entry Submitted: 10/26/2006
Entry Accepted: 10/26/2006
Entry Last Modified: 10/26/2006
