Optimization Online


A Derivative-Free Algorithm for Least-Squares Minimization

Hongchao Zhang(hozhang***at***math.lsu.edu)
Andrew Conn(arconn***at***us.ibm.com)
Katya Scheinberg(katyascheinberg***at***gmail.com)

Abstract: We develop a framework for a class of derivative-free algorithms for the least-squares minimization problem. These algorithms are based on polynomial interpolation models and are designed to take advantage of the problem structure. Under suitable conditions, we establish the global convergence and local quadratic convergence properties of these algorithms. Promising numerical results indicate that the algorithm is efficient and robust when finding both low-accuracy and high-accuracy solutions. Comparisons are made with standard derivative-free software packages that do not exploit the special structure of the least-squares problem or that use finite differences to approximate the gradients.
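To illustrate the flavor of the approach described in the abstract (this is a minimal sketch, not the authors' algorithm): each component of the residual vector can be modeled by linear interpolation through nearby sample points, and the resulting model Jacobian can then drive a Levenberg-Marquardt step, solving (JᵀJ + μI) p = -Jᵀr. The function names and the fixed damping parameter `mu` below are illustrative assumptions; the paper's framework uses polynomial interpolation models inside a trust-region mechanism with convergence guarantees that this sketch does not attempt to reproduce.

```python
import numpy as np

def interp_jacobian(r, x, pts):
    """Estimate the Jacobian of the residual map r at x by linear
    interpolation through a set of nearby sample points (the sample
    set need not be aligned with the coordinate axes)."""
    S = np.array([p - x for p in pts])        # displacements, shape (k, n)
    D = np.array([r(p) - r(x) for p in pts])  # residual changes, shape (k, m)
    # Fit J so that r(p) - r(x) ≈ J (p - x): solve S @ J.T ≈ D.
    Jt, *_ = np.linalg.lstsq(S, D, rcond=None)
    return Jt.T

def lm_step(r, x, pts, mu):
    """One Levenberg-Marquardt step with an interpolation-based Jacobian:
    solve (J^T J + mu I) p = -J^T r(x) and return the trial point x + p."""
    J = interp_jacobian(r, x, pts)
    rx = r(x)
    p = np.linalg.solve(J.T @ J + mu * np.eye(x.size), -J.T @ rx)
    return x + p
```

For example, on the small nonlinear system r(x) = (x₀² − 1, x₁ − x₀), repeatedly recentering a fixed sample geometry at the current iterate and taking `lm_step` drives the residual toward zero; a real derivative-free method would additionally manage the geometry of the sample set and the trust-region radius.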

Keywords: Derivative-Free Optimization, Least-squares, Trust Region, Levenberg-Marquardt Method, System of Nonlinear Equations, Global Convergence, Local Convergence

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Nonlinear Systems and Least-Squares)

Citation: Technical Report, Department of Mathematics, Louisiana State University, April 6th, 2009

Download: [PDF]

Entry Submitted: 04/06/2009
Entry Accepted: 04/06/2009
Entry Last Modified: 04/06/2009
