Optimization Online
A SECOND DERIVATIVE SQP METHOD WITH IMPOSED DESCENT

Nicholas I M Gould (N.I.M.Gould***at***rl.ac.uk)
Daniel P Robinson (daniel.robinson***at***comlab.ox.ac.uk)

Abstract: Sequential quadratic programming (SQP) methods form a class of highly efficient algorithms for solving nonlinearly constrained optimization problems. Although second derivative information may often be calculated, there is little practical theory that justifies exact-Hessian SQP methods. In particular, the resulting quadratic programming (QP) subproblems are often nonconvex, and thus finding their global solutions may be computationally nonviable. This paper presents a second-derivative S$\ell_1$QP method based on quadratic subproblems that are either convex, and thus may be solved efficiently, or need not be solved globally. Additionally, an explicit descent constraint is imposed on certain QP subproblems, which "guides" the iterates through areas in which nonconvexity is a concern. Global convergence of the resulting algorithm is established.
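For readers unfamiliar with the S$\ell_1$QP framework the abstract refers to, the merit function being reduced is the exact $\ell_1$ penalty function. The following is a minimal illustrative sketch of evaluating that penalty (the function names, the toy problem, and the standard-form split into equalities $c^E(x)=0$ and inequalities $c^I(x)\ge 0$ are our assumptions for illustration, not the paper's notation or algorithm):

```python
import numpy as np

def ell1_penalty(f, c_eq, c_ineq, x, rho):
    """Illustrative l1 exact penalty (not the paper's code):
        phi(x; rho) = f(x) + rho * ( sum_i |c_i^E(x)| + sum_j max(0, -c_j^I(x)) )
    for equality constraints c^E(x) = 0 and inequalities c^I(x) >= 0."""
    violation = np.sum(np.abs(c_eq(x))) + np.sum(np.maximum(0.0, -c_ineq(x)))
    return f(x) + rho * violation

# Toy problem: minimize x0^2 + x1^2  s.t.  x0 + x1 = 1,  x0 >= 0.
f = lambda x: float(x @ x)
c_eq = lambda x: np.array([x[0] + x[1] - 1.0])
c_ineq = lambda x: np.array([x[0]])

# At x = (0.25, 0.25) with rho = 10: f = 0.125, equality violation 0.5,
# inequality satisfied, so phi = 0.125 + 10 * 0.5 = 5.125.
phi = ell1_penalty(f, c_eq, c_ineq, np.array([0.25, 0.25]), rho=10.0)
```

An S$\ell_1$QP-type method decreases this nonsmooth merit function by solving QP subproblems; the paper's contribution concerns how those subproblems are constructed (convex, or not requiring a global solve) and the added descent constraint.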

Keywords: Nonlinear programming, nonlinear inequality constraints, sequential quadratic programming, $\ell_1$ penalty function, nonsmooth optimization

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Constrained Nonlinear Optimization )

Citation: University of Oxford Computing Laboratory, Technical Report 08/09, June 2008.

Download: [PDF]

Entry Submitted: 06/18/2008
Entry Accepted: 06/18/2008
Entry Last Modified: 06/18/2008
