Optimization Online


A globally convergent modified conjugate-gradient line-search algorithm with inertia controlling

Wenwen Zhou (wenwen.zhou***at***sas.com)
Joshua D. Griffin (joshua.griffin***at***sas.com)
Ioannis G. Akrotirianakis (ioannis.akrotirianakis***at***sas.com)

Abstract: In this paper we address the problem of unboundedness in the search direction when the Hessian is indefinite or nearly singular. We propose a new algorithm that naturally handles singular Hessian matrices and is theoretically equivalent to the trust-region approach. This is accomplished by adaptively performing explicit matrix modifications that mimic the implicit modifications used by trust-region methods. We further provide a new variant of the modified conjugate gradient algorithm that implements this strategy in a robust and efficient way. Numerical results demonstrate the effectiveness of this approach in the context of a line-search method for large-scale unconstrained nonconvex optimization.
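To illustrate the general idea described in the abstract (not the authors' specific algorithm, whose details are in the paper), the following is a minimal sketch of a conjugate-gradient solve that reacts to indefinite or near-singular curvature by applying an explicit diagonal shift and restarting, mimicking the implicit regularization of a trust-region step. The function name, the shift-update rule, and the tolerances are illustrative assumptions.

```python
import numpy as np

def modified_cg(H, g, tol=1e-8, max_iter=200):
    """Illustrative modified CG for computing a descent direction d.

    Solves (H + sigma*I) d = -g by CG. If a search direction with
    nonpositive (or near-zero) curvature p^T (H + sigma*I) p is
    encountered, the shift sigma is increased and the CG iteration
    restarts on the explicitly modified system. This is a generic
    sketch of the strategy, not the algorithm of the paper.
    """
    n = g.size
    sigma = 0.0          # current diagonal shift (illustrative update rule)
    eps = 1e-12
    while True:
        d = np.zeros(n)
        r = -g.copy()    # residual of (H + sigma*I) d = -g at d = 0
        p = r.copy()
        rs = r @ r
        restart = False
        for _ in range(max_iter):
            Hp = H @ p + sigma * p
            curv = p @ Hp
            if curv <= eps * (p @ p):
                # Indefinite or near-singular curvature detected:
                # enlarge the shift and restart on the modified Hessian.
                sigma = max(2.0 * sigma, 1e-4)
                restart = True
                break
            alpha = rs / curv
            d += alpha * p
            r -= alpha * Hp
            rs_new = r @ r
            if np.sqrt(rs_new) <= tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        if not restart:
            return d, sigma
```

Because the shift is applied explicitly, the returned direction satisfies a modified Newton system with a positive definite coefficient matrix along the explored subspace, so it is a descent direction for the line search even when H itself is indefinite.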

Keywords: nonlinear programming, unconstrained optimization, trust region methods, conjugate gradient method

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Unconstrained Optimization)

Citation: Technical Report 2009-01, SAS Institute Inc., 100 SAS Campus Dr, Cary, NC 27513 USA.

Download: [PDF]

Entry Submitted: 09/23/2009
Entry Accepted: 09/23/2009
Entry Last Modified: 10/01/2009
