Optimization Online


A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search

Yu-Hong Dai(dyh***at***lsec.cc.ac.cn)
Cai-Xia Kou(koucx***at***lsec.cc.ac.cn)

Abstract: In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which avoids a numerical drawback of the standard Wolfe line search and guarantees the global convergence of the conjugate gradient method under mild conditions. To accelerate the algorithm, we develop an adaptive strategy for choosing the initial stepsize and introduce dynamic restarts along the negative gradient, based on how close the objective function is to some quadratic function over the preceding iterations. Numerical experiments on the CUTEr collection show that the proposed algorithm is promising.
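To make the general framework concrete, the following is a minimal sketch of a nonlinear conjugate gradient loop of the kind the abstract describes. It is not the authors' algorithm: the beta update shown is the well-known Hager–Zhang formula, used here only as an illustrative stand-in for the proposed CG family, and SciPy's standard Wolfe line search replaces the paper's improved one. The restart along the negative gradient on line-search failure loosely mirrors the restart idea mentioned above.

```python
# Hedged sketch of a generic nonlinear CG method, NOT the Dai-Kou algorithm.
# Assumptions: Hager-Zhang beta stands in for the paper's CG family, and
# scipy's standard Wolfe line search stands in for the improved one.
import numpy as np
from scipy.optimize import line_search

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimize f starting from x0 using a CG direction and a Wolfe search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                   # search failed: restart along -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference y_k
        dy = d @ y
        if abs(dy) > 1e-12:
            # Hager-Zhang beta: an illustrative member of this class of methods
            beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
        else:
            beta = 0.0                      # degenerate case: restart
        d = -g_new + beta * d               # new conjugate gradient direction
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, such a loop reduces to (inexact) linear CG and converges in a handful of iterations; on general smooth functions, the choice of beta and of the line-search conditions is exactly what distinguishes methods like the one proposed here.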

Keywords: conjugate gradient method, memoryless BFGS method, unconstrained optimization, global convergence, Wolfe line search

Category 1: Nonlinear Optimization (Unconstrained Optimization)

Citation: Report, AMSS, Chinese Academy of Sciences, Beijing, October, 2010.

Entry Submitted: 06/28/2011
Entry Accepted: 06/28/2011
Entry Last Modified: 06/28/2011
