A Nonlinear Conjugate Gradient Algorithm with An Optimal Property and An Improved Wolfe Line Search
Abstract: In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which avoids a numerical drawback of the standard Wolfe line search and guarantees the global convergence of the conjugate gradient method under mild conditions. To accelerate the algorithm, we develop an adaptive strategy for choosing the initial stepsize and introduce dynamic restarts along the negative gradient, based on how close the objective function is to a quadratic model over the preceding iterations. Numerical experiments with the CUTEr collection show that the proposed algorithm is promising.
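To make the ingredients of the abstract concrete, the following is a minimal sketch of a generic nonlinear conjugate gradient iteration with a Wolfe line search and restarts along the negative gradient. It is not the paper's method: the beta formula shown is the standard Polak-Ribiere+ choice rather than the proposed scaled-memoryless-BFGS-based family, and the line search enforces the ordinary Wolfe conditions rather than the improved variant; all function and parameter names are illustrative.

```python
import numpy as np

def wolfe_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha0=1.0, max_iter=50):
    """Bisection-style search for a stepsize satisfying the standard
    Wolfe conditions (sufficient decrease + curvature)."""
    phi0 = f(x)
    dphi0 = grad(x) @ d          # directional derivative at alpha = 0
    lo, hi = 0.0, np.inf
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) > phi0 + c1 * alpha * dphi0:
            hi = alpha           # sufficient decrease fails: shrink step
        elif grad(x + alpha * d) @ d < c2 * dphi0:
            lo = alpha           # curvature condition fails: grow step
        else:
            return alpha         # both Wolfe conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Nonlinear CG with the Polak-Ribiere+ beta (illustrative stand-in
    for the paper's family) and restarts along -g when the search
    direction fails to be a descent direction."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ formula
        d = -g_new + beta * d
        if d @ g_new >= 0:       # not a descent direction: restart along -g
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic this iteration reduces the gradient norm to the tolerance in a handful of steps; the dynamic restarts and adaptive initial stepsize described in the abstract would replace the simple `alpha0 = 1.0` and descent-direction checks above.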
Keywords: conjugate gradient method, memoryless BFGS method, unconstrained optimization, global convergence, Wolfe line search
Category 1: Nonlinear Optimization (Unconstrained Optimization )
Citation: Report, AMSS, Chinese Academy of Sciences, Beijing, October, 2010.
Entry Submitted: 06/28/2011