Handling Nonpositive Curvature in a Limited Memory Steepest Descent Method
Frank E. Curtis (frank.e.curtis@gmail.com)
Abstract: We propose a limited memory steepest descent (LMSD) method for solving unconstrained optimization problems. As a steepest descent method, the step computation in each iteration requires only the evaluation of a gradient of the objective function and the calculation of a scalar step size. When employed to solve certain convex problems, our method reduces to a variant of the LMSD method proposed by Fletcher (2012, Math. Program., 135, 413–436); in particular, when the history length parameter is set to 1, it reduces to a steepest descent method inspired by that proposed by Barzilai & Borwein (1988, IMA J. Numer. Anal., 8, 141–148). However, our method is novel in that we propose new algorithmic features for cases when nonpositive curvature is encountered. That is, our method is particularly suited for solving nonconvex problems. With a nonmonotone line search, we ensure global convergence for a variant of our method. We also illustrate with numerical experiments that our approach often yields superior performance when employed to solve nonconvex problems.
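To make the setting concrete, the following is a minimal sketch (not the authors' algorithm) of a Barzilai-Borwein steepest descent iteration with a simple fallback step size when nonpositive curvature (s^T y <= 0) is detected; the function name, fallback value, and stopping tolerance are illustrative assumptions.

```python
def bb_gradient_descent(grad, x, max_iter=200, tol=1e-8, fallback=1e-2):
    # Illustrative sketch: BB1 step sizes with a fixed fallback step
    # whenever the curvature estimate s^T y is nonpositive.
    g = grad(x)
    alpha = fallback  # initial step size (an assumption, not prescribed here)
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break  # gradient norm small enough; stop
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]  # displacement
        y = [a - b for a, b in zip(g_new, g)]  # gradient change
        sty = sum(a * b for a, b in zip(s, y))
        sts = sum(a * a for a in s)
        # BB1 step size alpha = s^T s / s^T y is meaningful only when
        # positive curvature (s^T y > 0) is observed; otherwise fall back.
        alpha = sts / sty if sty > 0 else fallback
        x, g = x_new, g_new
    return x
```

On a convex quadratic such as f(x) = 0.5*(x1^2 + 10*x2^2), this iteration drives the gradient toward zero; handling the nonconvex case more carefully is exactly the subject of the paper.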
Keywords: unconstrained optimization, nonconvex optimization, steepest descent methods, Barzilai-Borwein methods, limited memory methods
Category 1: Nonlinear Optimization
Category 2: Nonlinear Optimization (Unconstrained Optimization)
Citation: F. E. Curtis and W. Guo, "Handling Nonpositive Curvature in a Limited Memory Steepest Descent Method," IMA Journal of Numerical Analysis, DOI: 10.1093/imanum/drv034, 2015.
Entry Submitted: 11/07/2014