Optimization Online


On the acceleration of the Barzilai-Borwein method

Yakui Huang (huangyakui2006***at***gmail.com)
Yu-Hong Dai (dyh***at***lsec.cc.ac.cn)
Xin-Wei Liu (mathlxw***at***hebut.edu.cn)
Hongchao Zhang (hozhang***at***math.lsu.edu)

Abstract: The Barzilai-Borwein (BB) gradient method is efficient for solving large-scale unconstrained problems to modest accuracy and has the advantage of being easily extended to solve a wide class of constrained optimization problems. In this paper, we propose a new stepsize that accelerates the BB method by requiring finite termination for minimizing a two-dimensional strongly convex quadratic function. Combining this new stepsize with the BB stepsizes, we develop gradient methods that adaptively alternate between nonmonotone BB stepsizes and certain monotone stepsizes for minimizing general strongly convex quadratic functions. Furthermore, by incorporating nonmonotone line searches and gradient projection techniques, we extend these new gradient methods to solve general smooth unconstrained and bound-constrained optimization problems. Extensive numerical experiments show that our strategy of properly inserting monotone gradient steps into the nonmonotone BB method can significantly improve its performance, and that the resulting new methods can outperform the most successful gradient descent methods developed in the recent literature.
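For context, the classical BB method that the paper accelerates uses a stepsize built from the previous step and gradient change. The sketch below implements the standard BB1 stepsize (not the paper's new stepsize) on a strongly convex quadratic f(x) = (1/2)x'Ax - b'x; the problem data and function names are illustrative, not from the paper.

```python
import numpy as np

def bb_gradient(A, b, x0, tol=1e-8, max_iter=500):
    """Classical Barzilai-Borwein gradient method for f(x) = 0.5*x'Ax - b'x.

    Uses the BB1 stepsize alpha_k = s's / s'y, where s = x_k - x_{k-1}
    and y = g_k - g_{k-1}. For a symmetric positive definite A, s'y > 0,
    so the stepsize is well defined.
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                          # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A, 2)     # conservative first stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g              # (nonmonotone) gradient step
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)          # BB1 stepsize for the next step
        x, g = x_new, g_new
    return x

# Illustrative 2-D strongly convex quadratic: minimizer solves A x = b
A = np.array([[3.0, 0.0], [0.0, 1.0]])
b = np.array([3.0, 2.0])
x_star = bb_gradient(A, b, np.zeros(2))
print(x_star)  # close to [1.0, 2.0]
```

On two-dimensional quadratics the BB method is known to perform particularly well, which is why the paper derives its new stepsize from a finite-termination requirement in exactly that setting.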

Keywords: Barzilai-Borwein method; gradient methods; finite termination; unconstrained optimization; bound constrained optimization; quadratic optimization

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization



Entry Submitted: 01/07/2020
Entry Accepted: 01/10/2020
Entry Last Modified: 11/29/2020
