Optimization Online

A family of spectral gradient methods for optimization

Yu-Hong Dai (dyh***at***lsec.cc.ac.cn)
Yakui Huang (huangyakui2006***at***gmail.com)
Xin-Wei Liu (optim2008***at***163.com)

Abstract: We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the short Barzilai-Borwein (BB) stepsize and the long BB stepsize. It is shown that each member of the family shares a certain quasi-Newton property in the sense of least squares. The family also includes several other gradient methods as special cases. We prove that the family of methods is $R$-superlinearly convergent for two-dimensional strictly convex quadratics. Moreover, the family is $R$-linearly convergent in the $n$-dimensional case. Numerical results for the family under different settings are presented and demonstrate that the proposed family is promising.
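
The stepsize rule described in the abstract can be sketched compactly. The following is a minimal, illustrative Python sketch, not the authors' code: the fixed weight `tau`, the initial stepsize, the stopping rule, and the test problem are assumptions made for this example, whereas the paper's family corresponds to the convex combination of the short and long BB stepsizes (possibly with an adaptively chosen weight).

```python
# Illustrative sketch of a spectral gradient method on a strictly convex
# quadratic f(x) = 0.5 x^T A x - b^T x, with the stepsize taken as a convex
# combination of the long (BB1) and short (BB2) Barzilai-Borwein stepsizes.
import numpy as np

def spectral_gradient(A, b, x0, tau=0.5, tol=1e-8, max_iter=1000):
    x = x0.copy()
    g = A @ x - b                      # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(g)    # simple first stepsize (assumption)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 0:                     # curvature condition holds
            bb_long = (s @ s) / sy     # long BB stepsize (BB1)
            bb_short = sy / (y @ y)    # short BB stepsize (BB2)
            # convex combination with weight tau (tau = 1 gives BB2, 0 gives BB1)
            alpha = tau * bb_short + (1.0 - tau) * bb_long
        x, g = x_new, g_new
    return x

# Example on a small ill-conditioned quadratic (hypothetical test problem)
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_star = spectral_gradient(A, b, x0=np.zeros(3))
```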

Keywords: unconstrained optimization; steepest descent method; spectral gradient method; $R$-linear convergence; $R$-superlinear convergence

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Category 2: Nonlinear Optimization (Unconstrained Optimization)

Citation:

Download: [PDF]

Entry Submitted: 05/18/2018
Entry Accepted: 05/18/2018
Entry Last Modified: 12/07/2018
