Optimization Online


New Adaptive Stepsize Selections in Gradient Methods

G. Frassoldati (giacomofrassoldati***at***gmail.com)
L. Zanni (zanni.luca***at***unimore.it)
G. Zanghirati (g.zanghirati***at***unife.it)

Abstract: This paper deals with gradient methods for minimizing n-dimensional strictly convex quadratic functions. Two new adaptive stepsize selection rules are presented and some of their key properties are proved. Practical insight into the effectiveness of the proposed techniques is provided by a numerical comparison with the Barzilai-Borwein (BB) method, the cyclic and adaptive BB methods, and two recent monotone gradient methods.
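The adaptive rules proposed in the paper are not reproduced in this entry, but the Barzilai-Borwein (BB) method used as the baseline in the numerical comparison can be sketched for the problem class the abstract describes. The following is a minimal illustration, not the paper's algorithm: it applies the classical BB1 stepsize to a strictly convex quadratic f(x) = (1/2) x^T A x - b^T x with A symmetric positive definite; the test matrix, tolerance, and initial steplength are illustrative choices.

```python
import numpy as np

def bb_gradient(A, b, x0, max_iter=500, tol=1e-8):
    """Barzilai-Borwein (BB1) gradient method for the strictly convex
    quadratic f(x) = 0.5 x^T A x - b^T x, with A symmetric positive definite.
    Minimizing f is equivalent to solving the linear system A x = b."""
    x = x0.astype(float)
    g = A @ x - b                       # gradient of f at x
    alpha = 1.0 / np.linalg.norm(g)     # initial steplength (a common heuristic)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # BB1 stepsize: alpha = s^T s / s^T y; on a quadratic, y = A s,
        # so the denominator s^T A s is positive whenever s is nonzero.
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x

# Illustrative example: a 3-dimensional ill-conditioned quadratic.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x_star = bb_gradient(A, b, np.zeros(3))
```

On strictly convex quadratics the BB method is nonmonotone in the objective value yet globally convergent, which is what makes it the standard benchmark for the stepsize rules compared in the paper.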

Keywords: unconstrained optimization, strictly convex quadratics, gradient methods, adaptive stepsize selections.

Category 1: Nonlinear Optimization (Unconstrained Optimization )

Citation: Journal of Industrial and Management Optimization (2007), to appear. (Formerly Technical Report n. 77, Department of Pure and Applied Mathematics, University of Modena and Reggio Emilia, Modena (Italy), January 2007)

Entry Submitted: 01/31/2007
Entry Accepted: 01/31/2007
Entry Last Modified: 11/16/2007
