New Adaptive Stepsize Selections in Gradient Methods
G. Frassoldati (giacomofrassoldati@gmail.com)
Abstract: This paper deals with gradient methods for minimizing n-dimensional strictly convex quadratic functions. Two new adaptive stepsize selection rules are presented, and some of their key properties are proved. Practical insights into the effectiveness of the proposed techniques are given by a numerical comparison with the Barzilai-Borwein (BB) method, the cyclic and adaptive BB methods, and two recent monotone gradient methods.
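The Barzilai-Borwein method used as a baseline in the comparison admits a compact sketch for strictly convex quadratics. The code below is an illustrative implementation of the classical BB1 stepsize (alpha = s^T s / s^T y), not of the paper's new adaptive rules; the function name and the exact-line-search choice for the first step are assumptions for the sketch.

```python
import numpy as np

def bb_gradient(A, b, x0, max_iter=500, tol=1e-8):
    """Barzilai-Borwein (BB1) gradient method for min 0.5 x^T A x - b^T x,
    with A symmetric positive definite. Illustrative sketch only."""
    x = x0.astype(float)
    g = A @ x - b                       # gradient of the quadratic
    # First step: Cauchy (exact line search) stepsize, an assumed choice
    alpha = (g @ g) / (g @ (A @ g))
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # BB1 stepsize: s^T s / s^T y; s^T y > 0 since A is SPD
        alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x
```

For a strictly convex quadratic the iterates are nonmonotone in the objective but converge to the unique minimizer A^{-1} b, which is what makes cyclic and adaptive variants of this stepsize attractive.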
Keywords: unconstrained optimization, strictly convex quadratics, gradient methods, adaptive stepsize selections.
Category 1: Nonlinear Optimization (Unconstrained Optimization)
Citation: Journal of Industrial and Management Optimization (2007), to appear. (Formerly Technical Report n. 77, Department of Pure and Applied Mathematics, University of Modena and Reggio Emilia, Modena (Italy), January 2007)
Entry Submitted: 01/31/2007