Optimization Online


An efficient gradient method using the Yuan steplength

Roberta De Asmundis (roberta.deasmundis***at***uniroma1.it)
Daniela di Serafino (daniela.diserafino***at***unina2.it)
William W. Hager (hager***at***math.ufl.edu)
Gerardo Toraldo (toraldo***at***unina.it)
Hongchao Zhang (hozhang***at***math.lsu.edu)

Abstract: We propose a new gradient method for quadratic programming, named SDC, which alternates a number of steepest descent (SD) iterates with gradient iterates that use a constant steplength computed through the Yuan formula. The SDC method exploits the asymptotic spectral behaviour of the Yuan steplength to foster a selective elimination of the components of the gradient along the eigenvectors of the Hessian matrix, i.e., to push the search into subspaces of smaller and smaller dimensions. The new method is globally and R-linearly convergent. Furthermore, numerical experiments show that it tends to outperform the Dai-Yuan method, which is one of the fastest gradient methods available. In particular, SDC appears superior as the Hessian condition number and the accuracy requirement increase. Finally, if the number of consecutive SD iterates is not too small, the SDC method shows a monotonic behaviour.
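The alternating scheme described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' exact algorithm: the function name `sdc`, the cycle lengths `h` (consecutive SD steps) and `m` (steps reusing the constant steplength), and the bookkeeping are assumptions, and the steplength formula is the form commonly attributed to Yuan, built from the last two Cauchy steplengths and gradient norms.

```python
import numpy as np

def sdc(A, b, x0, h=4, m=2, tol=1e-9, max_iter=5000):
    """Sketch of an SDC-type gradient method for min 0.5*x'Ax - b'x, A SPD.

    Alternates h exact steepest-descent (Cauchy) steps with m gradient
    steps that reuse a constant steplength computed via Yuan's formula.
    Illustrative only; per-iteration details of the authors' SDC may differ.
    """
    assert h >= 2, "Yuan's formula needs the last two Cauchy steplengths"
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b
    k = 0
    while np.linalg.norm(g) > tol and k < max_iter:
        # --- h steepest-descent steps with the exact Cauchy steplength ---
        alphas, gnorms = [], []
        for _ in range(h):
            gnorms.append(np.linalg.norm(g))
            alpha = (g @ g) / (g @ (A @ g))  # exact line search on a quadratic
            alphas.append(alpha)
            x -= alpha * g
            g = A @ x - b
            k += 1
            if np.linalg.norm(g) <= tol or k >= max_iter:
                return x, k
        # --- constant steplength via Yuan's formula (common form) ---
        a1, a2 = alphas[-2], alphas[-1]     # last two Cauchy steplengths
        gk = np.linalg.norm(g)              # gradient after the last SD step
        gkm1 = gnorms[-1]                   # gradient at the start of that step
        root = np.sqrt((1/a1 - 1/a2)**2 + 4*gk**2 / (a1*gkm1)**2)
        alpha_y = 2.0 / (1/a1 + 1/a2 + root)
        # --- m gradient steps reusing the constant Yuan steplength ---
        for _ in range(m):
            x -= alpha_y * g
            g = A @ x - b
            k += 1
            if np.linalg.norm(g) <= tol or k >= max_iter:
                return x, k
    return x, k
```

Because the Yuan steplength is bounded above by the smaller of the two Cauchy steplengths and asymptotically approximates the reciprocal of the largest Hessian eigenvalue, the constant-steplength phase damps the gradient component along the corresponding eigenvector, which is the subspace-reduction effect the abstract describes.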

Keywords: gradient methods, Yuan steplength, quadratic programming.

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Quadratic Programming)

Category 3: Nonlinear Optimization (Unconstrained Optimization)


Download: [PDF]

Entry Submitted: 10/28/2013
Entry Accepted: 10/28/2013
Entry Last Modified: 05/29/2014
