Optimization Online


A Wide Interval for Efficient Self-Scaling Quasi-Newton Algorithms

M. Al-Baali (albaali***at***squ.edu.om)
H. Khalfan (hfayez***at***emirates.net.ae)

Abstract: This paper uses certain conditions for the global and superlinear convergence of the two-parameter self-scaling Broyden family of quasi-Newton algorithms for unconstrained optimization to derive a wide interval for self-scaling updates. Numerical testing shows that such algorithms not only accelerate the convergence of the (unscaled) methods from the so-called convex class but also increase their chances of success. Self-scaling updates from the preconvex and postconvex classes are shown to be effective in practice, and new algorithms that work well in practice, with or without scaling, are also obtained from the new interval. Unlike the behaviour of unscaled methods, numerical testing shows that varying the updating parameter within the proposed interval has little effect on the performance of the self-scaling algorithms.
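As a rough illustration of the update family the abstract refers to, the sketch below applies one member of the self-scaling Broyden family (theta = 0, i.e. self-scaled BFGS) to a small convex quadratic. The scaling choice tau = s'y / (s'Bs) (the common Oren-Luenberger factor), the fixed unit step, and all names are illustrative assumptions here, not details taken from the paper.

```python
import numpy as np

def ss_broyden_update(B, s, y, theta=0.0):
    """One two-parameter self-scaling Broyden-family update of the
    Hessian approximation B, given step s and gradient change y.

    theta = 0 gives self-scaled BFGS; tau = s'y / (s'Bs) is the common
    Oren-Luenberger scaling factor (an illustrative choice, not the
    paper's interval).
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    tau = sy / sBs                       # self-scaling factor
    v = np.sqrt(sBs) * (y / sy - Bs / sBs)
    return tau * (B - np.outer(Bs, Bs) / sBs + theta * np.outer(v, v)) \
           + np.outer(y, y) / sy

# Minimal demo on a convex quadratic f(x) = 0.5 x'Ax, full (unit) steps.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
x, B = np.array([1.0, 1.0]), np.eye(2)
for _ in range(20):
    p = -np.linalg.solve(B, grad(x))    # quasi-Newton step
    x_new = x + p
    s, y = x_new - x, grad(x_new) - grad(x)
    if s @ y > 1e-12:                   # curvature condition s'y > 0
        B = ss_broyden_update(B, s, y, theta=0.0)
    x = x_new
print(np.linalg.norm(grad(x)))          # should be near zero
```

Since s'y = s'As > 0 for a positive definite A, the curvature condition holds at every iteration and the update keeps B symmetric positive definite; the gradient norm shrinks to (near) zero.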

Keywords: Unconstrained optimization, quasi-Newton methods, self-scaling, global and superlinear convergence.

Category 1: Nonlinear Optimization (Unconstrained Optimization )

Citation: DOMAS 03/1, Sultan Qaboos University, Oman, July 2003.

Download: [Postscript][PDF]

Entry Submitted: 08/02/2003
Entry Accepted: 08/02/2003
Entry Last Modified: 08/02/2003
