Optimization Online


A new family of high order directions for unconstrained optimization inspired by Chebyshev and Shamanskii methods

Bilel Kchouk (Bilel.Kchouk***at***Usherbrooke.ca)
Jean-Pierre Dussault (Jean-Pierre.Dussault***at***Usherbrooke.ca)

Abstract: The Newton-Raphson method of 1669-1670 is still used to solve systems of equations and unconstrained optimization problems. Since then, other algorithms inspired by Newton's method have been proposed: in 1839 Chebyshev developed a cubically convergent high order algorithm, and in 1967 Shamanskii proposed an acceleration of Newton's method. By viewing Newton-type methods as displacement directions, we introduce in this paper new high order algorithms extending these famous methods. We provide convergence order results and a per-iteration complexity analysis to predict the efficiency of such iterative processes. Preliminary numerical examples confirm the applicability of our analysis.
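To illustrate the classical methods the abstract builds on, here is a minimal Python sketch of the scalar versions of Newton's, Chebyshev's, and Shamanskii's iterations. This is an assumption-laden illustration of the well-known textbook formulas, not the paper's own high order directions; function names and parameters are ours.

```python
import math

def newton(f, fp, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration x <- x - f(x)/f'(x); quadratic convergence."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fp(x)
    return x

def chebyshev(f, fp, fpp, x0, tol=1e-12, max_iter=50):
    """Chebyshev's method: a second-derivative correction to the Newton step
    yields cubic convergence: x <- x - u - (f''/f') * u^2 / 2, with u = f/f'."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        u = fx / fp(x)
        x -= u + 0.5 * (fpp(x) / fp(x)) * u * u
    return x

def shamanskii(f, fp, x0, m=2, tol=1e-12, max_iter=50):
    """Shamanskii's acceleration: evaluate f' once, then reuse the frozen
    derivative for m chord steps before refreshing it, saving derivative
    (in n dimensions, Jacobian factorization) work per cycle."""
    x = x0
    for _ in range(max_iter):
        d = fp(x)              # one derivative evaluation per cycle
        for _ in range(m):
            fx = f(x)
            if abs(fx) < tol:
                return x
            x -= fx / d        # chord step with the frozen derivative
    return x

# Example: solve x^2 - 2 = 0, whose positive root is sqrt(2)
f = lambda x: x * x - 2.0
fp = lambda x: 2.0 * x
fpp = lambda x: 2.0
r1 = newton(f, fp, 1.5)
r2 = chebyshev(f, fp, fpp, 1.5)
r3 = shamanskii(f, fp, 1.5, m=2)
```

In n dimensions the same trade-off appears between convergence order and the cost of derivative evaluations and linear solves per iteration, which is what the paper's per-iteration complexity analysis quantifies.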

Keywords: Unconstrained optimization; Newton method; Chebyshev method; automatic differentiation; Shamanskii method

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Unconstrained Optimization )

Citation: submitted to Optimization Methods and Software (2011).


Entry Submitted: 12/05/2011
Entry Accepted: 12/05/2011
Entry Last Modified: 10/31/2012
