Optimization Online
On the divergence of line search methods

Walter F. Mascarenhas (walterfm***at***ime.usp.br)

Abstract: We discuss the convergence of line search methods for minimization. We explain how Newton's method and the BFGS method can fail even when the restrictions of the objective function to the search lines are strictly convex, the level sets of the objective function are compact, the line searches are exact, and the Wolfe conditions are satisfied. This explanation illustrates a new way to combine general mathematical concepts and symbolic computation to analyze the convergence of line search methods. It also illustrates the limitations of the asymptotic analysis of the iterates of nonlinear programming algorithms.
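For readers unfamiliar with the Wolfe conditions referenced in the abstract, the following is a minimal illustrative sketch (not the paper's construction) of checking the weak Wolfe conditions — sufficient decrease and curvature — for a candidate step length along a search direction. The function names and the quadratic test problem are assumptions chosen for illustration only.

```python
def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Return (armijo_ok, curvature_ok) for step length alpha along direction p.

    Weak Wolfe conditions for phi(alpha) = f(x + alpha * p):
      Armijo:     phi(alpha) <= phi(0) + c1 * alpha * phi'(0)
      Curvature:  phi'(alpha) >= c2 * phi'(0)
    with 0 < c1 < c2 < 1.
    """
    phi0 = f(x)
    dphi0 = sum(g * d for g, d in zip(grad(x), p))   # directional derivative at 0
    x_new = [xi + alpha * di for xi, di in zip(x, p)]
    armijo = f(x_new) <= phi0 + c1 * alpha * dphi0   # sufficient decrease
    curvature = sum(g * d for g, d in zip(grad(x_new), p)) >= c2 * dphi0
    return armijo, curvature

# Quadratic test problem f(x) = x1^2 + x2^2, steepest-descent direction from (1, 1):
f = lambda x: x[0] ** 2 + x[1] ** 2
grad = lambda x: [2 * x[0], 2 * x[1]]
x0, p = [1.0, 1.0], [-2.0, -2.0]
print(wolfe_conditions(f, grad, x0, p, alpha=0.25))  # → (True, True)
```

The paper's point is that satisfying these conditions at every iteration is not, by itself, enough to guarantee convergence of Newton's method or BFGS.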

Keywords: Line search methods

Category 1: Nonlinear Optimization (Unconstrained Optimization)

Citation: Accepted by Computational and Applied Mathematics


Entry Submitted: 06/06/2006
Entry Accepted: 06/06/2006
Entry Last Modified: 06/06/2006


