Optimization Online

On the iterate convergence of descent methods for convex optimization

Clovis Gonzaga (ccgonzaga1***at***gmail.com)

Abstract: We study the iterate convergence of strong descent algorithms applied to convex functions. We assume that the function satisfies a very simple growth condition around its minimizers and then show that the trajectory described by the iterates generated by any such method has finite length, which proves that the sequence of iterates converges.
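
A minimal numerical sketch of the property discussed in the abstract: gradient descent with Armijo backtracking is used as one example of a strong descent method, and the cumulative length of the iterate trajectory is tracked; a finite total length implies that the iterates converge. The test function, step rule, and tolerances below are illustrative assumptions, not the paper's algorithm.

import numpy as np

def f(x):
    # Smooth convex test function (illustrative choice, not from the paper).
    return 0.5 * float(x @ x) + 0.1 * float(np.sum(x ** 4))

def grad_f(x):
    # Gradient of f above.
    return x + 0.4 * x ** 3

def armijo_step(x, g, sigma=1e-4, beta=0.5, t0=1.0):
    # Backtracking line search ensuring sufficient decrease along -g.
    t = t0
    fx = f(x)
    while f(x - t * g) > fx - sigma * t * float(g @ g):
        t *= beta
    return t

x = np.array([2.0, -1.5, 0.5])
path_length = 0.0
for _ in range(500):
    g = grad_f(x)
    if np.linalg.norm(g) < 1e-10:
        break
    x_new = x - armijo_step(x, g) * g
    path_length += np.linalg.norm(x_new - x)  # accumulate trajectory length
    x = x_new

print("approximate minimizer:", x)
print("total trajectory length:", path_length)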

Keywords: Convex optimization, descent methods

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Citation: Federal University of Santa Catarina, Brazil, May 2014 ccgonzaga1@gmail.com

Download: [PDF]

Entry Submitted: 09/11/2014
Entry Accepted: 09/12/2014
Entry Last Modified: 07/09/2015
