Optimization Online

Worst-case convergence analysis of gradient and Newton methods through semidefinite programming performance estimation

Etienne De Klerk(e.deklerk***at***uvt.nl)
Francois Glineur(Francois.Glineur***at***uclouvain.be)
Adrien Taylor(adrienbtaylor***at***gmail.com)

Abstract: We provide new tools for worst-case performance analysis of the gradient (or steepest descent) method of Cauchy for smooth strongly convex functions, and of Newton's method for self-concordant functions. The analysis uses semidefinite programming performance estimation, as pioneered by Drori and Teboulle [Mathematical Programming, 145(1-2):451--482, 2014], and extends the authors' recent performance estimation results for the method of Cauchy [Optimization Letters, to appear]. To illustrate the applicability of the tools, we sketch how to give a rigorous worst-case complexity analysis of a recent interior point method by Abernethy and Hazan [arXiv 1507.02528, 2015]. The algorithm of Abernethy and Hazan has sparked much recent interest, since it demonstrates the formal equivalence between an interior point method and a simulated annealing algorithm for convex optimization, but several details of the worst-case analysis are omitted in their paper.
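
To make the performance estimation idea concrete, the following is a minimal sketch (not code from the paper) of a performance estimation problem for a single gradient step with step size 1/L on an L-smooth convex function, written as a semidefinite program in Python with CVXPY. It uses the smooth convex interpolation conditions of Taylor, Hendrickx and Glineur; the basis choice and all names are illustrative, and the strongly convex and self-concordant settings treated in the paper require different interpolation conditions.

import numpy as np
import cvxpy as cp

L = 1.0          # smoothness constant
gamma = 1.0 / L  # step size of the single gradient step x_1 = x_0 - gamma * g_0

# Coordinates of x_i - x* and g_i in the basis (x_0 - x*, g_0, g_1).
x = {"*": np.zeros(3),
     0: np.array([1.0, 0.0, 0.0]),
     1: np.array([1.0, -gamma, 0.0])}
g = {"*": np.zeros(3),
     0: np.array([0.0, 1.0, 0.0]),
     1: np.array([0.0, 0.0, 1.0])}

G = cp.Variable((3, 3), PSD=True)                    # Gram matrix of the basis vectors
F = {"*": 0.0, 0: cp.Variable(), 1: cp.Variable()}   # function values, with f(x*) = 0

cons = [x[0] @ G @ x[0] <= 1]  # normalization ||x_0 - x*||^2 <= 1
pts = ["*", 0, 1]
for i in pts:
    for j in pts:
        if i == j:
            continue
        # Smooth convex interpolation condition:
        # f_i >= f_j + <g_j, x_i - x_j> + ||g_i - g_j||^2 / (2L)
        cons.append(F[i] >= F[j] + g[j] @ G @ (x[i] - x[j])
                    + (g[i] - g[j]) @ G @ (g[i] - g[j]) / (2 * L))

# The worst-case value of f(x_1) - f(x*) over this function class.
prob = cp.Problem(cp.Maximize(F[1]), cons)
prob.solve()
print(prob.value)

For one step the solver should return approximately L/6, matching the tight bound L||x_0 - x*||^2/(4N+2) of Drori and Teboulle with N = 1; longer methods are handled the same way, with one Gram-matrix block of coordinates per iterate.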

Keywords: simulated annealing, interior point methods, performance estimation problems, semidefinite programming

Category 1: Linear, Cone and Semidefinite Programming

Category 2: Nonlinear Optimization

Citation:

Download: [PDF]

Entry Submitted: 09/18/2017
Entry Accepted: 09/18/2017
Entry Last Modified: 09/18/2017
