Optimization Online

Convergence rates of proximal gradient methods via the convex conjugate

David Gutman (dgutman***at***andrew.cmu.edu)
Javier Pena (jfp***at***andrew.cmu.edu)

Abstract: We give a novel proof of the $O(1/k)$ and $O(1/k^2)$ convergence rates of the proximal gradient and accelerated proximal gradient methods, respectively, for composite convex minimization. The crux of the new proof is an upper bound constructed via the convex conjugate of the objective function.
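
For context, the setting in the abstract is composite convex minimization. The LaTeX sketch below records the standard objects involved: the composite problem, the proximal map, the plain and accelerated proximal gradient updates, and the convex conjugate. The step size $1/L$ and the momentum coefficient $(k-1)/(k+2)$ are common textbook choices assumed here for illustration; they are not details taken from the paper, whose contribution is the conjugate-based upper bound used to prove the rates.

% Composite problem: f convex with L-Lipschitz gradient, g convex, possibly nonsmooth
\[
\min_{x \in \mathbb{R}^n} \ F(x) := f(x) + g(x)
\]
% Proximal map of g with step size t > 0
\[
\operatorname{prox}_{tg}(y) = \operatorname*{argmin}_{x} \Big\{ g(x) + \tfrac{1}{2t}\|x - y\|^2 \Big\}
\]
% Proximal gradient step; yields F(x_k) - F^* = O(1/k)
\[
x_{k+1} = \operatorname{prox}_{\frac{1}{L} g}\!\Big(x_k - \tfrac{1}{L}\nabla f(x_k)\Big)
\]
% Accelerated (FISTA-type) step; yields F(x_k) - F^* = O(1/k^2)
\[
y_k = x_k + \tfrac{k-1}{k+2}\,(x_k - x_{k-1}), \qquad
x_{k+1} = \operatorname{prox}_{\frac{1}{L} g}\!\Big(y_k - \tfrac{1}{L}\nabla f(y_k)\Big)
\]
% Convex conjugate of a function h, the object from which the paper's upper bound is built
\[
h^*(u) = \sup_{x}\,\big\{ \langle u, x \rangle - h(x) \big\}
\]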

Keywords: convex conjugate, proximal gradient, acceleration

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Citation: Technical Report, Carnegie Mellon University, January 2018.

Download: [PDF]

Entry Submitted: 01/08/2018
Entry Accepted: 01/08/2018
Entry Last Modified: 01/08/2018
