Convergence rates of proximal gradient methods via the convex conjugate
David Gutman (dgutman@andrew.cmu.edu)
Abstract: We give a novel proof of the $O(1/k)$ and $O(1/k^2)$ convergence rates of the proximal gradient and accelerated proximal gradient methods for composite convex minimization. The crux of the new proof is an upper bound constructed via the convex conjugate of the objective function.
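The composite setting the abstract refers to is $\min_x f(x) + g(x)$ with $f$ smooth and $g$ convex with an easy proximal map. As an illustration only (this is not the paper's conjugate-based proof, just the standard method it analyzes), here is a minimal sketch of the proximal gradient iteration on a hypothetical lasso instance, where $g = \lambda\|x\|_1$ and the prox is soft-thresholding:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t*||.||_1: elementwise shrinkage toward zero."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, steps=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with fixed step 1/L,
    where L is a Lipschitz constant of the smooth part's gradient."""
    L = np.linalg.norm(A, 2) ** 2      # ||A||_2^2 bounds the Hessian A^T A
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)       # gradient of the smooth part f
        x = soft_threshold(x - grad / L, lam / L)  # prox step on g
    return x

# Hypothetical sparse recovery instance for demonstration.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = proximal_gradient(A, b, lam=0.1)
```

Under the standard analysis, this iteration's objective gap decreases at the $O(1/k)$ rate mentioned in the abstract; the accelerated (momentum) variant improves this to $O(1/k^2)$.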
Keywords: convex conjugate, proximal gradient, acceleration
Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Citation: Technical Report, Carnegie Mellon University, January 2018.
Entry Submitted: 01/08/2018