Convergence rates of proximal gradient methods via the convex conjugate

We give a novel proof of the $O(1/k)$ and $O(1/k^2)$ convergence rates of the proximal gradient and accelerated proximal gradient methods for composite convex minimization. The crux of the new proof is an upper bound constructed via the convex conjugate of the objective function.
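As an illustration only (not taken from the report), the following minimal Python sketch shows the two methods the abstract refers to: the proximal gradient method (ISTA) with its $O(1/k)$ rate and the accelerated proximal gradient method (FISTA) with its $O(1/k^2)$ rate, applied to a composite problem $\min_x f(x) + g(x)$ with $f(x) = \tfrac{1}{2}\|Ax-b\|^2$ smooth and $g(x) = \lambda\|x\|_1$ nonsmooth. The problem data, step-size choice, and function names are assumptions made for the sketch.

```python
# Minimal sketch of proximal gradient (ISTA) and accelerated proximal gradient (FISTA)
# for the composite problem  min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# Illustrative only; not the construction used in the report's proof.
import numpy as np


def soft_threshold(z, tau):
    """Proximal operator of tau*||.||_1 (soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)


def proximal_gradient(A, b, lam, num_iters=500):
    """ISTA: x_{k+1} = prox_{t*g}(x_k - t*grad f(x_k));  O(1/k) rate on F(x_k) - F*."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    t = 1.0 / L                            # fixed step size
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - t * grad, t * lam)
    return x


def accelerated_proximal_gradient(A, b, lam, num_iters=500):
    """FISTA: ISTA plus Nesterov momentum;  O(1/k^2) rate on F(x_k) - F*."""
    L = np.linalg.norm(A, 2) ** 2
    t = 1.0 / L
    x = np.zeros(A.shape[1])
    y = x.copy()
    theta = 1.0
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)
        x_next = soft_threshold(y - t * grad, t * lam)
        theta_next = (1.0 + np.sqrt(1.0 + 4.0 * theta ** 2)) / 2.0
        y = x_next + ((theta - 1.0) / theta_next) * (x_next - x)
        x, theta = x_next, theta_next
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true
    lam = 0.1
    obj = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.linalg.norm(x, 1)
    print("ISTA objective: ", obj(proximal_gradient(A, b, lam)))
    print("FISTA objective:", obj(accelerated_proximal_gradient(A, b, lam)))
```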

Citation

Technical Report, Carnegie Mellon University, January 2018.
