A unified framework for Bregman proximal methods: subgradient, gradient, and accelerated gradient schemes
David H. Gutman (dgutman@andrew.cmu.edu)
Abstract: We provide a unified framework for analyzing the convergence of Bregman proximal first-order algorithms for convex minimization. Our framework hinges on properties of the convex conjugate and gives novel proofs of the convergence rates of the Bregman proximal subgradient, Bregman proximal gradient, and a new accelerated Bregman proximal gradient algorithm under fairly general and mild assumptions. Our accelerated Bregman proximal gradient algorithm attains the best-known accelerated rate of convergence when suitable relative smoothness and triangle scaling assumptions hold. However, the algorithm requires no prior knowledge of any related smoothness or triangle scaling constants.
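To make the setting concrete, here is a minimal sketch (not taken from the paper) of one well-known instance of the Bregman proximal gradient step: entropic mirror descent, where the Bregman distance is generated by the negative entropy h(x) = Σ x_i log x_i, yielding a multiplicative update over the probability simplex. The function names and the toy objective below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def bregman_prox_grad_entropy(grad_f, x0, step=0.5, iters=300):
    """Hypothetical sketch: Bregman proximal gradient with the
    negative-entropy kernel (entropic mirror descent) on the simplex:
        x_{k+1, i}  ∝  x_{k, i} * exp(-step * grad_f(x_k)_i)
    A fixed step of 1/L is valid when f is L-smooth relative to h.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad_f(x)
        # Subtract max(g) before exponentiating for numerical stability;
        # the shift cancels after renormalization.
        x = x * np.exp(-step * (g - g.max()))
        x /= x.sum()  # project back onto the simplex (closed-form here)
    return x

if __name__ == "__main__":
    # Toy problem: minimize f(x) = 0.5 * ||x - c||^2 over the simplex,
    # with c chosen inside the simplex so the minimizer is c itself.
    c = np.array([0.7, 0.2, 0.1])
    grad_f = lambda x: x - c
    x_star = bregman_prox_grad_entropy(grad_f, np.ones(3) / 3)
    print(np.round(x_star, 3))  # should be close to c
```

Since f above is 1-smooth relative to the entropy kernel on the simplex, any step size up to 1 is admissible; the accelerated scheme in the paper is stated to need no prior knowledge of such constants.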
Keywords: Bregman distance, proximal methods, first-order algorithms, convex conjugate, acceleration
Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Citation: Working paper, Tepper School of Business, Carnegie Mellon University, December 2018.
Entry Submitted: 12/25/2018