Optimization Online

A unified framework for Bregman proximal methods: subgradient, gradient, and accelerated gradient schemes

David H. Gutman (dgutman@andrew.cmu.edu)
Javier Pena (jfp@andrew.cmu.edu)

Abstract: We provide a unified framework for analyzing the convergence of Bregman proximal first-order algorithms for convex minimization. Our framework hinges on properties of the convex conjugate and gives novel proofs of the convergence rates of the Bregman proximal subgradient, Bregman proximal gradient, and a new accelerated Bregman proximal gradient algorithm under fairly general and mild assumptions. Our accelerated Bregman proximal gradient algorithm attains the best-known accelerated rate of convergence when suitable relative smoothness and triangle scaling assumptions hold. However, the algorithm requires no prior knowledge of the relative smoothness or triangle scaling constants.
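
For readers unfamiliar with this family of methods, the generic Bregman proximal gradient step for minimizing a convex function f with reference function h replaces the usual quadratic proximal term with the Bregman distance D_h(x, y) = h(x) - h(y) - <grad h(y), x - y>:

    x_{k+1} = argmin_x { <grad f(x_k), x> + L * D_h(x, x_k) },

where L is a relative smoothness constant, i.e., D_f(x, y) <= L * D_h(x, y) for all x, y. The Python sketch below illustrates this standard step only; it is not the paper's accelerated algorithm, and the negative-entropy reference function, the helper bregman_prox_grad_step, and the least-squares test problem are our assumptions for illustration, not the authors'. With the negative-entropy kernel on the probability simplex, D_h is the KL divergence and the step has a closed form (the exponentiated-gradient update).

    import numpy as np

    def bregman_prox_grad_step(x, grad, L):
        # One Bregman proximal gradient step on the probability simplex with
        # the negative-entropy reference function h(x) = sum_i x_i * log(x_i).
        # Its Bregman distance D_h is the KL divergence, so the update
        #   x+ = argmin_z { <grad, z> + L * D_h(z, x) : z in the simplex }
        # reduces to the closed-form exponentiated-gradient step below.
        z = x * np.exp(-grad / L)
        return z / z.sum()

    # Hypothetical test problem: minimize f(x) = 0.5*||A x - b||^2 over the simplex.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    L = np.linalg.norm(A, 2) ** 2   # squared spectral norm: a valid relative
                                    # smoothness constant here, via Pinsker's inequality
    x = np.full(5, 0.2)             # start at the simplex barycenter
    for _ in range(200):
        x = bregman_prox_grad_step(x, A.T @ (A @ x - b), L)

Choosing h to match the geometry of the feasible set (here, entropy for the simplex) is what makes Bregman methods attractive: the step stays in closed form even when a Euclidean projection would not.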

Keywords: Bregman distance, proximal methods, first-order algorithms, convex conjugate, acceleration

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Citation: Working paper, Tepper School of Business, Carnegie Mellon University, December 2018.

Entry Submitted: 12/25/2018
Entry Accepted: 12/26/2018
Entry Last Modified: 08/13/2019
