Optimization Online


Acceleration of the PDHGM on strongly convex subspaces

Tuomo Valkonen(tuomov***at***iki.fi)
Thomas Pock(pock***at***icg.tugraz.at)

Abstract: We propose several variants of the primal-dual method due to Chambolle and Pock. Without requiring full strong convexity of the objective functions, our methods are accelerated on subspaces with strong convexity. This yields mixed rates: $O(1/N^2)$ with respect to initialisation, and $O(1/N)$ with respect to the dual sequence and the residual part of the primal sequence. We demonstrate the efficacy of the proposed methods on image processing problems lacking strong convexity, such as total generalised variation denoising and total variation deblurring.
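For orientation, the basic PDHGM (Chambolle–Pock) iteration that the paper's variants build on alternates a proximal dual ascent step and a proximal primal descent step, with an over-relaxation of the primal iterate. Below is a minimal, hedged sketch on a toy 1D total variation denoising problem, min_x 0.5||x − b||² + λ||Dx||₁, using the standard acceleration rule that applies when the whole of G is strongly convex; the paper's contribution (acceleration on strongly convex subspaces only) is more elaborate and is not reproduced here. The function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def tv_denoise_1d(b, lam=1.0, n_iter=500):
    """Sketch of the accelerated PDHGM for 1D TV denoising:
        min_x 0.5*||x - b||^2 + lam*||D x||_1,
    where D is the forward-difference operator (||D||^2 <= 4).
    Here G(x) = 0.5*||x - b||^2 is fully strongly convex (gamma = 1),
    so the classical Chambolle-Pock acceleration rule applies;
    the paper treats the harder case of strong convexity only on a subspace.
    """
    b = np.asarray(b, dtype=float)
    L2 = 4.0                       # bound on ||D||^2
    tau = sigma = 1.0 / np.sqrt(L2)  # step sizes with tau*sigma*L2 <= 1
    gamma = 1.0                    # strong convexity constant of G

    x = b.copy()
    xbar = x.copy()
    y = np.zeros(len(b) - 1)       # dual variable, one per difference
    for _ in range(n_iter):
        # Dual step: ascent in y, then prox of F* = projection onto [-lam, lam].
        y = np.clip(y + sigma * np.diff(xbar), -lam, lam)
        # Primal step: descent along -D^T y, then prox of G (closed form).
        x_old = x
        Dty = np.concatenate(([-y[0]], -np.diff(y), [y[-1]]))  # D^T y
        x = (x - tau * Dty + tau * b) / (1 + tau)
        # Acceleration: update theta and rescale steps (tau*sigma stays fixed).
        theta = 1.0 / np.sqrt(1 + 2 * gamma * tau)
        tau, sigma = theta * tau, sigma / theta
        xbar = x + theta * (x - x_old)
    return x
```

With a large λ the TV term forces the output toward a constant signal, namely the mean of the data, which gives a quick sanity check of the iteration.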

Keywords: primal-dual, accelerated, subspace, total generalised variation

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Convex and Nonsmooth Optimization (Generalized Convexity/Monotonicity )

Category 3: Applications -- Science and Engineering (Other )


Download: [PDF]

Entry Submitted: 11/23/2015
Entry Accepted: 11/23/2015
Entry Last Modified: 11/23/2015
