Optimization Online


Proximal splitting algorithms: Relax them all!

Laurent Condat(laurent.condat***at***kaust.edu.sa)
Daichi Kitahara(d-kita***at***fc.ritsumei.ac.jp)
Andres Contreras(acontreras***at***dim.uchile.cl)
Akira Hirabayashi(akirahrb***at***media.ritsumei.ac.jp)

Abstract: Convex optimization problems whose solutions live in very high dimensional spaces have become ubiquitous. Proximal splitting algorithms are particularly well suited to solving them: they consist of simple operations, handling the terms in the objective function separately. We present several existing proximal splitting algorithms and derive new ones, within a unified framework, which consists of applying splitting methods for monotone inclusions, such as the forward-backward algorithm, in primal-dual product spaces with well-chosen metrics. This allows us to derive new convergence theorems with larger parameter ranges. In particular, when the smooth term in the objective function is quadratic, e.g. for least-squares problems, convergence is guaranteed with larger values of the relaxation parameter than previously known. Indeed, it is often the case in practice that the larger the relaxation parameter, the faster the convergence.
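To illustrate the kind of iteration the abstract refers to, here is a minimal sketch of the relaxed forward-backward algorithm applied to a lasso (least-squares plus l1) problem. This is a generic textbook instance, not the authors' new algorithm: the function names, the stepsize choice gamma = 1/L, and the relaxation value rho = 1.4 (inside the classical range for this stepsize) are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1 (the "backward" step for the l1 term)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def relaxed_forward_backward(A, b, lam, rho=1.4, n_iter=2000):
    """Relaxed forward-backward iteration for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    rho is the relaxation parameter; gamma is the gradient stepsize.
    (Illustrative sketch: gamma = 1/L and rho = 1.4 are one classical choice.)"""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    gamma = 1.0 / L                       # stepsize in (0, 2/L)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                           # forward (gradient) step
        y = soft_threshold(x - gamma * grad, gamma * lam)  # backward (proximal) step
        x = x + rho * (y - x)                              # relaxation step
    return x
```

The relaxation step over-shoots past the proximal point by the factor rho; the abstract's point is that larger admissible values of rho (e.g. when the smooth term is quadratic, as here) often speed up convergence in practice.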


Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization )

Category 2: Nonlinear Optimization

Citation: Preprint arXiv:1912.00137


Entry Submitted: 01/10/2020
Entry Accepted: 01/12/2020
Entry Last Modified: 01/10/2020

