Optimization Online


Generalized Forward-Backward Splitting

Hugo Raguet (raguet***at***ceremade.dauphine.fr)
Jalal Fadili (Jalal.Fadili***at***ensicaen.fr)
Gabriel Peyré (peyre***at***ceremade.dauphine.fr)

Abstract: This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form $F + \sum_{i=1}^n G_i$, where $F$ has a Lipschitz-continuous gradient and the $G_i$'s are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than $n = 1$ non-smooth function, our method generalizes it to the case of arbitrary $n$. Our method makes explicit use of the regularity of $F$ in the forward step, and the proximity operators of the $G_i$'s are applied in parallel in the backward step. This allows the generalized forward-backward to efficiently address an important class of convex problems. We prove its convergence in infinite dimension, and its robustness to errors in the computation of the proximity operators and of the gradient of $F$. Examples on inverse problems in imaging demonstrate the advantage of the proposed method in comparison to other splitting algorithms.
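The forward/backward structure described in the abstract can be sketched as follows: an explicit gradient step on $F$, followed by parallel proximity steps on the $G_i$'s applied to auxiliary variables whose weighted average gives the iterate. This is a minimal illustrative sketch, not the paper's reference implementation; the problem instance (a small least-squares term $F$, an $\ell_1$ penalty $G_1$, and a nonnegativity constraint $G_2$), the weights, the step size, and the relaxation parameter are all assumptions chosen for demonstration.

```python
import numpy as np

# Illustrative instance: min_x 0.5*||Ax - b||^2 + mu*||x||_1 + i_{x >= 0}(x).
# All problem data below is synthetic (an assumption for the sketch).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
mu = 0.1

def grad_F(x):      # F(x) = 0.5*||Ax - b||^2 has a Lipschitz gradient
    return A.T @ (A @ x - b)

def prox_G1(y, t):  # G_1(x) = mu*||x||_1 -> soft-thresholding
    return np.sign(y) * np.maximum(np.abs(y) - t * mu, 0.0)

def prox_G2(y, t):  # G_2 = indicator of {x >= 0} -> projection
    return np.maximum(y, 0.0)

proxes = [prox_G1, prox_G2]
n = len(proxes)
omega = np.full(n, 1.0 / n)        # positive weights summing to 1
L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad F
gamma, lam = 1.0 / L, 1.0          # step size (< 2/L) and relaxation

z = [np.zeros(10) for _ in range(n)]  # one auxiliary variable per G_i
x = np.zeros(10)
for _ in range(500):
    g = grad_F(x)                     # forward (explicit gradient) step
    for i, prox in enumerate(proxes): # backward steps; independent, so
        # they could run in parallel across the G_i's
        z[i] += lam * (prox(2 * x - z[i] - gamma * g, gamma / omega[i]) - x)
    x = sum(w * zi for w, zi in zip(omega, z))

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + mu * np.abs(x).sum()
print(obj)
```

The $n$ backward steps touch disjoint auxiliary variables $z_i$, which is what makes the parallel application of the proximity operators possible; with $n = 1$ the iteration reduces to the classical (relaxed) forward-backward scheme.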

Keywords: Forward-backward algorithm, proximal splitting, convex optimization, image processing, total variation, wavelets.

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Convex and Nonsmooth Optimization (Nonsmooth Optimization )

Citation: Preprint Hal-00613637

Download: [PDF]

Entry Submitted: 08/22/2011
Entry Accepted: 08/22/2011
Entry Last Modified: 01/12/2012

