Optimization Online


Local Linear Convergence of Forward–Backward under Partial Smoothness

Jingwei Liang (jingwei.liang***at***ensicaen.fr)
Jalal Fadili (jalal.Fadili***at***ensicaen.fr)
Gabriel Peyré (peyre***at***ceremade.dauphine.fr)

Abstract: In this paper, we consider the Forward–Backward proximal splitting algorithm to minimize the sum of two proper closed convex functions, one of which has a Lipschitz-continuous gradient while the other is partly smooth relative to an active manifold $\mathcal{M}$. We propose a unified framework in which we show that the Forward–Backward scheme (i) correctly identifies the active manifold $\mathcal{M}$ in a finite number of iterations, and then (ii) enters a local linear convergence regime that we characterize precisely. This explains the typical behaviour that has been observed numerically for many problems encompassed in our framework, including the Lasso, the group Lasso, the fused Lasso and nuclear norm regularization, to name a few. These results have numerous potential applications, including in signal/image processing, sparse recovery and machine learning.
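To illustrate the behaviour described in the abstract, here is a minimal sketch (not taken from the paper) of the Forward–Backward iteration applied to the Lasso, one of the problems covered by the framework. The smooth term is $f(x)=\tfrac12\|Ax-b\|^2$ and the partly smooth term is $g(x)=\lambda\|x\|_1$, whose active manifold is the set of vectors with a fixed support; the problem sizes, regularization weight and iteration count below are arbitrary choices for demonstration.

```python
import numpy as np

# Forward-Backward (proximal gradient) on the Lasso:
#   min_x  0.5*||A x - b||^2 + lam*||x||_1
# f has a Lipschitz gradient (constant L = ||A||_2^2); g = lam*||.||_1
# is partly smooth relative to the manifold of fixed-support vectors.

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, n_iter=500):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of grad f
    gamma = 1.0 / L                        # step size in (0, 2/L)
    x = np.zeros(A.shape[1])
    supports = []
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # forward (explicit gradient) step
        x = soft_threshold(x - gamma * grad, gamma * lam)  # backward (prox) step
        supports.append(tuple(np.flatnonzero(x)))
    return x, supports

# Synthetic sparse recovery instance (illustrative dimensions).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x, supports = forward_backward(A, b, lam=0.5)

# Finite identification: the support (active manifold) freezes after
# finitely many iterations, matching behaviour (i) in the abstract.
first_stable = next(k for k in range(len(supports))
                    if all(s == supports[k] for s in supports[k:]))
print("support identified at iteration", first_stable)
print("final support:", supports[-1])
```

Once the support has stabilized, the iteration reduces to a smooth method on the identified manifold, which is where the local linear convergence regime of point (ii) kicks in.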

Keywords: Partial smoothness, Activity identification, Local linear convergence, Forward–Backward splitting

Category 1: Convex and Nonsmooth Optimization

Category 2: Convex and Nonsmooth Optimization (Convex Optimization )

Category 3: Convex and Nonsmooth Optimization (Nonsmooth Optimization )

Citation: GREYC CNRS UMR 6072, ENSICAEN 6, Bd du Maréchal Juin 14050 Caen Cedex, France

Download: [PDF]

Entry Submitted: 07/21/2014
Entry Accepted: 07/21/2014
Entry Last Modified: 10/19/2014
