Optimization Online


On Quasi-Newton Forward--Backward Splitting: Proximal Calculus and Convergence

Stephen Becker(stephen.becker***at***colorado.edu)
Jalal Fadili(Jalal.Fadili***at***ensicaen.fr)
Peter Ochs(ochs***at***math.uni-sb.de)

Abstract: We introduce a framework for quasi-Newton forward--backward splitting algorithms (proximal quasi-Newton methods) with a metric induced by diagonal +/- rank-r symmetric positive definite matrices. This special type of metric allows for a highly efficient evaluation of the proximal mapping. The key to this efficiency is a general proximal calculus in the new metric: using duality, we derive formulas that relate the proximal mapping in a rank-r modified metric to that in the original metric. We also describe efficient implementations of the proximity calculation for a large class of functions; these implementations exploit the piecewise linear nature of the dual problem. We then apply these results to accelerate composite convex minimization, which leads to elegant quasi-Newton methods for which we prove convergence. The algorithm is tested on several numerical examples and compared against a comprehensive list of alternatives from the literature. Our quasi-Newton splitting algorithm with the prescribed metric compares favorably with the state of the art. The algorithm has extensive applications, including signal processing, sparse recovery, machine learning, and classification, to name a few.
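For readers unfamiliar with the baseline method, the classical forward-backward iteration that the paper generalizes is x_{k+1} = prox_{t g}(x_k - t grad f(x_k)). Below is a minimal NumPy sketch of this iteration for the lasso problem (f = 0.5||Ax-b||^2, g = lam*||x||_1), using the standard Euclidean proximal mapping rather than the paper's variable-metric quasi-Newton version; the function names and parameters are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Euclidean prox of t*||.||_1: componentwise shrinkage toward zero
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, lam, n_iter=500):
    # Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of grad f.
    L = np.linalg.norm(A, 2) ** 2
    t = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                    # forward (gradient) step on f
        x = soft_threshold(x - t * grad, t * lam)   # backward (proximal) step on g
    return x
```

The quasi-Newton variant studied in the paper replaces the scalar step size t by a diagonal +/- rank-r metric, which requires the proximal calculus developed there to keep the prox step cheap.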

Keywords: forward-backward splitting, quasi-Newton, proximal calculus, duality

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Citation: arXiv:1801.08691


Entry Submitted: 01/29/2018
Entry Accepted: 01/29/2018
Entry Last Modified: 01/29/2018
