Optimization Online


A Parallel Inertial Proximal Optimization Method

Jean-Christophe Pesquet (pesquet***at***univ-mlv.fr)
Nelly Pustelnik (n.pustelnik***at***gmail.com)

Abstract: The Douglas-Rachford algorithm is a popular iterative method for finding a zero of a sum of two maximal monotone operators defined on a Hilbert space. In this paper, we propose an extension of this algorithm that includes inertia parameters, and we develop parallel versions to handle a sum of an arbitrary number of maximal monotone operators. Based on this algorithm, parallel proximal algorithms are proposed to minimize, over a linear subspace of a Hilbert space, the sum of a finite number of proper, lower semicontinuous convex functions composed with linear operators. It is shown that particular cases of these methods are the simultaneous direction method of multipliers proposed by Setzer et al., the parallel proximal algorithm developed by Combettes and Pesquet, and a parallelized version of an algorithm proposed by Attouch and Soueycatt.
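For reference, the classical (non-inertial) Douglas-Rachford iteration mentioned in the abstract alternates the proximal operators of the two terms. The following is a minimal NumPy sketch, not the paper's inertial or parallel variant; the particular functions f (a quadratic) and g (an indicator of the nonnegative orthant) and the unit step size are illustrative choices, not taken from the paper:

```python
import numpy as np

def prox_f(y, b, lam=1.0):
    # Proximal operator of f(x) = 0.5 * ||x - b||^2 with parameter lam:
    # argmin_x 0.5 * ||x - b||^2 + (1 / (2 * lam)) * ||x - y||^2
    return (y + lam * b) / (1.0 + lam)

def prox_g(y):
    # Proximal operator of the indicator function of {x : x >= 0},
    # i.e. the Euclidean projection onto the nonnegative orthant.
    return np.maximum(y, 0.0)

def douglas_rachford(b, n_iter=200):
    # Classical Douglas-Rachford splitting for minimizing f + g.
    y = np.zeros_like(b)
    for _ in range(n_iter):
        x = prox_g(y)                       # resolvent of g
        y = y + prox_f(2.0 * x - y, b) - x  # reflected resolvent of f
    return prox_g(y)

# Minimize 0.5 * ||x - b||^2 subject to x >= 0;
# the solution is the componentwise positive part of b.
b = np.array([1.0, -2.0, 3.0])
x_star = douglas_rachford(b)
print(x_star)  # approximately [1., 0., 3.]
```

At a fixed point of the iteration the governing sequence satisfies y = b here, so the returned x = prox_g(b) is exactly the projection of b onto the constraint set.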

Keywords: maximal monotone operators, convex optimization, Douglas-Rachford algorithm, parallel methods, proximal methods, augmented Lagrangian

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Optimization Software and Modeling Systems (Parallel Algorithms )

Category 3: Complementarity and Variational Inequalities

Citation: LIGM preprint, Université de Paris-Est, Oct. 2010.

Download: [PDF]

Entry Submitted: 11/17/2010
Entry Accepted: 11/17/2010
Entry Last Modified: 07/01/2011

