A Note on the Forward-Douglas--Rachford Splitting for Monotone Inclusion and Convex Optimization

Hugo Raguet (hugo.raguet@gmail.com)

Abstract: We shed light on the structure of the "three-operator" version of the forward-Douglas--Rachford splitting algorithm for finding a zero of a sum of maximally monotone operators $A + B + C$, where $B$ is cocoercive, involving only the computation of $B$ and of the resolvents of $A$ and of $C$, separately. We show that it is a straightforward extension of a fixed-point algorithm proposed by us as a generalization of the forward-backward splitting algorithm, initially designed for finding a zero of a sum of an arbitrary number of maximally monotone operators $\sum_{i=1}^n A_i + B$, where $B$ is cocoercive, involving only the computation of $B$ and of the resolvent of each $A_i$ separately. We argue that the former is the "true" forward-Douglas--Rachford splitting algorithm, in contrast to the initial use of this designation in the literature. We then highlight the extension to an arbitrary number of maximally monotone operators in the splitting, $\sum_{i=1}^{n} A_i + B + C$, in a formulation admitting preconditioning operators. We finally demonstrate experimentally its interest in the context of nonsmooth convex optimization.

Keywords: forward-backward splitting; Douglas--Rachford splitting; monotone operator splitting; proximal splitting; nonsmooth convex optimization.
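To make the structure of such a three-operator splitting concrete, the following is a minimal sketch of one iteration of the kind the abstract describes: a single forward evaluation of the cocoercive operator $B$, and one resolvent each of $A$ and $C$, computed separately. The problem instance, function names, and step-size choices below are illustrative assumptions, not taken from the note itself; they follow a standard three-operator iteration for minimizing $h(x) + f(x) + g(x)$ with $h$ smooth ($B = \nabla h$), $f$ an $\ell_1$ penalty ($A = \partial f$), and $g$ the indicator of the nonnegative orthant ($C = \partial g$).

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator (resolvent of the subdifferential) of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def three_operator_splitting(b, lam, gamma=1.0, rho=1.0, iters=500):
    """Illustrative three-operator splitting for
           min_x  0.5 * ||x - b||^2  +  lam * ||x||_1  +  indicator(x >= 0),
    i.e. a zero of A + B + C with B = grad of the smooth quadratic term
    (1-cocoercive), A = subdifferential of the l1 term, C = normal cone
    of the nonnegative orthant. Each iteration uses one forward
    evaluation of B and one resolvent of A and of C, separately."""
    z = np.zeros_like(b)
    x_c = z
    for _ in range(iters):
        x_a = soft_threshold(z, gamma * lam)            # resolvent of A
        grad = x_a - b                                  # forward step on B
        x_c = np.maximum(2.0 * x_a - z - gamma * grad,  # resolvent of C
                         0.0)                           # (projection on x >= 0)
        z = z + rho * (x_c - x_a)                       # (relaxed) update
    return x_c
```

For this toy problem the minimizer is available in closed form, coordinate-wise $x^* = \max(b - \lambda, 0)$, which makes the sketch easy to check; e.g. `three_operator_splitting(np.array([2.0, -1.0, 0.5]), 1.0)` converges to `[1.0, 0.0, 0.0]`.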
Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Category 2: Convex and Nonsmooth Optimization (Generalized Convexity/Monotonicity)
Category 3: Applications -- Science and Engineering (Biomedical Applications)

Download: [PDF]
Entry Submitted: 04/23/2017
Entry Accepted: 04/23/2017
Entry Last Modified: 04/23/2017