  


A Note on the Forward-Douglas-Rachford Splitting for Monotone Inclusion and Convex Optimization
Hugo Raguet (hugo.raguet@gmail.com)

Abstract: We shed light on the structure of the "three-operator" version of the forward-Douglas-Rachford splitting algorithm for finding a zero of a sum of maximally monotone operators $A + B + C$, where $B$ is cocoercive, involving only the computation of $B$ and of the resolvents of $A$ and of $C$, separately. We show that it is a straightforward extension of a fixed-point algorithm proposed by us as a generalization of the forward-backward splitting algorithm, initially designed for finding a zero of a sum of an arbitrary number of maximally monotone operators $\sum_{i=1}^n A_i + B$, where $B$ is cocoercive, involving only the computation of $B$ and of the resolvent of each $A_i$ separately. We argue that the former is the "true" forward-Douglas-Rachford splitting algorithm, in contrast to the initial use of this designation in the literature. Then, we highlight the extension to an arbitrary number of maximally monotone operators in the splitting, $\sum_{i=1}^{n} A_i + B + C$, in a formulation admitting preconditioning operators. We finally demonstrate experimentally its interest in the context of nonsmooth convex optimization.

Keywords: forward-backward splitting; Douglas-Rachford splitting; monotone operator splitting; proximal splitting; nonsmooth convex optimization.

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Category 2: Convex and Nonsmooth Optimization (Generalized Convexity/Monotonicity)
Category 3: Applications - Science and Engineering (Biomedical Applications)

Download: [PDF]
Entry Submitted: 04/23/2017
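To illustrate the kind of scheme the abstract describes, here is a minimal sketch of a three-operator splitting iteration for minimizing $f(x) + g(x) + h(x)$, where $g$ is smooth (its gradient plays the role of the cocoercive operator $B$) and $f$, $h$ are accessed only through their proximity operators (the resolvents of $A$ and $C$). The update form below is one common variant from the literature; the paper's precise scheme, preconditioning, and step-size conditions differ in detail, and all function names here are illustrative.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * |.|, applied componentwise."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def three_operator_splitting(prox_f, grad_g, prox_h, z0, gamma, lam=1.0, n_iter=100):
    """Illustrative three-operator fixed-point iteration (not the paper's
    exact scheme) for min_x f(x) + g(x) + h(x):

        y  = prox_{gamma f}(z)
        x  = prox_{gamma h}(2y - z - gamma * grad_g(y))
        z += lam * (x - y)

    gamma must be small enough relative to the cocoercivity of grad_g
    (here grad_g is 1-Lipschitz, so gamma < 2 suffices).
    """
    z = np.asarray(z0, dtype=float).copy()
    x = z
    for _ in range(n_iter):
        y = prox_f(z, gamma)
        x = prox_h(2.0 * y - z - gamma * grad_g(y), gamma)
        z = z + lam * (x - y)
    return x

# Toy problem: minimize |x - 2| + 0.5 * (x - 3)**2 subject to x >= 0.
# The minimizer is x* = 2: the quadratic pulls toward 3, but the l1 term
# (whose subdifferential at 2 contains +1) holds the solution at 2.
prox_f = lambda z, g: 2.0 + soft_threshold(z - 2.0, g)  # prox of g * |. - 2|
grad_g = lambda y: y - 3.0                              # gradient of 0.5 * (. - 3)^2
prox_h = lambda z, g: np.maximum(z, 0.0)                # projection onto x >= 0

x_star = three_operator_splitting(prox_f, grad_g, prox_h, np.zeros(1), gamma=1.0)
print(x_star)  # close to [2.]
```

On this toy instance each sub-step is closed-form, which is the point of such splittings: the three terms are never touched jointly, only through one gradient call and two independent prox calls per iteration.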