  


A Primal-Dual Three-Operator Splitting Scheme
Ming Yan (yanm@math.msu.edu)

Abstract: In this paper, we propose a new primal-dual algorithm for minimizing f(x) + g(x) + h(Ax), where f, g, and h are convex functions, f is differentiable with a Lipschitz continuous gradient, and A is a bounded linear operator. It has several well-known primal-dual algorithms for minimizing the sum of two functions as special cases. For example, it reduces to the Chambolle-Pock algorithm when f = 0, and to a primal-dual fixed-point algorithm in [P. Chen, J. Huang, and X. Zhang, A primal-dual fixed-point algorithm for convex separable minimization with applications to image restoration, Inverse Problems, 29 (2013), p. 025011] when g = 0. In addition, it recovers the three-operator splitting scheme in [D. Davis and W. Yin, A three-operator splitting scheme and its optimization applications, arXiv:1504.01032, (2015)] when A is the identity operator. We prove the convergence of this new algorithm in the general case by showing that the iteration is a nonexpansive operator, and we derive a linear convergence rate under additional assumptions. Compared with other primal-dual algorithms for solving the same problem, this algorithm extends the range of parameters that ensure convergence and has a smaller per-iteration cost. Numerical experiments demonstrate the efficiency of the new algorithm in comparison with other primal-dual algorithms.

Keywords: fixed-point iteration; nonexpansive operator; primal-dual; three-operator splitting

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Category 2: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Download: [PDF]

Entry Submitted: 12/14/2016