Optimization Online

Several variants of the primal-dual hybrid gradient algorithm with applications

Jianchao Bai (bjc1987***at***163.com)
Jicheng Li (jcli***at***mail.xjtu.edu.cn)

Abstract: By revisiting the primal-dual hybrid gradient algorithm (PDHG) proposed by He, You and Yuan (SIAM J. Imaging Sci. 2014;7(4):2526-2537), this paper introduces four improved schemes for solving a class of generalized saddle-point problems. By making use of the variational inequality framework, weaker conditions are presented to ensure the global convergence of the proposed algorithms: none of the objective functions is assumed to be strongly convex, and the step sizes in the primal-dual updates are more flexible than in the original scheme. The global convergence and the worst-case sublinear convergence rate, in both the ergodic and nonergodic sense, are analyzed in detail. The numerical efficiency of our algorithms is verified on an image deblurring problem, in comparison with several existing algorithms.
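The following is a minimal sketch, not taken from this entry, of the classical setting the abstract refers to; the symbols \theta_1, \theta_2, A and the proximal parameters r, s are generic placeholders, and the paper's generalized saddle-point model and its four improved schemes may differ in the coupling term and step-size conditions. The basic saddle-point problem reads

    \min_{x \in \mathcal{X}} \max_{y \in \mathcal{Y}} \; \theta_1(x) - y^{\top} A x - \theta_2(y),

with convex \theta_1, \theta_2 and a coupling matrix A, and one classical PDHG iteration alternates a proximally regularized primal minimization and dual maximization:

    x^{k+1} = \arg\min_{x \in \mathcal{X}} \big\{ \theta_1(x) - (y^{k})^{\top} A x + \tfrac{r}{2} \|x - x^{k}\|^{2} \big\},
    y^{k+1} = \arg\max_{y \in \mathcal{Y}} \big\{ -\theta_2(y) - y^{\top} A x^{k+1} - \tfrac{s}{2} \|y - y^{k}\|^{2} \big\}.

Roughly speaking, He, You and Yuan showed that this uncorrected scheme is not convergent for general convex objectives and requires strong convexity of one of them, which is the restriction that the weaker conditions announced in the abstract aim to relax.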

Keywords: Generalized saddle-point problem; Primal-dual hybrid gradient algorithm; Variational inequality; Global convergence; Complexity

Category 1: Complementarity and Variational Inequalities

Category 2: Convex and Nonsmooth Optimization (Convex Optimization )

Citation: submitted

Entry Submitted: 04/19/2017
Entry Accepted: 04/19/2017
Entry Last Modified: 11/04/2017
