Optimization Online


A first-order primal-dual algorithm with linesearch

Yura Malitsky(y.malitsky***at***gmail.com)
Thomas Pock(pock***at***icg.tugraz.at)

Abstract: The paper proposes a linesearch for the primal-dual method. Each iteration of the linesearch requires updating only the dual (or primal) variable. For many problems, in particular regularized least squares, the linesearch requires no additional matrix-vector multiplications. We prove convergence of the proposed method under standard assumptions and also show an ergodic $O(1/N)$ rate of convergence. When one of the prox-functions is strongly convex, we modify the basic method to obtain a better convergence rate. Finally, we propose a linesearch for a saddle-point problem with an additional smooth term. Numerical experiments confirm the efficiency of the proposed methods.
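To illustrate the idea, here is a minimal sketch of a primal-dual iteration with a backtracking linesearch on the dual step, applied to $\ell_1$-regularized least squares. The step rule, acceptance test, and parameter names (`beta`, `mu`, `delta`) below are illustrative assumptions in the spirit of the abstract, not the paper's exact algorithm; in particular, the paper exploits linearity of $K^T y$ to avoid extra matrix-vector products during backtracking, while this sketch recomputes them for clarity.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pd_linesearch(A, b, lam, iters=1000, tau=1.0, beta=1.0,
                  mu=0.7, delta=0.99):
    """Illustrative primal-dual method with a dual-side linesearch
    for min_x 0.5*||Ax - b||^2 + lam*||x||_1, written in saddle form
    with f*(y) = 0.5*||y||^2 + <b, y>, so prox_{s f*}(v) = (v - s*b)/(1+s)."""
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    Aty = A.T @ y        # K^T y, reused across the linesearch
    theta = 1.0
    for _ in range(iters):
        # primal update (prox of the l1 term)
        x_new = soft_threshold(x - tau * Aty, tau * lam)
        # linesearch: try a larger step, shrink by mu until the test holds;
        # each trial updates only the dual variable y
        tau_new = tau * np.sqrt(1.0 + theta)
        while True:
            theta = tau_new / tau
            sigma = beta * tau_new
            x_bar = x_new + theta * (x_new - x)          # extrapolation
            y_new = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
            Aty_new = A.T @ y_new
            # acceptance test bounding the operator step (assumed form)
            if (np.sqrt(beta) * tau_new * np.linalg.norm(Aty_new - Aty)
                    <= delta * np.linalg.norm(y_new - y)):
                break
            tau_new *= mu
        x, y, Aty, tau = x_new, y_new, Aty_new, tau_new
    return x
```

Because $\|A^T \Delta y\| \le \|A\|\,\|\Delta y\|$, the test is guaranteed to pass once $\sqrt{\beta}\,\tau \le \delta/\|A\|$, so the backtracking loop always terminates.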

Keywords: Saddle-point problems, first order algorithms, primal-dual algorithms, linesearch, convergence rates, backtracking

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Convex and Nonsmooth Optimization (Nonsmooth Optimization )

Citation: Institute for Computer Graphics and Vision, Graz University of Technology, 2016


Entry Submitted: 08/31/2016
Entry Accepted: 08/31/2016
Entry Last Modified: 08/31/2016
