Optimization Online


Golden Ratio Algorithms for Variational Inequalities

Yura Malitsky(y.malitsky***at***gmail.com)

Abstract: The paper presents a fully explicit algorithm for monotone variational inequalities. The method uses variable stepsizes computed from two previous iterates as an approximation of the local Lipschitz constant, without running a linesearch. Thus, each iteration of the method requires only one evaluation of the monotone operator $F$ and one proximal step with respect to $g$. The operator $F$ need not be Lipschitz continuous, which also makes the algorithm interesting for composite minimization problems where one cannot use the descent lemma. The method exhibits an ergodic $O(1/k)$ convergence rate, and an $R$-linear rate when $F$ and $g$ satisfy an error bound condition. We discuss possible applications of the method to fixed point problems. Furthermore, we show theoretically that the method still converges under a new, relaxed monotonicity condition, and we confirm numerically that it can work robustly even on some highly nonmonotone/nonconvex problems.
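To make the abstract's description concrete, the following is a minimal sketch of a golden ratio iteration with a fixed stepsize, applied to a toy monotone variational inequality. The update form shown (averaging the current iterate with the running average point by the golden ratio, then taking a proximal-gradient-type step) and the stepsize bound are assumptions based on the abstract's description; the paper's main contribution, the adaptive stepsize computed from two previous iterates, is not reproduced here. The function names and the toy problem are purely illustrative.

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2  # the golden ratio

def graal(F, prox, x0, lam, iters=2000):
    """Fixed-stepsize golden ratio iteration (illustrative sketch).

    Each iteration uses one evaluation of F and one prox call:
        x_bar <- ((phi - 1) * x + x_bar) / phi   # golden-ratio averaging
        x     <- prox(x_bar - lam * F(x))        # forward-backward step
    """
    x = x0.copy()
    x_bar = x0.copy()
    for _ in range(iters):
        x_bar = ((phi - 1) * x + x_bar) / phi
        x = prox(x_bar - lam * F(x))
    return x

# Toy monotone VI: F(x) = A x with A skew-symmetric, so F is monotone
# but is not the gradient of any function; g = 0, so the prox is the
# identity. Plain gradient-type iteration diverges on this problem.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
F = lambda x: A @ x
prox = lambda y: y
L = 1.0  # Lipschitz constant of F (operator norm of A)

x = graal(F, prox, np.array([1.0, 1.0]), lam=phi / (2 * L))
print(np.linalg.norm(x))  # distance to the unique solution x* = 0
```

On this skew-symmetric example the iterates contract toward the solution, which illustrates why such methods are of interest for saddle point problems where ordinary gradient steps fail.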

Keywords: variational inequality, first-order methods, linesearch, saddle point problem, composite minimization, fixed point problem

Category 1: Complementarity and Variational Inequalities

Category 2: Convex and Nonsmooth Optimization (Convex Optimization)

Category 3: Convex and Nonsmooth Optimization (Nonsmooth Optimization)


Download: [PDF]

Entry Submitted: 04/27/2018
Entry Accepted: 05/01/2018
Entry Last Modified: 04/27/2018
