Optimization Online

Proximal-like contraction methods for monotone variational inequalities in a unified framework

Bingsheng He (hebma***at***nju.edu.cn)
Li-Zhi Liao (liliao***at***hkbu.edu.hk)
Xiang Wang (xiangwang1982***at***yahoo.com)

Abstract: Approximate proximal point algorithms (abbreviated as APPAs) are classical approaches for convex optimization problems and monotone variational inequalities. To solve the subproblems of these algorithms, the projection method takes iterations of the form $u^{k+1} = P_{\Omega}[u^k-\alpha_k d^k]$. Interestingly, many of them can be paired such that there exists $\tilde{u}^k$ with $\tilde{u}^k = P_{\Omega}[u^k - \beta_k F(v^k)] = P_{\Omega}[\tilde{u}^k - (d_2^k - G d_1^k)]$, where $\inf\{\beta_k\}>0$ and $G$ is a symmetric positive definite matrix. In other words, this projection equation offers a pair of geminate directions, $d_1^k$ and $d_2^k$, at each step. In this paper, we first present a unified framework involving the above equation that covers various APPAs. The contraction and convergence properties are characterized in a unified way under this framework, revealing the essential ideas behind their varied appearances. To study and pair various APPAs for different types of variational inequalities, we construct the above equation in different expressions according to the framework. Based on the constructed frameworks, it is interesting to see that, by choosing one of the geminate directions, the studied proximal-like methods always use the unit step size, namely $\alpha_k\equiv 1$. With the same effective quadruplet and acceptance rule, we then present a more efficient class of methods (called extended or general contraction methods), which require only minor, even negligible, extra cost to compute a different step size in each iteration. A set of matrix approximation examples as well as six other groups of numerical experiments are constructed to compare the performance of the primary and extended (general) methods. In general, our numerical experiments show that the performance of the extended (general) methods is much more promising than that of the primary ones.
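The generic projection iteration $u^{k+1} = P_{\Omega}[u^k - \alpha_k d^k]$ described in the abstract can be sketched as follows. This is a minimal illustrative example only: the extragradient-style direction choice, the fixed $\beta_k$, the nonnegative-orthant $\Omega$, and the affine monotone operator $F(u) = Mu + q$ are all assumptions for illustration, not the paper's exact quadruplet or step-size rule.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): a projection-type
# method for the monotone VI with F(u) = M u + q over Omega = {u >= 0},
# using a predictor u~^k and the corrector u^{k+1} = P_Omega[u^k - alpha_k d^k]
# with the unit step alpha_k = 1 and a fixed beta_k = beta.

def project(u):
    """P_Omega for the nonnegative orthant."""
    return np.maximum(u, 0.0)

def solve_vi(M, q, u0, beta=0.1, iters=500):
    u = u0.copy()
    for _ in range(iters):
        u_tilde = project(u - beta * (M @ u + q))  # predictor step
        d = M @ u_tilde + q                        # one direction choice, d^k = F(u~^k)
        u = project(u - beta * d)                  # corrector with alpha_k = 1
    return u

# Example: M = I + skew-symmetric part, so F is (strongly) monotone.
M = np.array([[1.0, 2.0], [-2.0, 1.0]])
q = np.array([-1.0, -1.0])
sol = solve_vi(M, q, np.zeros(2))
# sol approximates the VI solution u* = (0, 1), which satisfies
# u* >= 0, F(u*) >= 0, and u*^T F(u*) = 0.
```

With this direction choice the scheme reduces to the classical extragradient iteration; the paper's point is that a non-unit step size $\alpha_k$, computed from the same quantities at negligible extra cost, can be substantially more efficient.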

Keywords: variational inequality, monotone, contraction methods

Category 1: Complementarity and Variational Inequalities

Category 2: Convex and Nonsmooth Optimization (Convex Optimization )

Category 3: Nonlinear Optimization (Constrained Nonlinear Optimization )

Citation:

Download: [PDF]

Entry Submitted: 11/13/2008
Entry Accepted: 12/01/2008
Entry Last Modified: 05/12/2010

