Convergence rate of inexact proximal point methods with relative error criteria for convex optimization

Renato D.C. Monteiro (monteiro@isye.gatech.edu)
Benar F. Svaiter (benar@impa.br)

Abstract: In this paper, we consider a class of inexact proximal point methods for convex optimization which allows a relative error tolerance in the approximate solution of each proximal subproblem. By exploiting the special structure of convex optimization problems, we are able to derive surprising complexity bounds for this class of methods. As a consequence, we show that the best size of the projected gradients (resp., gradients) generated by the projected gradient (resp., steepest descent) method up to iteration $k$ is ${\cal O}(1/k)$ in the context of smooth convex optimization problems.

Keywords: projected gradient, forward-backward splitting, inexact proximal point, convex optimization, steepest descent, complexity

Category 1: Nonlinear Optimization
Category 2: Convex and Nonsmooth Optimization

Download: [PDF]
Entry Submitted: 08/24/2010
Entry Accepted: 08/24/2010
Entry Last Modified: 05/26/2012
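The ${\cal O}(1/k)$ claim concerns the smallest gradient norm observed up to iteration $k$ of steepest descent on a smooth convex function. The sketch below is purely illustrative and is not the authors' method: it runs classical steepest descent with step size $1/L$ on the convex function $f(x) = x^4$ (whose second derivative is bounded by $L = 12$ on $[-1, 1]$) and tracks the running minimum of $|f'(x_i)|$, which should be consistent with the stated bound. All names and constants here are assumptions chosen for the example.

```python
# Illustrative sketch only (not the paper's algorithm): steepest descent on the
# smooth convex function f(x) = x**4, tracking the best (smallest) gradient
# magnitude seen up to each iteration k. The abstract states this quantity is
# O(1/k); the function, starting point, and L are illustrative choices.

def best_gradient_norms(x0=1.0, L=12.0, iters=1000):
    """Run steepest descent on f(x) = x**4; return best |f'(x_i)| up to each k."""
    grad = lambda x: 4.0 * x**3        # f'(x); |f''(x)| <= 12 on [-1, 1]
    x, best, history = x0, float("inf"), []
    for _ in range(iters):
        g = grad(x)
        best = min(best, abs(g))       # running minimum of the gradient norm
        history.append(best)
        x -= g / L                     # classical step size 1/L
    return history

history = best_gradient_norms()
```

By construction the recorded sequence is nonincreasing, and on this example it decays at least as fast as $1/k$, consistent with (though of course not a proof of) the bound stated in the abstract.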