Convergence of a hybrid projection-proximal point algorithm coupled with approximation methods in convex optimization

In order to minimize a closed convex function that is approximated by a sequence of better-behaved functions, we investigate the global convergence of a generic diagonal hybrid algorithm, which consists of an inexact relaxed proximal point step followed by a suitable orthogonal projection onto a hyperplane. The latter makes it possible to use a fixed relative error criterion for the proximal step. We provide various sets of conditions ensuring the global convergence of this algorithm. The analysis is valid for nonsmooth data in infinite-dimensional Hilbert spaces. Some examples are presented, in particular some penalty/barrier methods in convex programming. We also show that some results can be adapted to the zero-finding problem for a maximal monotone operator.
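For orientation, the following is a minimal sketch of one iteration of a hybrid projection-proximal step of the kind described, in the spirit of Solodov–Svaiter hybrid methods. The symbols f_k (the approximating functions), \lambda_k (proximal parameters), \varepsilon_k (subgradient tolerance), \sigma (relative error bound), and the precise form of the error criterion are illustrative assumptions, not the paper's exact formulation.

```latex
% Schematic hybrid step (assumed notation): f_k approximates f,
% \lambda_k > 0, \varepsilon_k \ge 0, 0 \le \sigma < 1 fixed.
\begin{align*}
  &\text{(inexact proximal step)}\quad
    \text{find } y^k \text{ and } v^k \in \partial_{\varepsilon_k} f_k(y^k)
    \text{ such that } \|\lambda_k v^k + y^k - x^k\| \le \sigma \,\|y^k - x^k\|;\\
  &\text{(projection step)}\quad
    x^{k+1} = x^k - \frac{\langle v^k,\, x^k - y^k\rangle}{\|v^k\|^2}\, v^k,
    \quad\text{the projection of } x^k \text{ onto }
    H_k = \{\, x : \langle v^k,\, x - y^k\rangle = 0 \,\}.
\end{align*}
```

The projection onto the hyperplane H_k is what allows the relative error bound \sigma to remain fixed along the iterations, rather than requiring a summable sequence of errors.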

Citation

Mathematics of Operations Research, Vol. 30, No. 4, November 2005, pp. 966-984.