A proximal-Newton method for unconstrained convex optimization in Hilbert spaces

We propose a proximal-Newton method for finding approximate solutions of the problem of minimizing a twice continuously differentiable convex function on a (possibly infinite-dimensional) Hilbert space, and study its iteration-complexity. We prove global convergence rates for obtaining approximate solutions in terms of function/gradient values. Our main results follow from an iteration-complexity study of a (large-step) inexact proximal point method for solving convex minimization problems.
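To make the idea concrete, here is a minimal finite-dimensional sketch of a proximal-point-style iteration in which each regularized subproblem, min_x f(x) + (1/(2λ))‖x − x_k‖², is approximately solved by a single Newton step. This is only an illustration of the general proximal-Newton template, not the method or step-size rule analyzed in the paper; the function names, the fixed regularization parameter `lam`, and the gradient-norm stopping test are all assumptions made for the example.

```python
import numpy as np

def prox_newton(f_grad, f_hess, x0, lam=10.0, tol=1e-8, max_iter=100):
    """Illustrative proximal-point iteration where each subproblem
        min_x f(x) + (1/(2*lam)) * ||x - x_k||^2
    is approximately solved by one Newton step on the regularized model.
    (A generic sketch; not the paper's exact scheme.)"""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = f_grad(x)
        if np.linalg.norm(g) < tol:  # assumed stopping test for the sketch
            break
        H = f_hess(x)
        # Newton step for the proximal subproblem: (H + I/lam) d = -g
        d = np.linalg.solve(H + np.eye(len(x)) / lam, -g)
        x = x + d
    return x

# Example on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer is x* = A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
x_star = prox_newton(lambda x: A @ x - b, lambda x: A, x0=[0.0, 0.0])
```

Note that the regularization term I/λ keeps the linear system well-conditioned even when the Hessian is singular, which is one practical motivation for proximal-Newton schemes.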
