A Nonmonotone Accelerated Proximal Gradient Method with Variable Stepsize Strategy for Nonsmooth and Nonconvex Minimization Problems

We propose a new nonmonotone accelerated proximal gradient method with a variable stepsize strategy for minimizing the sum of a nonsmooth function and a smooth one in the nonconvex setting. In this algorithm, the objective function value is allowed to increase occasionally, but it decreases overall. The variable stepsize strategy requires neither a line search nor knowledge of the Lipschitz constant, which makes the algorithm easier to implement. Every sequence of iterates generated by the algorithm converges to a critical point of the objective function. Further, under the assumption that the objective function satisfies the Kurdyka-Lojasiewicz inequality, we establish convergence rates for the objective function values and the iterates. Moreover, numerical results on both convex and nonconvex problems demonstrate the effectiveness and superiority of the proposed method and stepsize strategy.
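To make the setting concrete, the following is a minimal illustrative sketch (not the paper's algorithm) of an accelerated proximal gradient step for an l1-regularized least-squares instance, using a Barzilai-Borwein spectral stepsize as one example of a stepsize rule that avoids a line search and knowledge of the Lipschitz constant; all function names and parameters below are assumptions for illustration, and the paper's nonmonotone acceptance rule and specific stepsize strategy are not reproduced.

```python
# Illustrative sketch only: generic accelerated proximal gradient with a
# Barzilai-Borwein (spectral) stepsize on 0.5*||Ax - b||^2 + lam*||x||_1.
# This is NOT the method proposed in the paper; names and parameters are assumed.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (the nonsmooth term in this example)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def accelerated_prox_grad(A, b, lam, max_iter=500, tol=1e-8):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with extrapolation and a BB stepsize."""
    n = A.shape[1]
    x_prev = x = np.zeros(n)
    grad = lambda z: A.T @ (A @ z - b)           # gradient of the smooth part
    t = 1.0 / np.linalg.norm(A, 2) ** 2          # safe initial stepsize (1/L)
    g_prev, y_prev = grad(x), x.copy()
    for k in range(1, max_iter + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)     # Nesterov-type extrapolation
        g = grad(y)
        # Spectral (BB) stepsize: needs no line search and no Lipschitz constant
        s, r = y - y_prev, g - g_prev
        if s @ r > 0:
            t = (s @ s) / (s @ r)
        x_prev, x = x, soft_threshold(y - t * g, t * lam)   # proximal step
        y_prev, g_prev = y, g
        if np.linalg.norm(x - x_prev) <= tol * max(1.0, np.linalg.norm(x)):
            break
    return x
```

In this sketch the extrapolation point y provides the acceleration, while the spectral stepsize adapts to local curvature; a nonmonotone scheme of the kind described in the abstract would additionally accept some steps that temporarily increase the objective value.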
