Optimization Online


Convergence rates for an inertial algorithm of gradient type associated to a smooth nonconvex minimization

Laszlo Szilard Csaba (laszlosziszi***at***yahoo.com)

Abstract: We investigate an inertial algorithm of gradient type in connection with the minimization of a nonconvex differentiable function. The algorithm is formulated in the spirit of Nesterov's accelerated convex gradient method. We show that the generated sequences converge to a critical point of the objective function if a regularization of the objective function satisfies the Kurdyka-Łojasiewicz property. Further, we provide convergence rates for the generated sequences and the function values, formulated in terms of the Łojasiewicz exponent.
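The abstract does not state the paper's exact update rule, but an inertial gradient-type iteration in the spirit of Nesterov's method generally takes the form y_k = x_k + beta_k (x_k - x_{k-1}), x_{k+1} = y_k - s grad f(y_k). The sketch below illustrates this generic scheme on a simple nonconvex function; the step size, inertial parameter, and test function are illustrative choices, not taken from the paper.

```python
import numpy as np

def inertial_gradient(grad, x0, step=1e-2, beta=0.5, iters=5000, tol=1e-8):
    """Generic inertial gradient-type iteration (illustrative sketch):
        y_k     = x_k + beta * (x_k - x_{k-1})   # extrapolation step
        x_{k+1} = y_k - step * grad(y_k)         # gradient step at y_k
    Stops when the gradient norm drops below tol (approximate critical point).
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
        if np.linalg.norm(grad(x)) < tol:
            break
    return x

# Nonconvex example: f(x) = x^4 - x^2, with critical points 0 and ±1/sqrt(2).
grad = lambda x: 4 * x**3 - 2 * x
x_star = inertial_gradient(grad, np.array([1.0]))
```

Started from x0 = 1.0, the iterates settle at the nearby critical point 1/sqrt(2), consistent with the kind of convergence-to-a-critical-point guarantee the abstract describes under the Kurdyka-Łojasiewicz property.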

Keywords: inertial algorithm, nonconvex optimization, Kurdyka-Łojasiewicz inequality, convergence rate

Category 1: Other Topics (Other )

Citation: S. László, 2018, Technical University of Cluj-Napoca, Department of Mathematics, Str. Memorandumului nr. 28, 400114 Cluj-Napoca, Romania, e-mail: laszlosziszi@yahoo.com

Download: [PDF]

Entry Submitted: 06/19/2018
Entry Accepted: 06/19/2018
Entry Last Modified: 11/22/2018
