Optimization Online


A gradient type algorithm with backward inertial steps for a nonconvex minimization

Szilard Csaba Laszlo(laszlosziszi***at***yahoo.com)
Adrian Viorel(oviorel***at***mail.utcluj.ro)
Cristian Alecsa(cristian.alecsa***at***math.ubbcluj.ro)

Abstract: We investigate a gradient-type algorithm with a backward inertial step for the minimization of a nonconvex differentiable function. We show that the generated sequences converge to a critical point of the objective function, provided a regularization of the objective function satisfies the Kurdyka-Lojasiewicz property. Further, we provide convergence rates for the generated sequences and the objective function values, formulated in terms of the Lojasiewicz exponent. Finally, some numerical experiments are presented in order to compare our numerical scheme with some well-known algorithms from the literature.
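To illustrate the flavor of such a scheme: a minimal sketch of a generic inertial gradient iteration (heavy-ball style) on a simple nonconvex objective. This is not the paper's exact update rule, which is not reproduced in the abstract; the step size, inertial coefficient, and test function below are illustrative assumptions.

```python
import numpy as np

def inertial_gradient(grad, x0, step=0.01, beta=0.5, iters=2000):
    """Generic inertial gradient iteration (heavy-ball style):
        x_{n+1} = x_n - step * grad(x_n) + beta * (x_n - x_{n-1}).
    The exact scheme in the paper may differ; this is a sketch.
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(iters):
        x_next = x - step * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Nonconvex objective f(x) = x^4 - 2x^2, with critical points at -1, 0, 1.
grad = lambda x: 4 * x**3 - 4 * x
x_star = inertial_gradient(grad, x0=np.array([0.8]))
```

Starting from x0 = 0.8, the iterates are attracted to the nearby local minimizer x = 1, consistent with convergence to a critical point; the Kurdyka-Lojasiewicz property is what guarantees such convergence (rather than mere subsequential convergence) in the nonconvex setting.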

Keywords: inertial algorithm, nonconvex optimization, Kurdyka- Lojasiewicz inequality, convergence rate

Category 1: Convex and Nonsmooth Optimization (Other )

Category 2: Combinatorial Optimization (Approximation Algorithms )


Download: [PDF]

Entry Submitted: 11/22/2018
Entry Accepted: 11/22/2018
Entry Last Modified: 11/22/2018

