Optimization Online


Splitting methods with variable metric for KL functions

Pierre Frankel(p.frankel30***at***orange.fr)
Guillaume Garrigos(guillaume.garrigos***at***gmail.com)
Juan Peypouquet(juan.peypouquet***at***usm.cl)

Abstract: We study the convergence of general abstract descent methods applied to a lower semicontinuous nonconvex function f that satisfies the Kurdyka-Lojasiewicz inequality in a Hilbert space. We prove that any precompact sequence converges to a critical point of f and obtain new convergence rates both for the values and the iterates. The analysis covers alternating versions of the forward-backward method with variable metric and relative errors. As an example, a nonsmooth and nonconvex version of the Levenberg-Marquardt algorithm is detailed.
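The forward-backward iteration mentioned in the abstract alternates a gradient (forward) step on the smooth part with a proximal (backward) step on the nonsmooth part: x_{k+1} = prox_{gamma*g}(x_k - gamma*grad_f(x_k)). A minimal sketch on a toy one-dimensional problem, min_x 0.5*(x - 3)^2 + lam*|x| (the function names and toy data are illustrative, not taken from the paper, and this uses a fixed metric rather than the variable-metric version studied there):

```python
def soft_threshold(v, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def forward_backward(x0, grad_f, lam, gamma, n_iter=100):
    """Proximal gradient iteration for min f(x) + lam*|x|."""
    x = x0
    for _ in range(n_iter):
        # forward (gradient) step on f, backward (proximal) step on lam*|.|
        x = soft_threshold(x - gamma * grad_f(x), gamma * lam)
    return x

# Toy problem: f(x) = 0.5*(x - 3)^2, so grad_f(x) = x - 3.
# With lam = 1 the minimizer of f + lam*|.| is x* = 2.
x_star = forward_backward(x0=0.0, grad_f=lambda x: x - 3.0, lam=1.0, gamma=0.5)
print(round(x_star, 4))  # close to 2.0
```

For this convex toy problem the iterates converge globally; the contribution of the paper is that, under the Kurdyka-Lojasiewicz inequality, convergence of the whole sequence (with rates) also holds for nonconvex f.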

Keywords: Nonconvex and nonsmooth optimization; Kurdyka-Lojasiewicz inequality; Descent methods; Convergence rates; Variable metric; Gauss-Seidel method; Newton-like method

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Category 2: Other Topics (Other)

Citation: Submitted to the Journal of Optimization Theory and Applications (JOTA).

Download: [PDF]

Entry Submitted: 06/06/2014
Entry Accepted: 06/06/2014
Entry Last Modified: 06/06/2014

