Splitting methods with variable metric for KL functions
Abstract: We study the convergence of general abstract descent methods applied to a lower semicontinuous nonconvex function f that satisfies the Kurdyka-Łojasiewicz inequality in a Hilbert space. We prove that any precompact sequence of iterates converges to a critical point of f and obtain new convergence rates both for the values and the iterates. The analysis covers alternating versions of the forward-backward method with variable metric and relative errors. As an example, a nonsmooth and nonconvex version of the Levenberg-Marquardt algorithm is detailed.
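To illustrate the basic forward-backward iteration underlying the methods studied here, the sketch below applies the standard proximal-gradient step x_{k+1} = prox_{t g}(x_k - t ∇f(x_k)) to a simple smooth-plus-nonsmooth model. This is a minimal fixed-step, fixed-metric illustration only; the paper's algorithms additionally allow a variable metric, alternating (Gauss-Seidel) updates, and relative errors, none of which are shown here. The example problem and all names (forward_backward, grad_f, prox_g) are illustrative choices, not from the paper.

```python
import numpy as np

def forward_backward(grad_f, prox_g, x0, step, n_iter=500):
    """Plain forward-backward (proximal gradient) iteration:
    x_{k+1} = prox_{step*g}(x_k - step * grad_f(x_k)).
    A sketch with a fixed scalar step; the paper's variants use a
    variable metric and tolerate relative errors in each step."""
    x = x0
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Toy model: minimize 0.5*||A x - b||^2 + lam*||x||_1.
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 2.0])
lam = 0.1

grad_f = lambda x: A.T @ (A @ x - b)
# Proximal map of t*lam*||.||_1 is componentwise soft-thresholding.
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

# step < 2/L with L = lambda_max(A^T A) = 4 guarantees descent here.
x_star = forward_backward(grad_f, prox_g, np.zeros(2), step=0.2)
```

For this diagonal example the limit can be checked by hand from the componentwise optimality conditions: x* = (0.9, 0.975).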
Keywords: Nonconvex and nonsmooth optimization ; Kurdyka-Łojasiewicz inequality ; Descent methods ; Convergence rates ; Variable metric ; Gauss-Seidel method ; Newton-like method
Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)
Category 2: Other Topics (Other)
Citation: Submitted to the Journal of Optimization Theory and Applications (JOTA).
Entry Submitted: 06/06/2014