Composite Self-concordant Minimization
Quoc Tran Dinh (quoc.trandinh@epfl.ch)
Abstract: We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function endowed with a computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new set of analytic step-size selection and correction procedures based on the structure of the problem. We describe concrete algorithmic instances of our framework for several interesting large-scale applications and demonstrate them numerically on both synthetic and real data.
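To convey the flavor of the analytic step-size corrections the abstract refers to, here is a minimal sketch (not the report's actual algorithm): a proximal-gradient step in a diagonal Hessian metric, damped by the factor 1/(1 + λ_k) that self-concordance theory suggests, applied to the self-concordant function f(x) = -Σ log(x_i) plus an ℓ1 term. The helper names (`soft`, `prox_grad_step`) and the test problem are illustrative assumptions, not from the paper.

```python
# Illustrative sketch, NOT the paper's exact method: a variable-metric
# proximal-gradient step with the analytic damping alpha = 1/(1 + lambda_k)
# motivated by self-concordance. Smooth part f(x) = -sum(log(x_i)) (self-
# concordant on x > 0); non-smooth part g(x) = lam * ||x||_1.
import math

def soft(u, t):
    """Soft-thresholding: the proximal operator of t*|.|."""
    return math.copysign(max(abs(u) - t, 0.0), u)

def prox_grad_step(x, lam):
    grad = [-1.0 / xi for xi in x]          # gradient of f
    hess = [1.0 / (xi * xi) for xi in x]    # diagonal Hessian of f
    # Proximal step in the metric diag(hess); closed form per coordinate.
    z = [soft(xi - gi / hi, lam / hi) for xi, gi, hi in zip(x, grad, hess)]
    d = [zi - xi for zi, xi in zip(z, x)]
    # Local (Hessian) norm of the step; the damping alpha = 1/(1 + lambda_k)
    # keeps the iterate inside the domain x > 0 without a line search.
    lam_k = math.sqrt(sum(hi * di * di for hi, di in zip(hess, d)))
    alpha = 1.0 / (1.0 + lam_k)
    return [xi + alpha * di for xi, di in zip(x, d)]

x = [2.0, 0.5, 1.0]
for _ in range(50):
    x = prox_grad_step(x, lam=0.5)
# Each coordinate approaches the stationary point of -log(t) + 0.5*t, i.e. t = 2.
```

Because alpha < 1 whenever the step is nonzero, the iterates stay strictly positive, which is exactly the domain-preservation role that a Lipschitz-gradient-based step size cannot play here.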
Keywords: Proximal-gradient/Newton method, composite minimization, self-concordance, sparse convex optimization, graph learning.
Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Category 2: Applications -- Science and Engineering (Statistics)
Citation: Technical Report 2013d, Laboratory for Information and Inference Systems, EPFL, Lausanne, Switzerland
Entry Submitted: 08/15/2013