Optimization Online


Composite Self-concordant Minimization

Quoc Tran Dinh (quoc.trandinh***at***epfl.ch)
Anastasios Kyrillidis (anastasios.kyrillidis***at***epfl.ch)
Volkan Cevher (volkan.cevher***at***epfl.ch)

Abstract: We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function endowed with a computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new set of analytic step-size selection and correction procedures based on the structure of the problem. We describe concrete algorithmic instances of our framework for several interesting large-scale applications and demonstrate them numerically on both synthetic and real data.
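The abstract describes a proximal scheme whose step size is chosen analytically from the self-concordant structure rather than from a Lipschitz constant of the gradient. A minimal sketch of one such instance, assuming a proximal-Newton variant: the smooth part is a log-barrier f(x) = -Σ_j log(b_j - a_jᵀx) (standard self-concordant), the non-smooth part is ρ‖x‖₁, the scaled-prox subproblem is solved by an inner ISTA loop, and the damped step 1/(1+λ_k), with λ_k the proximal-Newton decrement, keeps iterates inside dom f by the Dikin-ellipsoid property. The function `prox_newton_sc` and all parameter names are illustrative, not the paper's exact procedure.

```python
import numpy as np

def soft(v, t):
    # Elementwise soft-thresholding: proximal operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_sc(A, b, rho, x0, max_iter=50, inner_iter=200, tol=1e-8):
    """Illustrative damped proximal-Newton method for
        min_x  f(x) + rho*||x||_1,   f(x) = -sum_j log(b_j - a_j^T x),
    where f is standard self-concordant. The analytic step 1/(1+lam_k)
    keeps iterates strictly feasible without a Lipschitz-gradient bound."""
    x = x0.copy()
    for _ in range(max_iter):
        s = b - A @ x                            # slacks, must stay > 0
        g = A.T @ (1.0 / s)                      # gradient of f
        H = A.T @ ((1.0 / s**2)[:, None] * A)    # Hessian of f
        L = np.linalg.eigvalsh(H).max()          # smoothness of the quadratic model
        # Inner ISTA on the local model:
        #   min_z  g^T (z - x) + 0.5 (z - x)^T H (z - x) + rho*||z||_1
        z = x.copy()
        for _ in range(inner_iter):
            z = soft(z - (g + H @ (z - x)) / L, rho / L)
        d = z - x                                # proximal-Newton direction
        lam = np.sqrt(d @ H @ d)                 # proximal-Newton decrement
        if lam < tol:
            break
        x = x + d / (1.0 + lam)                  # damped step: ||step||_x < 1,
                                                 # so x stays in dom f
    return x
```

A small usage example: `x = prox_newton_sc(A, b, rho=0.1, x0=np.zeros(A.shape[1]))` with any `(A, b)` for which `b - A @ x0 > 0` holds componentwise. The inner ISTA loop is the simplest choice for the subproblem; the paper's framework allows other variable-metric instances.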

Keywords: Proximal-gradient/Newton method, composite minimization, self-concordance, sparse convex optimization, graph learning.

Category 1: Convex and Nonsmooth Optimization

Category 2: Convex and Nonsmooth Optimization (Convex Optimization)

Category 3: Applications -- Science and Engineering (Statistics)

Citation: Technical Report 2013d, Laboratory for Information and Inference Systems, EPFL, Lausanne, Switzerland

Download: [PDF]

Entry Submitted: 08/15/2013
Entry Accepted: 08/15/2013
Entry Last Modified: 08/15/2013
