Optimization Online


A Reduced-Space Algorithm for Minimizing $\ell_1$-Regularized Convex Functions

Tianyi Chen(tchen59***at***jhu.edu)
Frank E. Curtis(frank.e.curtis***at***gmail.com)
Daniel P. Robinson(daniel.p.robinson***at***gmail.com)

Abstract: We present a new method for minimizing the sum of a differentiable convex function and an $\ell_1$-norm regularizer. The main features of the new method include: $(i)$ an evolving set of indices corresponding to variables that are predicted to be nonzero at a solution (i.e., the support); $(ii)$ a reduced-space subproblem defined in terms of the predicted support; $(iii)$ conditions that determine how accurately each subproblem must be solved, which allow for Newton, Newton-CG, and coordinate-descent techniques to be employed; $(iv)$ a computationally practical condition that determines when the predicted support should be updated; and $(v)$ a reduced proximal gradient step that ensures sufficient decrease in the objective function when it is decided that variables should be added to the predicted support. We prove a convergence guarantee for our method and demonstrate its efficiency on a large set of model prediction problems.
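To make the ingredients concrete, here is a minimal, generic sketch of a reduced-space scheme of this flavor for $f(x) + \lambda\|x\|_1$ with $f(x) = \tfrac{1}{2}\|Ax-b\|^2$. This is not the authors' algorithm; the alternation, step size, inner iteration count, and all names are illustrative assumptions, meant only to show how a predicted support, an inexact reduced subproblem, and a full proximal gradient step can interact.

```python
# Illustrative sketch only (NOT the paper's method): alternate a full
# proximal-gradient step, which can add or remove variables from the
# predicted support, with a few inexact minimization steps restricted
# to that support.  Problem: min_x 0.5*||Ax - b||^2 + lam*||x||_1.

def matvec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

def grad_f(A, b, x):
    # gradient of 0.5*||Ax - b||^2 is A^T (Ax - b)
    r = [ri - bi for ri, bi in zip(matvec(A, x), b)]
    return [sum(A[i][j] * r[i] for i in range(len(b))) for j in range(len(x))]

def soft_threshold(v, t):
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def reduced_space_prox(A, b, lam, step=0.1, outer=50, inner=5):
    n = len(A[0])
    x = [0.0] * n
    for _ in range(outer):
        # full (reduced-style) proximal-gradient step: may change the support
        g = grad_f(A, b, x)
        x = [soft_threshold(xj - step * gj, step * lam)
             for xj, gj in zip(x, g)]
        support = [j for j in range(n) if x[j] != 0.0]  # predicted support
        # inexact reduced-space subproblem: a few gradient steps restricted
        # to the predicted support, holding the sign pattern fixed
        for _ in range(inner):
            g = grad_f(A, b, x)
            for j in support:
                if x[j] == 0.0:
                    continue
                sgn = 1.0 if x[j] > 0 else -1.0
                x[j] -= step * (g[j] + lam * sgn)
                if x[j] * sgn < 0:  # crossed zero: remove from support
                    x[j] = 0.0
    return x
```

In this toy form the "reduced subproblem" is just restricted gradient descent; the abstract's point is that it could instead be solved with Newton, Newton-CG, or coordinate descent, to an accuracy governed by the method's inexactness conditions.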

Keywords: nonlinear optimization, convex optimization, sparse optimization, active-set methods, reduced-space methods, subspace minimization, model prediction

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Citation: Johns Hopkins University, Applied Mathematics and Statistics, Baltimore, MD, Technical Report OPT-2016/2.

Download: [PDF]

Entry Submitted: 02/19/2016
Entry Accepted: 02/19/2016
Entry Last Modified: 02/19/2016
