A Reduced-Space Algorithm for Minimizing $\ell_1$-Regularized Convex Functions
Abstract: We present a new method for minimizing the sum of a differentiable convex function and an $\ell_1$-norm regularizer. The main features of the new method include: $(i)$ an evolving set of indices corresponding to variables that are predicted to be nonzero at a solution (i.e., the support); $(ii)$ a reduced-space subproblem defined in terms of the predicted support; $(iii)$ conditions that determine how accurately each subproblem must be solved, which allow for Newton, Newton-CG, and coordinate-descent techniques to be employed; $(iv)$ a computationally practical condition that determines when the predicted support should be updated; and $(v)$ a reduced proximal gradient step that ensures sufficient decrease in the objective function when it is decided that variables should be added to the predicted support. We prove a convergence guarantee for our method and demonstrate its efficiency on a large set of model prediction problems.
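To make the ingredients concrete, the following is a minimal illustrative sketch in Python of the general idea behind such methods: a proximal gradient (soft-thresholding) iteration for a least-squares loss with an $\ell_1$ regularizer, tracking the set of predicted nonzero variables. The function name, the least-squares objective, and all parameter choices are assumptions for illustration; this is not the paper's algorithm, which additionally solves reduced-space subproblems to adaptive accuracy and uses tests to decide when the predicted support changes.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_with_support(A, b, lam, iters=500):
    """Illustrative only: plain proximal gradient (ISTA) applied to
    0.5 * ||A x - b||^2 + lam * ||x||_1, returning the final iterate
    and its predicted support. Omits the paper's reduced subproblem
    solves, inexactness conditions, and support-update criteria."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    support = np.flatnonzero(x)            # indices predicted nonzero at a solution
    return x, support
```

In a reduced-space scheme of the kind the abstract describes, the `support` set would define a smaller subproblem in only those variables, which could then be solved (inexactly) with Newton, Newton-CG, or coordinate-descent techniques.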
Keywords: nonlinear optimization, convex optimization, sparse optimization, active-set methods, reduced-space methods, subspace minimization, model prediction
Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Citation: Johns Hopkins University, Applied Mathematics and Statistics, Baltimore, MD, Technical Report OPT-2016/2.
Entry Submitted: 02/19/2016