Optimization Online


Scalable Algorithms for the Sparse Ridge Regression

Weijun Xie (wxie***at***vt.edu)
Xinwei Deng (xdeng***at***vt.edu)

Abstract: Sparse regression and variable selection for large-scale data have developed rapidly over the past decades. This work focuses on sparse ridge regression, which enforces sparsity via the L0 norm. We first prove that the continuous relaxation of the mixed integer second order conic (MISOC) reformulation, obtained using the perspective formulation, is equivalent to that of the convex integer formulation proposed in recent work. We also show that the convex hull of the constraint system of the MISOC formulation equals its continuous relaxation. Based upon these two formulations (i.e., the MISOC formulation and the convex integer formulation), we analyze two scalable algorithms for sparse ridge regression, a greedy algorithm and a randomized algorithm, with desirable theoretical properties. The proposed algorithms are proven to yield near-optimal solutions under mild conditions. We further propose integrating the greedy algorithm with the randomized one, so that the greedy search is restricted to the nonzero support identified by the continuous relaxation of the MISOC formulation. The merits of the proposed methods are illustrated through numerical examples in comparison with several existing methods.
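To make the setting concrete, below is a minimal sketch of a greedy forward-selection heuristic for the L0-constrained ridge problem min ||y - X b||^2 + lam ||b||^2 subject to ||b||_0 <= k. This is an illustrative assumption about the general greedy strategy, not the authors' exact algorithm from the paper; the function name `greedy_sparse_ridge` and its signature are hypothetical.

```python
import numpy as np

def greedy_sparse_ridge(X, y, k, lam=1.0):
    """Greedy forward selection for sparse ridge regression (illustrative sketch).

    At each step, add the feature whose inclusion most decreases the
    ridge objective ||y - X_S b_S||^2 + lam * ||b_S||^2, until the
    support S reaches size k.
    """
    n, p = X.shape
    support, remaining = [], list(range(p))

    def ridge_fit(cols):
        # Closed-form ridge solution restricted to the candidate support.
        Xs = X[:, cols]
        b = np.linalg.solve(Xs.T @ Xs + lam * np.eye(len(cols)), Xs.T @ y)
        resid = y - Xs @ b
        return b, resid @ resid + lam * (b @ b)

    for _ in range(k):
        best_j, best_obj = None, np.inf
        for j in remaining:
            _, obj = ridge_fit(support + [j])
            if obj < best_obj:
                best_j, best_obj = j, obj
        support.append(best_j)
        remaining.remove(best_j)

    # Refit on the final support and embed into a length-p vector.
    b_s, _ = ridge_fit(support)
    beta = np.zeros(p)
    beta[support] = b_s
    return support, beta
```

On well-conditioned data with a strong sparse signal, this heuristic typically recovers the true support; the paper's contribution is proving near-optimality guarantees for such greedy (and randomized) schemes under mild conditions.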

Keywords: Ridge Regression, Chance Constraint, Mixed Integer, Conic Program, Approximation Algorithm

Category 1: Applications -- Science and Engineering (Data-Mining )

Category 2: Integer Programming

Category 3: Combinatorial Optimization (Approximation Algorithms )

Citation: submitted

Download: [PDF]

Entry Submitted: 06/08/2018
Entry Accepted: 06/09/2018
Entry Last Modified: 06/28/2020
