Lagrangian relaxation for SVM feature selection

We discuss a Lagrangian-relaxation-based heuristic for feature selection within a standard L1-norm Support Vector Machine (SVM) framework for binary classification. The feature selection model we adopt is a Mixed Binary Linear Programming problem, which makes it well suited to a Lagrangian relaxation approach. Exploiting a property of the optimal multiplier setting, we apply a well-established nonsmooth optimization ascent algorithm to solve the resulting Lagrangian dual. At every ascent step, the proposed approach yields both a lower bound on the optimal objective value and a feasible solution, at low computational cost. We present the results of our numerical experiments on benchmark datasets characterized by a large number of features and a relatively small number of samples.
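The abstract does not spell out the model, so for orientation here is one common way to cast feature selection in an L1-norm SVM as a Mixed Binary Linear Program: binary indicators z_j switch features on or off through big-M linking constraints, and a budget B caps the number of active features. This formulation is an illustrative assumption of that standard pattern, not necessarily the authors' exact model; in a fully linear version each |w_j| would be split into w_j^+ + w_j^- as usual.

```latex
\begin{align*}
\min_{w,\,b,\,\xi,\,z}\quad & \sum_{j=1}^{p} |w_j| \;+\; C \sum_{i=1}^{n} \xi_i \\
\text{s.t.}\quad & y_i \bigl( w^\top x_i + b \bigr) \;\ge\; 1 - \xi_i, \qquad i = 1,\dots,n, \\
 & |w_j| \;\le\; M z_j, \qquad j = 1,\dots,p, \\
 & \sum_{j=1}^{p} z_j \;\le\; B, \qquad \xi_i \ge 0, \qquad z_j \in \{0,1\}.
\end{align*}
```

To make the lower-bound/feasible-solution pair described in the abstract concrete, the sketch below dualizes the margin constraints of the model above with multipliers u_i in [0, C]; the relaxed subproblem then separates over features and is solvable by inspection, so each iteration produces a dual value (lower bound) and a feasible primal point (upper bound) cheaply. It is a minimal sketch under stated assumptions: the bias term is dropped for simplicity, a plain projected subgradient step stands in for the more elaborate nonsmooth ascent algorithm the abstract refers to, and all names (lagrangian_oracle, dual_ascent) and parameter choices are illustrative rather than taken from the paper.

```python
import numpy as np

def lagrangian_oracle(u, X, y, M, B):
    # Relaxed subproblem for fixed multipliers u: with the margin constraints
    # dualized, the minimization separates over features and is solved by
    # inspection (bias term omitted for simplicity).
    c = X.T @ (u * y)                               # c_j = sum_i u_i * y_i * x_ij
    gain = np.minimum(0.0, M * (1.0 - np.abs(c)))   # best per-feature contribution if selected
    order = np.argsort(gain)[:B]                    # candidates: B most negative contributions
    selected = order[gain[order] < 0]               # keep only features that actually help
    w = np.zeros(X.shape[1])
    w[selected] = M * np.sign(c[selected])
    dual_value = u.sum() + gain[selected].sum()     # L(u): valid lower bound for u in [0, C]^n
    subgrad = 1.0 - y * (X @ w)                     # subgradient of L at u
    return dual_value, w, subgrad

def primal_value(w, X, y, C):
    # Feasible objective built from the relaxed minimizer: slacks are set to
    # the smallest values satisfying the margin constraints.
    xi = np.maximum(0.0, 1.0 - y * (X @ w))
    return np.abs(w).sum() + C * xi.sum()

def dual_ascent(X, y, C=1.0, M=5.0, B=5, iters=200):
    # Projected subgradient ascent on the Lagrangian dual: every iteration
    # yields a lower bound (dual value) and a feasible upper bound.
    n = X.shape[0]
    u = np.full(n, 0.5 * C)                         # start inside the box [0, C]^n
    best_lb, best_ub, best_w = -np.inf, np.inf, None
    for k in range(1, iters + 1):
        lb, w, g = lagrangian_oracle(u, X, y, M, B)
        ub = primal_value(w, X, y, C)
        best_lb = max(best_lb, lb)
        if ub < best_ub:
            best_ub, best_w = ub, w
        u = np.clip(u + (1.0 / k) * g, 0.0, C)      # diminishing step, projection onto [0, C]^n
    return best_lb, best_ub, best_w

if __name__ == "__main__":
    # Toy instance in the regime the abstract mentions: many features, few samples.
    rng = np.random.default_rng(0)
    n, p = 60, 200
    X = rng.standard_normal((n, p))
    w_true = np.zeros(p)
    w_true[:3] = [2.0, -1.5, 1.0]                   # only three informative features
    y = np.where(X @ w_true >= 0, 1.0, -1.0)
    lb, ub, w = dual_ascent(X, y, C=1.0, M=5.0, B=5)
    print(f"lower bound {lb:.3f}, upper bound {ub:.3f}, features {np.flatnonzero(w)}")
```

By weak duality every dual value is a valid lower bound on the MBLP optimum, while the recovered w (which uses at most B features by construction) gives a feasible upper bound, so the gap between the two tracks the quality of the heuristic as the ascent proceeds.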

Article · November 2015
