Optimization Online


Lagrangian relaxation for SVM feature selection

M Gaudioso(manlio.gaudioso***at***unical.it)
E Gorgone(egorgone***at***ulb.ac.be)
M Labbé(mlabbe***at***ulb.ac.be)
A.M. Rodriguez-Chia(antonio.rodriguezchia***at***uca.es)

Abstract: We discuss a Lagrangian-relaxation-based heuristic for feature selection in a standard L1-norm Support Vector Machine (SVM) framework for binary classification. The feature selection model we adopt is a Mixed Binary Linear Programming problem, which is well suited to a Lagrangian relaxation approach. Based on a property of the optimal multiplier setting, we apply a well-established nonsmooth optimization ascent algorithm to solve the resulting Lagrangian dual. At every ascent step, the proposed approach yields both a lower bound on the optimal objective value and a feasible solution, at low computational cost. We present the results of our numerical experiments on several benchmark datasets characterized by a high number of features and a relatively small number of samples.
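The dual-ascent scheme outlined in the abstract can be illustrated on a much simpler problem. The following sketch is not the paper's SVM feature-selection model: it applies the same generic Lagrangian-relaxation pattern (relax the coupling constraints, take a subgradient step on the multipliers, and at each step collect a lower bound plus a cheaply repaired feasible solution) to a toy binary set-covering instance with made-up data.

```python
# Illustrative sketch only (not the paper's model): Lagrangian relaxation of a
# toy set-covering problem, mirroring the scheme where each dual ascent step
# yields both a lower bound and a low-cost feasible solution.

def lagrangian_ascent(costs, cover, n_iters=100):
    """min sum_j c_j x_j  s.t.  sum_{j: cover[i][j]} x_j >= 1 for all i, x binary.

    Relaxing the covering constraints with multipliers lam >= 0 gives
    L(lam) = sum_i lam_i + min_x sum_j (c_j - sum_{i: cover[i][j]} lam_i) x_j,
    which separates over j: set x_j = 1 iff its reduced cost is negative.
    """
    m, n = len(cover), len(costs)
    lam = [0.0] * m
    best_lb, best_ub, best_x = float("-inf"), float("inf"), None
    for t in range(1, n_iters + 1):
        # Reduced costs and the minimizer of the relaxed problem
        red = [costs[j] - sum(lam[i] for i in range(m) if cover[i][j])
               for j in range(n)]
        x = [1 if red[j] < 0 else 0 for j in range(n)]
        lb = sum(lam) + sum(r for r in red if r < 0)   # valid lower bound
        best_lb = max(best_lb, lb)
        # Cheap repair: greedily cover each violated row -> feasible solution
        xf = x[:]
        for i in range(m):
            if not any(cover[i][j] and xf[j] for j in range(n)):
                j_best = min((j for j in range(n) if cover[i][j]),
                             key=lambda j: costs[j])
                xf[j_best] = 1
        ub = sum(costs[j] for j in range(n) if xf[j])
        if ub < best_ub:
            best_ub, best_x = ub, xf
        # Subgradient of the dual: g_i = 1 - (coverage of row i under x)
        g = [1 - sum(cover[i][j] * x[j] for j in range(n)) for i in range(m)]
        step = 1.0 / t                                  # diminishing step size
        lam = [max(0.0, lam[i] + step * g[i]) for i in range(m)]
    return best_lb, best_ub, best_x

# Toy instance: 3 rows to cover, 4 candidate columns (invented data)
costs = [3.0, 2.0, 4.0, 1.0]
cover = [[1, 0, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 1, 1]]
lb, ub, x = lagrangian_ascent(costs, cover)
assert lb <= ub + 1e-9   # lower bound never exceeds a feasible cost
```

In the paper's setting the relaxed subproblem and the repair step are of course specific to the SVM feature-selection model, but the bookkeeping is the same: the best lower bound and the best feasible solution are tracked across ascent steps, and their gap certifies solution quality.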

Keywords: SVM classification, feature selection, Lagrangian relaxation, nonsmooth optimization

Category 1: Integer Programming ((Mixed) Integer Linear Programming)

Citation: November 2015

Download: [PDF]

Entry Submitted: 11/18/2015
Entry Accepted: 11/18/2015
Entry Last Modified: 11/18/2015
