  


Stochastic Discrete First-order Algorithm for Feature Subset Selection
Kota Kudo (s1920464@s.tsukuba.ac.jp)

Abstract: This paper addresses the problem of selecting a significant subset of candidate features to use for multiple linear regression. Bertsimas et al. (2016) recently proposed the discrete first-order (DFO) algorithm to efficiently find near-optimal solutions to this problem. However, this algorithm is unable to escape from locally optimal solutions. To resolve this, we propose a stochastic discrete first-order (SDFO) algorithm for feature subset selection. In this algorithm, random perturbations are added to a sequence of candidate solutions as a means to escape from locally optimal solutions, which broadens the range of discoverable solutions. Moreover, we derive the optimal step size in the gradient-descent direction to accelerate convergence of the algorithm. We also make effective use of the L2-regularization term to improve the predictive performance of the resulting subset regression model. The simulation results demonstrate that our algorithm substantially outperforms the original DFO algorithm. Our algorithm was also superior in predictive performance to lasso and forward stepwise selection.

Keywords: feature subset selection, optimization algorithm, linear regression, machine learning, statistics

Category 1: Applications - Science and Engineering (Statistics)

Category 2: Integer Programming

Download: [PDF]

Entry Submitted: 10/09/2019
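To illustrate the idea behind the abstract, the following is a minimal sketch of a stochastic discrete first-order iteration for best-subset regression: a gradient step on the least-squares loss, a random perturbation to help escape locally optimal supports, hard thresholding to the k largest coefficients, and a least-squares refit on the selected support. This is an assumption-laden illustration of the general DFO/SDFO scheme, not the paper's exact method; the function names, the perturbation scale `sigma`, and the refit step are all illustrative choices.

```python
import numpy as np

def hard_threshold(beta, k):
    # Keep the k largest-magnitude coefficients and zero out the rest.
    smallest = np.argsort(np.abs(beta))[:-k]
    out = beta.copy()
    out[smallest] = 0.0
    return out

def sdfo(X, y, k, n_iter=500, sigma=0.1, seed=0):
    # Illustrative stochastic discrete first-order loop (not the paper's exact algorithm).
    rng = np.random.default_rng(seed)
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the least-squares gradient
    beta = np.zeros(p)
    best_beta, best_loss = beta, np.inf
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        step = beta - grad / L
        # Random perturbation: the stochastic ingredient for escaping local optima.
        step += sigma * rng.standard_normal(p)
        beta = hard_threshold(step, k)
        # Least-squares refit on the selected support.
        support = np.flatnonzero(beta)
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        beta = np.zeros(p)
        beta[support] = coef
        loss = 0.5 * np.sum((X @ beta - y) ** 2)
        if loss < best_loss:
            best_loss, best_beta = loss, beta.copy()
    return best_beta, best_loss
```

Because the iterates are randomized, the sketch tracks the best solution seen over all iterations rather than returning the final iterate.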