DocumentCode :
3703578
Title :
Exploiting feature relationships towards stable feature selection
Author :
Iman Kamkar;Sunil Kumar Gupta;Dinh Phung;Svetha Venkatesh
Author_Institution :
Centre for Pattern Recognition and Data Analytics, Deakin University, Australia
fYear :
2015
Firstpage :
1
Lastpage :
10
Abstract :
Feature selection is an important step in building predictive models for most real-world problems. Lasso is one of the most popular feature selection methods, but it is unstable when selecting among correlated features. In this work, we propose a new method that aims to increase the stability of Lasso by encouraging similarity between features based on their relatedness, which is captured via a feature covariance matrix. Besides modeling positive feature correlations, our method can also identify negative correlations between features. We propose a convex formulation for our model along with an alternating optimization algorithm that learns both the feature weights and the relationships between features. Using both synthetic and real-world data, we show that the proposed method is more stable than Lasso and many state-of-the-art shrinkage and feature selection methods, while its predictive performance remains comparable to other methods.
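The abstract describes a covariance-regularized Lasso solved by alternating optimization. The sketch below is not the authors' code and assumes one plausible objective of this family, namely min over w and Omega of ||y - Xw||^2 + lam1*||w||_1 + lam2*w'Omega^{-1}w with Omega positive semidefinite and unit trace; the hyperparameters, step size, and epsilon jitter are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (assumed formulation, not the paper's exact objective):
# alternate between a proximal-gradient (ISTA) update of the feature weights w
# with the feature covariance Omega fixed, and a closed-form update of Omega.
import numpy as np
from scipy.linalg import sqrtm

def soft_threshold(v, t):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def covariance_lasso(X, y, lam1=0.1, lam2=0.1, n_outer=20, n_inner=200, eps=1e-6):
    n, p = X.shape
    w = np.zeros(p)
    omega = np.eye(p) / p                      # uninformative initial feature covariance
    for _ in range(n_outer):
        omega_inv = np.linalg.inv(omega + eps * np.eye(p))
        # Step size from a bound on the Lipschitz constant of the smooth part.
        L = 2 * np.linalg.norm(X, 2) ** 2 + 2 * lam2 * np.linalg.norm(omega_inv, 2)
        step = 1.0 / L
        for _ in range(n_inner):               # ISTA updates of w with Omega fixed
            grad = 2 * X.T @ (X @ w - y) + 2 * lam2 * (omega_inv @ w)
            w = soft_threshold(w - step * grad, step * lam1)
        # Closed-form Omega update: Omega = (w w^T)^{1/2} / tr((w w^T)^{1/2}), with jitter.
        M = sqrtm(np.outer(w, w) + eps * np.eye(p)).real
        omega = M / np.trace(M)
    return w, omega

# Tiny usage example on synthetic data with two highly correlated features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=100)
y = X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=100)
w_hat, omega_hat = covariance_lasso(X, y)
print(np.round(w_hat, 3))
```

Under this assumed objective, the Omega-update tends to tie correlated features together, which is one way the stability-over-plain-Lasso behavior described in the abstract could arise.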
Keywords :
"Stability criteria","Optimization","Covariance matrices","Correlation","Linear programming","Predictive models"
Publisher :
ieee
Conference_Titel :
Data Science and Advanced Analytics (DSAA), 2015 IEEE International Conference on
Print_ISBN :
978-1-4673-8272-4
Type :
conf
DOI :
10.1109/DSAA.2015.7344859
Filename :
7344859