DocumentCode :
1797659
Title :
The delta test: The 1-NN estimator as a feature selection criterion
Author :
Eirola, Emil ; Lendasse, Amaury ; Corona, Fabio ; Verleysen, Michel
Author_Institution :
Dept. of Inf. & Comput. Sci., Aalto Univ., Aalto, Finland
fYear :
2014
fDate :
6-11 July 2014
Firstpage :
4214
Lastpage :
4222
Abstract :
Feature selection is essential in many machine learning problems, but it is often not clear on what grounds variables should be included or excluded. This paper shows that the mean squared leave-one-out error of the first-nearest-neighbour estimator is effective as a cost function when selecting input variables for regression tasks. A theoretical analysis of the estimator's properties is presented to support its use for feature selection. An experimental comparison to alternative selection criteria (including mutual information, least angle regression, and the RReliefF algorithm) demonstrates reliable performance on several regression tasks.
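Illustration (not part of the original record): a minimal sketch of the criterion described in the abstract, assuming it is the mean squared leave-one-out error of the 1-NN estimator computed on a candidate subset of input variables. The function name delta_test and the toy data below are illustrative, not taken from the paper.

    import numpy as np

    def delta_test(X, y):
        """Mean squared leave-one-out error of the 1-NN estimator.

        Each sample's output is predicted as the output of its nearest
        neighbour (excluding itself) in the space of the selected inputs;
        the criterion is the average squared prediction error.
        """
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        # Pairwise squared Euclidean distances between input vectors.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
        np.fill_diagonal(d2, np.inf)   # leave-one-out: exclude the point itself
        nn = d2.argmin(axis=1)         # index of each sample's nearest neighbour
        return float(np.mean((y - y[nn]) ** 2))

    # Hypothetical usage: compare candidate feature subsets and prefer
    # the one with the lower criterion value.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # only feature 0 is relevant
    print(delta_test(X[:, [0]], y))     # relevant subset -> lower error
    print(delta_test(X[:, [3, 4]], y))  # irrelevant subset -> higher error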
Keywords :
learning (artificial intelligence); regression analysis; statistical testing; 1-NN estimator; RReliefF algorithm; delta test; feature selection criterion; first-nearest-neighbour estimator; input variables selection; least angle regression; machine learning problems; mean squared leave-one-out error; mutual information; regression task; Cost function; Data models; Input variables; Measurement; Mutual information; Noise; Reliability;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Neural Networks (IJCNN), 2014 International Joint Conference on
Conference_Location :
Beijing
Print_ISBN :
978-1-4799-6627-1
Type :
conf
DOI :
10.1109/IJCNN.2014.6889560
Filename :
6889560