DocumentCode :
1122955
Title :
Using mutual information for selecting features in supervised neural net learning
Author :
Battiti, Roberto
Author_Institution :
Dipartimento di Matematica, Trento Univ., Italy
Volume :
5
Issue :
4
fYear :
1994
fDate :
7/1/1994
Firstpage :
537
Lastpage :
550
Abstract :
This paper investigates the application of the mutual information criterion to evaluate a set of candidate features and to select an informative subset to be used as input data for a neural network classifier. Because the mutual information measures arbitrary dependencies between random variables, it is suitable for assessing the “information content” of features in complex classification tasks, where methods based on linear relations (like the correlation) are prone to mistakes. The fact that the mutual information is independent of the coordinates chosen permits a robust estimation. Nonetheless, the use of the mutual information for tasks characterized by high input dimensionality requires suitable approximations because of the prohibitive demands on computation and samples. An algorithm is proposed that is based on a “greedy” selection of the features and that takes into account both the mutual information with respect to the output class and the mutual information with respect to the already-selected features. Finally, the results of a series of experiments are discussed.
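Note: the following is a minimal illustrative sketch of a greedy mutual-information feature selection of the kind summarized in the abstract, not the paper's exact procedure. It assumes features are already discretized into integer bins, and it uses a redundancy weight (here called beta) to penalize mutual information with already-selected features; the function names and the empirical histogram-based estimator are assumptions for illustration.

import numpy as np

def mutual_information(x, y):
    # Empirical mutual information (in nats) between two discrete integer arrays.
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def greedy_mi_selection(X, y, k, beta=0.5):
    # Greedily pick k columns of X: each step maximizes the mutual information
    # with the class labels y minus beta times the summed mutual information
    # with the features selected so far (assumed trade-off form).
    remaining = list(range(X.shape[1]))
    selected = []
    relevance = {f: mutual_information(X[:, f], y) for f in remaining}
    while len(selected) < k and remaining:
        def score(f):
            redundancy = sum(mutual_information(X[:, f], X[:, s]) for s in selected)
            return relevance[f] - beta * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

Here X is assumed to be an integer-coded (discretized) feature matrix and y an integer class-label vector; the selected column indices could then be used to build the reduced input for a neural network classifier.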
Keywords :
feature extraction; learning (artificial intelligence); neural nets; coordinate independence; feature selection; greedy selection; high input dimensionality; information content; mutual information criterion; neural network classifier; robust estimation; supervised neural net learning; Data mining; Feature extraction; Intelligent networks; Mutual information; Neural networks; Principal component analysis; Random variables; Robustness; Supervised learning; Vectors;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
ieee
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.298224
Filename :
298224