DocumentCode :
856710
Title :
Fast generic selection of features for neural network classifiers
Author :
Brill, Frank Z. ; Brown, Donald E. ; Martin, Worthy N.
Author_Institution :
Inst. for Parallel Comput., Virginia Univ., Charlottesville, VA, USA
Volume :
3
Issue :
2
fYear :
1992
fDate :
3/1/1992 12:00:00 AM
Firstpage :
324
Lastpage :
328
Abstract :
The authors describe experiments using a genetic algorithm for feature selection in the context of neural network classifiers, specifically counterpropagation networks. They present the novel techniques used in the application of genetic algorithms. First, the genetic algorithm is configured to use an approximate evaluation in order to significantly reduce the computation required. In particular, although the desired classifiers are counterpropagation networks, a nearest-neighbor classifier is used to evaluate feature sets, and the features selected by this method are shown to be effective in the context of counterpropagation networks. Second, a method called training set sampling, in which only a portion of the training set is used in any given evaluation, is proposed. This method yields substantial computational savings, i.e., evaluations can be performed over an order of magnitude faster. It selects feature sets that are as good as, and occasionally better for counterpropagation than, those chosen by an evaluation that uses the entire training set.
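The following is a minimal illustrative sketch (not the authors' implementation) of the two ideas summarized in the abstract: a genetic algorithm that evolves binary feature masks, an approximate fitness based on a 1-nearest-neighbor classifier, and evaluation on a random sample of the training set rather than the whole set. All function names, parameter values, and the toy data are assumptions introduced for illustration.

```python
# Sketch: GA feature selection with an approximate 1-NN fitness evaluated on a
# random sample of the training set ("training set sampling"). Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def nn_accuracy(X_train, y_train, X_eval, y_eval, mask):
    """1-NN accuracy using only the features selected by the boolean `mask`."""
    if not mask.any():                      # an empty feature set scores zero
        return 0.0
    Xt, Xe = X_train[:, mask], X_eval[:, mask]
    # squared Euclidean distance from every eval point to every training point
    d = ((Xe[:, None, :] - Xt[None, :, :]) ** 2).sum(axis=2)
    pred = y_train[d.argmin(axis=1)]
    return float((pred == y_eval).mean())

def fitness(mask, X, y, sample_size=100):
    """Approximate evaluation: score the feature set on a random training-set
    sample instead of the full training set."""
    idx = rng.choice(len(X), size=min(sample_size, len(X)), replace=False)
    Xs, ys = X[idx], y[idx]
    half = len(idx) // 2                    # half as reference set, half as probes
    return nn_accuracy(Xs[:half], ys[:half], Xs[half:], ys[half:], mask)

def ga_select(X, y, pop_size=30, generations=40, p_mut=0.02):
    """Evolve binary feature masks; return the best mask found."""
    n_feat = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_feat)).astype(bool)
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        children = []
        for _ in range(pop_size):
            a, b = rng.integers(0, pop_size, 2)   # binary tournament selection
            p1 = pop[a] if scores[a] >= scores[b] else pop[b]
            a, b = rng.integers(0, pop_size, 2)
            p2 = pop[a] if scores[a] >= scores[b] else pop[b]
            cut = rng.integers(1, n_feat)         # single-point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            child ^= rng.random(n_feat) < p_mut   # bit-flip mutation
            children.append(child)
        pop = np.array(children)
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()]

# Toy usage: 200 points, 10 features of which only the first 3 are informative.
X = rng.normal(size=(200, 10))
y = (X[:, :3].sum(axis=1) > 0).astype(int)
best_mask = ga_select(X, y)
print("selected features:", np.flatnonzero(best_mask))
```

In the paper the selected feature sets are ultimately used to train counterpropagation networks; the nearest-neighbor score above stands in only as the cheap surrogate evaluation described in the abstract.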
Keywords :
computerised pattern recognition; genetic algorithms; learning systems; neural nets; counterpropagation networks; feature selection; genetic algorithm; nearest-neighbor classifier; neural network classifiers; training set sampling; Concurrent computing; Degradation; Genetic algorithms; Neural networks; Pattern recognition; Propulsion; Sampling methods; Testing;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.125874
Filename :
125874