Title :
Effective pruning of neural network classifier ensembles
Author :
Lazarevic, Aleksandar ; Obradovic, Zoran
Author_Institution :
Center for Inf. Sci. & Technol., Temple Univ., Philadelphia, PA, USA
Abstract :
Neural network ensemble techniques have been shown to be very accurate classification techniques. However, in some real-life applications the number of classifiers required to achieve reasonable accuracy is enormously large, and the resulting ensemble is therefore very space consuming. The paper proposes several methods for pruning neural network ensembles. The clustering-based approach applies k-means clustering to the entire set of classifiers in order to identify groups of similar classifiers, and then eliminates redundant classifiers within each cluster. Another proposed approach first builds a tree of classifiers depth-first according to their diversity, and then prunes that tree. Applying the proposed methods to several data sets shows that, by selecting an optimal subset of neural network classifiers, it is possible to obtain a significantly smaller ensemble while achieving the same or even slightly better generalizability than when using the entire ensemble.
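The clustering-based pruning idea described above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes each classifier is represented by its vector of predictions on a validation set, runs a simple k-means over those vectors (with a deterministic evenly-spaced initialization rather than a random one, for reproducibility), and keeps only the classifier closest to each cluster centroid as the cluster's representative. The function name `prune_ensemble` and all parameters are hypothetical.

```python
import numpy as np

def prune_ensemble(pred_matrix, n_clusters, n_iter=50):
    """Cluster classifiers by the similarity of their prediction
    vectors (rows of pred_matrix, one row per classifier) and keep
    one representative per cluster.  Illustrative sketch only.

    Returns the sorted indices of the retained classifiers."""
    X = np.asarray(pred_matrix, dtype=float)
    # Deterministic init: centroids at evenly spaced classifier rows
    # (a simplification; classic k-means uses random initialization).
    idx = np.linspace(0, len(X) - 1, n_clusters).astype(int)
    centroids = X[idx].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assign each classifier to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep an old centroid if its cluster empties.
        new = np.array([X[labels == k].mean(axis=0)
                        if np.any(labels == k) else centroids[k]
                        for k in range(n_clusters)])
        if np.allclose(new, centroids):
            break
        centroids = new
    # Within each cluster, retain the member nearest the centroid;
    # the rest are treated as redundant and eliminated.
    keep = []
    for k in range(n_clusters):
        members = np.flatnonzero(labels == k)
        if members.size:
            d = np.linalg.norm(X[members] - centroids[k], axis=1)
            keep.append(int(members[d.argmin()]))
    return sorted(keep)

# Usage: six classifiers whose validation predictions fall into two
# near-identical groups are pruned down to one representative each.
preds = [[1, 1, 1, 0, 0],
         [1, 1, 1, 0, 1],
         [1, 1, 1, 0, 0],
         [0, 0, 0, 1, 1],
         [0, 0, 0, 1, 1],
         [0, 0, 1, 1, 1]]
print(prune_ensemble(preds, n_clusters=2))  # → [0, 3]
```

The choice of representative (nearest to the centroid) is one plausible elimination rule; the paper may instead select by individual accuracy or another criterion within each cluster.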
Keywords :
generalisation (artificial intelligence); neural nets; pattern classification; pattern clustering; clustering based approach; generalizability; k-means clustering; neural network classifier ensembles; tree pruning; Accuracy; Bagging; Boosting; Classification tree analysis; Decision trees; Design methodology; Diversity reception; Information science; Machine learning; Neural networks;
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.939461