DocumentCode :
2335501
Title :
Creating ensembles of classifiers
Author :
Chawla, Nitesh ; Eschrich, Steven ; Hall, Lawrence O.
Author_Institution :
Dept. of Comput. Sci. & Eng., Univ. of South Florida, Tampa, FL, USA
fYear :
2001
fDate :
2001
Firstpage :
580
Lastpage :
581
Abstract :
Ensembles of classifiers offer promise for increasing overall classification accuracy. The availability of extremely large datasets has opened avenues for applying distributed and/or parallel learning to learn models of them efficiently. In this paper, distributed learning is done by training classifiers on disjoint subsets of the data. We examine a random partitioning method for creating disjoint subsets and propose a more intelligent way of partitioning into disjoint subsets using clustering. We observed that the intelligent partitioning method generally performs better than random partitioning on our datasets. In both methods, a significant gain in accuracy may be obtained by applying bagging to each of the disjoint subsets, creating multiple diverse classifiers. The significance of our finding is that a partitioning strategy for even small or moderate-sized datasets, when combined with bagging, can yield better performance than applying a single learner to the entire dataset.
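The scheme the abstract describes, partitioning the data into disjoint subsets, bagging classifiers within each subset, and combining all resulting classifiers, can be sketched as follows. This is an illustrative toy only: all function names are invented here, the base learner is a one-dimensional nearest-centroid rule standing in for the decision trees the authors use, and both the partition and the bootstrap are class-stratified for stability of this small example, whereas the paper studies plain random partitioning and clustering-based partitioning.

```python
# Hedged sketch of disjoint-subset partitioning + bagging, not the paper's code.
# Base learner: toy 1-D nearest-centroid classifier (the paper uses decision trees).
import random
from collections import Counter

def centroid_fit(xs, ys):
    """Toy base learner: store the per-class mean of the training points."""
    sums, counts = {}, {}
    for x, y in zip(xs, ys):
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def centroid_predict(model, x):
    """Predict the class whose centroid is nearest to x."""
    return min(model, key=lambda y: abs(x - model[y]))

def stratified_partition(labels, n_parts, rng):
    """Random disjoint subsets, stratified by class so every part sees every
    class (a simplification of the paper's purely random partitioning)."""
    parts = [[] for _ in range(n_parts)]
    by_class = {}
    for i, y in enumerate(labels):
        by_class.setdefault(y, []).append(i)
    for members in by_class.values():
        rng.shuffle(members)
        for j, i in enumerate(members):
            parts[j % n_parts].append(i)
    return parts

def partition_bag_ensemble(points, labels, n_parts=2, n_bags=3, seed=0):
    """Train n_bags bootstrap-resampled classifiers on each disjoint subset."""
    rng = random.Random(seed)
    models = []
    for part in stratified_partition(labels, n_parts, rng):
        by_class = {}
        for i in part:
            by_class.setdefault(labels[i], []).append(i)
        for _ in range(n_bags):
            # Class-stratified bootstrap: resample with replacement per class.
            bag = []
            for members in by_class.values():
                bag.extend(rng.choice(members) for _ in members)
            models.append(centroid_fit([points[i] for i in bag],
                                       [labels[i] for i in bag]))
    return models

def ensemble_predict(models, x):
    """Combine the classifiers by unweighted majority vote."""
    return Counter(centroid_predict(m, x) for m in models).most_common(1)[0][0]

# Two well-separated 1-D classes.
xs = [0.0, 0.1, 0.2, 0.3, 5.0, 5.1, 5.2, 5.3]
ys = ['a', 'a', 'a', 'a', 'b', 'b', 'b', 'b']
models = partition_bag_ensemble(xs, ys)   # 2 disjoint subsets x 3 bags = 6 classifiers
print(ensemble_predict(models, 0.15))     # -> 'a'
print(ensemble_predict(models, 5.15))     # -> 'b'
```

The diversity the abstract refers to comes from two sources: each classifier sees only one disjoint subset, and bagging within a subset further perturbs the training sample.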
Keywords :
data mining; learning (artificial intelligence); pattern clustering; classification accuracy; classifier ensemble creation; clustering; disjoint subsets; distributed learning; large datasets; parallel learning; random partitioning method; Application software; Bagging; Classification tree analysis; Clustering algorithms; Computer science; Decision trees; Distributed computing; Machine learning; Partitioning algorithms; Tires;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 2001 IEEE International Conference on Data Mining (ICDM 2001)
Conference_Location :
San Jose, CA
Print_ISBN :
0-7695-1119-8
Type :
conf
DOI :
10.1109/ICDM.2001.989568
Filename :
989568