DocumentCode :
3353310
Title :
MSMOTE: Improving Classification Performance When Training Data is Imbalanced
Author :
Hu, Shengguo ; Liang, Yanfeng ; Ma, Lintao ; He, Ying
Author_Institution :
Etsong Tobacco (Group) Ltd., Qingdao, China
Volume :
2
fYear :
2009
fDate :
28-30 Oct. 2009
Firstpage :
13
Lastpage :
17
Abstract :
Learning from data sets that contain very few instances of the minority class usually produces biased classifiers that have higher predictive accuracy over the majority class but poorer predictive accuracy over the minority class. SMOTE (synthetic minority over-sampling technique) is specifically designed for learning from imbalanced data sets. This paper presents a modified approach (MSMOTE) for learning from imbalanced data sets, based on the SMOTE algorithm. MSMOTE not only considers the distribution of minority class samples but also eliminates noise samples by adaptive mediation. The combination of MSMOTE and AdaBoost is applied to several highly and moderately imbalanced data sets. The experimental results show that MSMOTE outperforms SMOTEBoost in predicting the minority class and also improves F-values.
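To illustrate the SMOTE-style over-sampling that MSMOTE builds on, here is a minimal sketch of the core interpolation step: each synthetic point is placed on the line segment between a minority sample and one of its k nearest minority-class neighbors. This is only the baseline SMOTE mechanism described in the abstract; the function name and parameters are illustrative, and MSMOTE's grouping of samples and its adaptive noise elimination are not reproduced here since the abstract does not give their details.

```python
import random

def smote_style_samples(minority, k=5, n_new=10, seed=0):
    """Sketch of SMOTE-style over-sampling.

    minority : list of equal-length numeric tuples (minority-class points)
    k        : number of nearest minority neighbors to interpolate toward
    n_new    : number of synthetic samples to generate
    """
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest minority neighbors of x by squared Euclidean distance
        neighbors = sorted(
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)),
        )[:k]
        nb = rng.choice(neighbors)
        # place the new point at a random position between x and nb
        gap = rng.random()
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic
```

Because each synthetic point is a convex combination of two existing minority samples, every generated point lies inside the convex hull of the minority class, which is why unfiltered noisy minority samples can propagate noise and why MSMOTE adds a noise-elimination step.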
Keywords :
Ada; data mining; learning (artificial intelligence); pattern classification; sampling methods; AdaBoost; MSMOTE; biased classifiers; classification performance; learning; modified synthetic minority over-sampling technique; Accuracy; Boosting; Classification algorithms; Computer science; Data engineering; Intrusion detection; Machine learning algorithms; Oceans; Sampling methods; Training data; AdaBoost; SMOTE; SMOTEBoost; imbalanced data; over-sampling; samples groups;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Computer Science and Engineering, 2009. WCSE '09. Second International Workshop on
Conference_Location :
Qingdao
Print_ISBN :
978-0-7695-3881-5
Type :
conf
DOI :
10.1109/WCSE.2009.756
Filename :
5403368