Title of article :
Exploiting probabilistic topic models to improve text categorization under class imbalance
Author/Authors :
Enhong Chen, Yanggang Lin, Hui Xiong, Qiming Luo, Haiping Ma
Issue Information :
Bimonthly, consecutively numbered issues, 2011
Pages :
13
From page :
202
To page :
214
Abstract :
In text categorization, the numbers of documents in different categories often differ considerably, i.e., the class distribution is imbalanced. We propose a unique approach to improve text categorization under class imbalance by exploiting the semantic context in text documents. Specifically, we generate new samples of rare classes (categories with relatively small amounts of training data) by using global semantic information of classes represented by probabilistic topic models. In this way, the numbers of samples in different categories become more balanced, and the performance of text categorization can be improved using the transformed data set. The proposed method differs from traditional re-sampling methods, which try to balance the number of documents in different classes by re-sampling the documents in rare classes; such re-sampling can cause overfitting. Another benefit of our approach is the effective handling of noisy samples: since all the new samples are generated by topic models, the impact of noisy samples is dramatically reduced. Finally, as demonstrated by the experimental results, the proposed method achieves better performance under class imbalance and is more tolerant of noisy samples.
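The abstract describes oversampling rare classes by drawing synthetic documents from a topic model trained on the rare-class data, rather than duplicating existing documents. Below is a minimal sketch of that idea, not the authors' exact algorithm: it fits an LDA model to the rare-class documents with scikit-learn and samples new bag-of-words documents from the learned class-level topic mixture and topic-word distributions. Names such as `oversample_rare_class`, `n_synthetic`, and `doc_length` are illustrative assumptions.

```python
# Sketch: topic-model-based oversampling of a rare class (assumed parameters).
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def oversample_rare_class(rare_docs, n_synthetic=50, n_topics=10,
                          doc_length=100, seed=0):
    rng = np.random.default_rng(seed)

    # Document-term counts for the rare class only.
    vectorizer = CountVectorizer()
    counts = vectorizer.fit_transform(rare_docs)

    # Fit LDA and read off per-document topic mixtures.
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=seed)
    doc_topic = lda.fit_transform(counts)

    # Class-level topic proportions: average of the per-document mixtures.
    class_topic = doc_topic.mean(axis=0)
    class_topic /= class_topic.sum()

    # Normalize topic-word pseudo-counts into probability distributions.
    topic_word = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
    vocab = np.array(vectorizer.get_feature_names_out())

    synthetic_docs = []
    for _ in range(n_synthetic):
        # For each word position: pick a topic, then a word from that topic.
        topics = rng.choice(n_topics, size=doc_length, p=class_topic)
        words = [vocab[rng.choice(len(vocab), p=topic_word[t])] for t in topics]
        synthetic_docs.append(" ".join(words))
    return synthetic_docs
```

The synthetic documents can then be appended to the rare-class training data so that class sizes are closer to balanced before training the classifier; because they are drawn from the smoothed topic distributions rather than copied from individual (possibly noisy) documents, they avoid the exact-duplication overfitting risk the abstract attributes to plain re-sampling.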
Keywords :
Noisy data, Rare class analysis, Class imbalance, Text categorization, Probabilistic topic model
Journal title :
Information Processing and Management
Serial Year :
2011
Record number :
1229099