An Empirical Study of Bagging Predictors for Imbalanced Data with Different Levels of Class Distribution

Publisher:
Springer-Verlag Berlin / Heidelberg
Publication Type:
Conference Proceeding
Citation:
AI 2011: Advances in Artificial Intelligence, 2011, pp. 213 - 222
Issue Date:
2011-01
Research into learning from imbalanced data has increasingly captured the attention of both academia and industry, especially when the class distribution is highly skewed. This paper compares the Area Under the Receiver Operating Characteristic Curve (AUC) performance of bagging when learning from class distributions with different levels of imbalance. Despite the popularity of bagging in many real-world applications, some questions remain open in the existing research, e.g., which bagging predictors achieve the best performance in practice, and whether bagging remains superior to single learners as the level of class imbalance changes. We perform a comprehensive evaluation of the AUC performance of bagging predictors with 12 base learners at different levels of class imbalance, using a sampling technique on 14 imbalanced datasets. Our experimental results indicate that Decision Table (DTable) and RepTree are the base learners with the best bagging AUC performance. Most bagging predictors are statistically superior to their single-learner counterparts in AUC, except for Support Vector Machines (SVM) and Decision Stump (DStump).
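The comparison the abstract describes — a bagged ensemble versus its single base learner, scored by AUC on imbalanced data — can be sketched as follows. This is a minimal illustration, not the paper's setup: the study uses WEKA learners (e.g. DTable, RepTree) on 14 real datasets, whereas this sketch assumes scikit-learn with a decision tree as the base learner and a synthetic dataset with a 9:1 class ratio.

```python
# Hedged sketch: AUC of a bagged learner vs. its single base learner on
# imbalanced data. scikit-learn's DecisionTreeClassifier stands in for the
# WEKA learners used in the paper (an assumption for illustration only).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary data with a 9:1 class imbalance.
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
bagged = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                           random_state=0).fit(X_tr, y_tr)

# AUC is computed from predicted class-1 probabilities, not hard labels,
# which is what makes it informative under skewed class distributions.
auc_single = roc_auc_score(y_te, single.predict_proba(X_te)[:, 1])
auc_bagged = roc_auc_score(y_te, bagged.predict_proba(X_te)[:, 1])
print(f"single AUC: {auc_single:.3f}, bagged AUC: {auc_bagged:.3f}")
```

Varying the `weights` argument (e.g. 95:5 or 99:1) mimics the paper's idea of probing different imbalance levels, though the paper resamples real datasets rather than generating synthetic ones.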