Cost-sensitive classification with k-nearest neighbors

Publisher:
Springer
Publication Type:
Journal Article
Citation:
Lecture Notes in Computer Science, 2013, vol. 8041, pp. 112-131
Issue Date:
2013-01
Cost-sensitive learning algorithms are typically motivated by imbalanced data, such as clinical diagnosis data with skewed class distributions. While other popular classification methods have been adapted to imbalanced data, it remains an open problem to extend k-Nearest Neighbors (kNN) classification, one of the top-10 data mining algorithms, to make it cost-sensitive to imbalanced data. To fill this gap, in this paper we study two simple yet effective cost-sensitive kNN classification approaches, called Direct-CS-kNN and Distance-CS-kNN. In addition, we utilize several strategies (i.e., smoothing, minimum-cost k value selection, feature selection and ensemble selection) to further improve the performance of Direct-CS-kNN and Distance-CS-kNN. We conduct several groups of experiments on UCI datasets to evaluate their effectiveness, and demonstrate that the proposed cost-sensitive kNN classification algorithms can significantly reduce misclassification cost, often by a large margin, and consistently outperform CS-4.5 with and without additional enhancements.
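The core idea behind cost-sensitive kNN can be illustrated with a short sketch: instead of taking a majority vote among the k nearest neighbors, estimate each class's probability from neighbor frequencies and predict the class with the lowest expected misclassification cost. The code below is a minimal illustration of this general idea, not the authors' exact Direct-CS-kNN or Distance-CS-kNN algorithms; the function name `cs_knn_predict` and the cost-matrix convention are assumptions for the example.

```python
import numpy as np

def cs_knn_predict(X_train, y_train, x, k, cost):
    """Cost-sensitive kNN sketch (illustrative, not the paper's exact method).

    cost[i][j] = cost of predicting class i when the true class is j.
    """
    # Euclidean distance from the query point to every training point
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]                       # indices of the k nearest neighbors
    classes = np.unique(y_train)
    # Estimate P(true class = j | x) by neighbor class frequencies
    p = np.array([(y_train[nn] == c).mean() for c in classes])
    # Expected cost of predicting class i: sum_j cost[i, j] * p[j]
    expected_cost = cost @ p
    return classes[np.argmin(expected_cost)]

# Toy imbalanced example: the 3 nearest neighbors of x are two of class 0
# and one of class 1, so majority vote would predict 0.
X_train = np.array([[0.0], [0.1], [0.2], [5.0]])
y_train = np.array([0, 0, 1, 0])
x = np.array([0.1])

# If misclassifying the minority class (1) is 10x as costly,
# the expected-cost rule flips the prediction to class 1.
skewed_cost = np.array([[0.0, 10.0],
                        [1.0,  0.0]])
uniform_cost = np.array([[0.0, 1.0],
                         [1.0, 0.0]])
```

With `uniform_cost` (0/1 loss) this reduces to ordinary majority-vote kNN; with `skewed_cost` the high penalty on missing class 1 outweighs its lower estimated probability, which is the mechanism that makes kNN sensitive to imbalanced data.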