Handling over-fitting in test cost-sensitive decision tree learning by feature selection, smoothing and pruning
- Publication Type: Journal Article
- Citation: Journal of Systems and Software, 2010, 83 (7), pp. 1137-1147
- Issue Date: 2010-07-01
| Filename | Description | Size |
|---|---|---|
| 2009007504OK.pdf | | 256.92 kB |
This item is closed access and not available.
Cost-sensitive learning algorithms are typically designed to minimize total cost when multiple costs are taken into account. Like other learning algorithms, cost-sensitive learning algorithms face a significant challenge: over-fitting. Specifically, they can produce good results on training data yet fail to yield an optimal model when applied to unseen data in real-world applications; this is known as data over-fitting. This paper addresses data over-fitting by designing three simple and efficient strategies, feature selection, smoothing, and threshold pruning, for the TCSDT (test cost-sensitive decision tree) method. Feature selection is used to pre-process the data set before the TCSDT algorithm is applied. Smoothing and threshold pruning are applied within the TCSDT algorithm before the class probability estimate is calculated for each decision tree leaf. To evaluate these approaches, we conduct extensive experiments on selected UCI data sets across different cost ratios, and on a real-world data set, KDD-98, with real misclassification costs. The experimental results show that our algorithms outperform both the original TCSDT and other competing algorithms in reducing data over-fitting. © 2010 Elsevier Inc. All rights reserved.
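The abstract does not specify which smoothing method the paper applies to the leaf class probability estimates; a common choice in this setting is Laplace smoothing. Below is a minimal sketch, assuming Laplace smoothing and using a hypothetical `smoothed_leaf_probabilities` helper, of how smoothing pulls an over-confident leaf estimate away from the extremes:

```python
import numpy as np

def smoothed_leaf_probabilities(class_counts, alpha=1.0):
    """Laplace-smoothed class probability estimates for a decision tree leaf.

    class_counts: per-class counts of training examples falling in the leaf.
    alpha: smoothing constant (alpha=1.0 gives classic Laplace smoothing).
    """
    counts = np.asarray(class_counts, dtype=float)
    # Add alpha pseudo-counts per class so no class gets probability 0 or 1.
    return (counts + alpha) / (counts.sum() + alpha * counts.size)

# A leaf holding 8 positive and 0 negative training examples: the raw
# estimate (1.0, 0.0) over-fits the training data; smoothing pulls it
# toward (0.9, 0.1), a more cautious estimate for unseen data.
print(smoothed_leaf_probabilities([8, 0]))  # -> [0.9 0.1]
```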