A Strategy for Attributes Selection in Cost-Sensitive Decision Trees Induction
- Publisher: IEEE Computer Society
- Publication Type: Conference Proceeding
- Published in: CIT Workshops 2008, 2008, pp. 8 - 13
This item is closed access and not available.
Decision tree learning is one of the most widely used and practical methods for inductive inference. A fundamental issue in decision tree induction is the attribute selection measure applied at each non-terminal node of the tree. However, the existing literature has not adequately accounted for both classification ability and cost sensitivity. In this paper, we present a new attribute selection strategy for cost-sensitive decision tree induction that trades off the attributes' information content against cost-sensitive learning objectives, including misclassification costs and test costs measured in different units. Experimental results on UCI datasets show that our method outperforms existing methods, such as the information gain method and total cost methods, in terms of reduced misclassification costs across different missing rates and various cost settings.
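The abstract describes a splitting criterion that balances an attribute's information content against its test cost. The exact formula is not given in the abstract, so the sketch below is only illustrative: it computes standard information gain and then discounts it by test cost in the spirit of earlier cost-sensitive criteria such as EG2, with a hypothetical weight `w` controlling the trade-off.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Reduction in entropy obtained by splitting on the given attribute."""
    base = entropy(labels)
    partitions = {}
    for row, y in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(y)
    remainder = sum(len(part) / len(labels) * entropy(part)
                    for part in partitions.values())
    return base - remainder

def cost_sensitive_score(rows, labels, attr_index, test_cost, w=1.0):
    """Hypothetical trade-off score: discount information gain by test cost.

    This EG2-style form is an assumption for illustration; the paper's actual
    criterion (which also incorporates misclassification costs) differs.
    """
    gain = information_gain(rows, labels, attr_index)
    return (2 ** gain - 1) / ((test_cost + 1) ** w)
```

For example, an attribute that perfectly separates the classes but is expensive to test can score lower than a cheaper, slightly less informative one, which is the behavior a cost-sensitive inducer is after.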