A Strategy for Attributes Selection in Cost-Sensitive Decision Trees Induction

Publisher:
IEEE Computer Society
Publication Type:
Conference Proceeding
Citation:
CIT Workshops 2008, 2008, pp. 8 - 13
Issue Date:
2008-01
Decision tree learning is one of the most widely used and practical methods for inductive inference. A fundamental issue in decision tree induction is the attribute selection measure applied at each non-terminal node of the tree. However, the existing literature has not adequately accounted for both classification ability and cost sensitivity. In this paper, we present a new strategy for attribute selection that trades off attributes' information against cost-sensitive learning, including misclassification costs and test costs measured in different units, when selecting splitting attributes in cost-sensitive decision tree induction. Experimental results on UCI datasets show that our method outperforms existing methods, such as the information gain method and the total cost method, in terms of reducing misclassification costs under different missing rates and various cost settings.
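The paper's exact trade-off formula is not reproduced in this record. As a minimal sketch of the general idea, a cost-sensitive splitting criterion (in the spirit of earlier gain-per-cost heuristics such as IDX and EG2) scores each candidate attribute by its information gain discounted by its test cost; the `cost_sensitive_score` function and the penalty form `gain / (test_cost + 1)` below are illustrative assumptions, not the authors' method:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Classic ID3-style information gain of attribute attr_index."""
    base = entropy(labels)
    # Partition the labels by the attribute's value.
    parts = {}
    for row, label in zip(rows, labels):
        parts.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(p) / len(labels) * entropy(p)
                    for p in parts.values())
    return base - remainder

def cost_sensitive_score(rows, labels, attr_index, test_cost):
    """Illustrative cost-sensitive criterion: reward informative
    attributes, penalize expensive tests (assumed form, not the
    paper's formula)."""
    gain = information_gain(rows, labels, attr_index)
    return gain / (test_cost + 1.0)

if __name__ == "__main__":
    # Toy data: attribute 0 separates the classes perfectly,
    # attribute 1 carries no information.
    rows = [("sunny", "high"), ("sunny", "low"),
            ("rain", "high"), ("rain", "low")]
    labels = ["no", "no", "yes", "yes"]
    for i, cost in enumerate([5.0, 1.0]):
        print(i, cost_sensitive_score(rows, labels, i, cost))
```

Under this heuristic, a cheap but informative attribute is preferred over an equally informative but expensive one, which is the trade-off the abstract describes.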