An Empirical Study of Fuzzy Decision Tree for Gradient Boosting Ensemble

Publisher:
SPRINGER INTERNATIONAL PUBLISHING AG
Publication Type:
Conference Proceeding
Citation:
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2022, 13151 LNAI, pp. 716-727
Issue Date:
2022-01-01
Gradient boosting has proven to be an effective ensemble learning paradigm for combining multiple weak learners into a strong one. However, its performance gains are still limited by decision errors caused by uncertainty. Fuzzy decision trees are designed to handle the uncertainty arising from the limitations and incompleteness of collected information. This paper investigates whether the robustness of gradient boosting can be improved by using fuzzy decision trees, even when the decision conditions and objectives are fuzzy. We first propose and implement a fuzzy decision tree (FDT) by referring to two widely cited fuzzy decision tree designs. We then propose and implement a fuzzy gradient boosting decision tree (FGBDT), which integrates a set of FDTs as weak learners. Both algorithms can be reduced to their non-fuzzy counterparts via parameter settings. To study whether fuzzification improves the proposed algorithms on classification tasks, we pair each algorithm with its non-fuzzy counterpart and run comparison experiments on UCI Repository datasets under identical settings. The experiments show that the fuzzy algorithms outperform their non-fuzzy counterparts on many classical classification tasks. The code is available at github.com/ZhaoqingLiu/FuzzyTrees.
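The idea of boosting fuzzy weak learners can be illustrated with a minimal sketch: gradient boosting under squared-error loss, where each weak learner is a "fuzzy stump" whose split is a soft sigmoid membership function rather than a hard threshold. This is only a toy illustration of the general technique, not the authors' FDT/FGBDT implementation (see their repository for that); the sigmoid membership, the stump structure, and all function names here are assumptions made for the sketch.

```python
import numpy as np

def fuzzy_stump_predict(X, feat, thresh, width, left_val, right_val):
    """Soft (fuzzy) split: each sample's membership in the 'right' branch
    is a sigmoid of its distance to the threshold, so samples near the
    boundary contribute to both leaves (hypothetical membership choice)."""
    m_right = 1.0 / (1.0 + np.exp(-(X[:, feat] - thresh) / width))
    return (1.0 - m_right) * left_val + m_right * right_val

def fit_fuzzy_stump(X, residual, width):
    """Greedy search over features and candidate thresholds for the stump
    that best fits the current residuals in the least-squares sense."""
    best = None
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            m_r = 1.0 / (1.0 + np.exp(-(X[:, feat] - thresh) / width))
            m_l = 1.0 - m_r
            # Membership-weighted means minimise the squared error per leaf.
            left_val = np.sum(m_l * residual) / (np.sum(m_l) + 1e-12)
            right_val = np.sum(m_r * residual) / (np.sum(m_r) + 1e-12)
            pred = m_l * left_val + m_r * right_val
            err = np.sum((residual - pred) ** 2)
            if best is None or err < best[0]:
                best = (err, feat, thresh, width, left_val, right_val)
    return best[1:]

def fit_fgbdt_sketch(X, y, n_rounds=30, lr=0.3, width=0.3):
    """Gradient boosting loop: each round fits a fuzzy stump to the
    negative gradient of the squared-error loss (the residuals)."""
    base = y.mean()
    pred = np.full(len(y), base)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred
        params = fit_fuzzy_stump(X, residual, width)
        pred += lr * fuzzy_stump_predict(X, *params)
        stumps.append(params)
    return base, stumps

def predict(X, base, stumps, lr=0.3):
    pred = np.full(len(X), base)
    for params in stumps:
        pred += lr * fuzzy_stump_predict(X, *params)
    return pred
```

Setting the membership `width` close to zero makes the sigmoid approach a hard step function, recovering an ordinary (non-fuzzy) boosted stump, which mirrors the paper's point that the fuzzy algorithms can be configured as non-fuzzy ones.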