An in-depth comparison of methods handling mixed-attribute data for general fuzzy min–max neural network
- Publisher: Elsevier
- Publication Type: Journal Article
- Citation: Neurocomputing, 2021, 464, pp. 175-202
- Issue Date: 2021-11-13
Closed Access
Filename | Description | Size
---|---|---
1-s2.0-S0925231221012807-main.pdf | Published version | 2.36 MB
This item is closed access and not available.
A general fuzzy min–max (GFMM) neural network is one of the efficient neuro-fuzzy systems for classification problems. However, a disadvantage of most current GFMM learning algorithms is that they can effectively handle only numerical-valued features. This paper therefore presents potential approaches to adapting GFMM learning algorithms to classification problems with mixed-type or purely categorical features, which are very common in practical applications and often carry useful information. We compare and assess three main methods of handling datasets with mixed features: the use of categorical encoding methods, the combination of the GFMM model with other classifiers, and learning algorithms designed for both types of features. The experimental results show that target and James–Stein encodings are appropriate categorical encoding methods for GFMM learning algorithms, while the combination of GFMM neural networks and decision trees is a flexible way to enhance the classification performance of GFMM models on datasets with mixed features. Learning algorithms with native mixed-type feature handling are promising for dealing with mixed-attribute data in a natural way, but they need further improvement to achieve better classification accuracy. Based on this analysis, we also identify the strengths and weaknesses of the different methods and propose potential research directions.
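To illustrate the encoding route the abstract mentions, below is a minimal sketch of smoothed target encoding, which replaces each categorical value with a (shrunken) mean of the class label for that category so that a numerical-only learner such as GFMM can consume it. This is a generic illustration under assumed names (`target_encode`, `smoothing`), not code from the paper; the smoothing constant and the toy data are placeholders.

```python
from collections import defaultdict

def target_encode(categories, targets, smoothing=10.0):
    """Replace each categorical value with a smoothed per-category target mean.

    Smoothing blends the per-category mean with the global mean, so rare
    categories are not encoded from only a handful of samples.
    """
    global_mean = sum(targets) / len(targets)
    sums = defaultdict(float)
    counts = defaultdict(int)
    for c, t in zip(categories, targets):
        sums[c] += t
        counts[c] += 1
    encoding = {}
    for c, n in counts.items():
        cat_mean = sums[c] / n
        # shrink toward the global mean; the category's own mean gains
        # weight as its sample count n grows
        encoding[c] = (n * cat_mean + smoothing * global_mean) / (n + smoothing)
    return [encoding[c] for c in categories]

# toy example: one categorical feature, binary class labels
cats = ["red", "red", "blue", "blue", "blue", "green"]
ys = [1, 0, 1, 1, 0, 1]
encoded = target_encode(cats, ys)
print(encoded)
```

In practice (e.g. on a held-out fold) the learned `encoding` map would be applied to unseen rows; James–Stein encoding follows the same shrink-toward-the-global-mean idea with a variance-based weight instead of a fixed smoothing constant.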