Evolutionary lazy learning for Naive Bayes classification

Publication Type:
Conference Proceeding
Proceedings of the International Joint Conference on Neural Networks, October 2016, pp. 3124–3129
© 2016 IEEE. Most improvements to Naive Bayes (NB) share a common yet important flaw: they split the modeling of the classifier into two separate stages, a preprocessing stage (e.g., feature selection and data expansion) and a stage that builds the NB classifier. Because the first stage does not take the NB objective function into account, classification performance cannot be guaranteed. Motivated by these observations, and aiming to improve the classification accuracy of NB, we present a new learning algorithm, Evolutionary Local Instance Weighted Naive Bayes (ELWNB), which extends NB for classification. ELWNB seamlessly combines local NB, instance-weighted dataset extension, and evolutionary algorithms. Experiments on 20 UCI benchmark datasets demonstrate that ELWNB significantly outperforms NB and several other improved NB algorithms.
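The abstract does not give the details of ELWNB's instance-weighting scheme, but the core building block it names, a Naive Bayes classifier trained on per-instance weights, can be sketched in a few lines. The sketch below is an illustrative Gaussian Naive Bayes with instance weights, not the authors' algorithm; all function names (`fit_weighted_gnb`, `predict`) and the variance-smoothing constant are assumptions for illustration.

```python
import math

def fit_weighted_gnb(X, y, w):
    """Fit a Gaussian Naive Bayes model where each training
    instance i contributes with weight w[i] to the class priors
    and to the per-feature means and variances (illustrative
    sketch, not the ELWNB algorithm itself)."""
    stats = {}
    total_w = sum(w)
    for c in set(y):
        idx = [i for i, yi in enumerate(y) if yi == c]
        cw = [w[i] for i in idx]          # weights of class-c instances
        sw = sum(cw)
        prior = sw / total_w              # weighted class prior
        means, variances = [], []
        for j in range(len(X[0])):
            vals = [X[i][j] for i in idx]
            mu = sum(wi * v for wi, v in zip(cw, vals)) / sw
            # small constant added to keep the variance positive
            var = sum(wi * (v - mu) ** 2 for wi, v in zip(cw, vals)) / sw + 1e-9
            means.append(mu)
            variances.append(var)
        stats[c] = (prior, means, variances)
    return stats

def predict(stats, x):
    """Return the class with the highest log-posterior for x."""
    best, best_lp = None, -math.inf
    for c, (prior, means, variances) in stats.items():
        lp = math.log(prior)
        for xj, mu, var in zip(x, means, variances):
            # log of the Gaussian density for feature j
            lp += -0.5 * math.log(2 * math.pi * var) - (xj - mu) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = c, lp
    return best
```

In a scheme of the kind the abstract describes, the weight vector `w` would be tuned by an evolutionary search against the classifier's own objective, rather than fixed ahead of time by a separate preprocessing step.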