Hybrid dynamic k-nearest-neighbour and distance and attribute weighted method for classification
- Publication Type:
- Journal Article
- Citation:
- International Journal of Computer Applications in Technology, 2012, 43 (4), pp. 378 - 384
- Issue Date:
- 2012-06-01
Closed Access
| Filename | Description | Size |
|---|---|---|
| ijcat.2012.047164.pdf | Published Version | 235.25 kB |
Copyright Clearance Process
This item is closed access and not available.
K-nearest-neighbour (KNN) is an important classification method that has been widely used in data mining. However, its classification accuracy can be affected by the class probability estimation, the neighbourhood size, and the choice of distance function. Many researchers have focused on improving the accuracy of KNN via distance weighting, attribute weighting, and dynamic neighbourhood selection. In this paper, we first review improved KNN algorithms in the three categories mentioned above. We then single out an improved algorithm, dynamic KNN with distance and attribute weighting (DKNDAW for short), and test it experimentally in the Weka system, comparing it with KNN, WAKNN, KNNDW, KNNDAW, and DKNN. The experimental results show that DKNDAW significantly outperforms the other algorithms in terms of classification accuracy. We also investigate how to learn a DKNDAW model with accurate ranking from data, or more precisely, which attribute-weighting method of DKNDAW produces the most accurate ranking. We explore three methods: the gain ratio method, the correlation-based feature selection method, and the decision tree-based method, and conclude that the gain ratio method is the most suitable for our improved KNN algorithm DKNDAW. Copyright © 2012 Inderscience Enterprises Ltd.
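The combination of distance weighting and attribute weighting described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact DKNDAW formulation (the full text is closed access): it assumes attribute weights (e.g. gain ratios) scale each feature's contribution to the Euclidean distance, and that each of the k nearest neighbours votes with weight 1/(1 + distance); the dynamic selection of the neighbourhood size k is omitted here.

```python
import math
from collections import defaultdict

def weighted_knn_predict(train_X, train_y, query, attr_weights, k=3):
    """Classify `query` with distance- and attribute-weighted KNN.

    Illustrative sketch only: attribute weights rescale each feature
    in the distance metric, and neighbours vote with weight
    1/(1 + distance) so that closer neighbours count more.
    """
    def dist(a, b):
        # Attribute-weighted Euclidean distance.
        return math.sqrt(sum(w * (x - y) ** 2
                             for w, x, y in zip(attr_weights, a, b)))

    # Take the k training points closest to the query.
    neighbours = sorted(zip(train_X, train_y),
                        key=lambda xy: dist(xy[0], query))[:k]

    # Distance-weighted majority vote among the neighbours.
    votes = defaultdict(float)
    for x, label in neighbours:
        votes[label] += 1.0 / (1.0 + dist(x, query))
    return max(votes, key=votes.get)

# Toy usage on two well-separated clusters.
train_X = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
train_y = ['a', 'a', 'a', 'b', 'b', 'b']
print(weighted_knn_predict(train_X, train_y, (0.5, 0.5), (1.0, 1.0)))  # 'a'
print(weighted_knn_predict(train_X, train_y, (5.5, 5.5), (1.0, 1.0)))  # 'b'
```

Setting all attribute weights to 1 recovers plain distance-weighted KNN; a learned weighting scheme such as gain ratio would instead emphasise the more informative attributes in the distance computation.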