Feature Subset Selection Using Differential Evolution

Published in: Advances in Neuro-Information Processing, Lecture Notes in Computer Science, 2009, First Edition, pp. 103–110
One of the fundamental motivations for feature selection is to overcome the curse of dimensionality. This chapter develops a novel feature selection algorithm that combines the Differential Evolution (DE) optimization technique with statistical feature distribution measures. The new algorithm, referred to as DEFS, adapts the DE floating-point optimizer to a combinatorial optimization problem, namely feature selection. The proposed DEFS substantially reduces computational cost while delivering strong performance. DEFS is tested as a search procedure on datasets of varying dimensionality. Practical results demonstrate the merit of the proposed DEFS in terms of solution optimality and memory requirements.
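To illustrate the general idea of driving a real-valued DE optimizer toward a combinatorial feature-selection problem, the sketch below evolves real vectors in [0,1]^d and decodes each component into a select/discard decision by thresholding at 0.5. This decoding, the toy fitness function, and all parameter values (`F`, `CR`, population size) are illustrative assumptions for exposition, not the actual DEFS mechanism described in the chapter.

```python
import random

def de_feature_select(fitness, n_features, pop_size=20, n_gen=50,
                      F=0.5, CR=0.9, seed=0):
    """Generic DE/rand/1/bin loop for subset selection (a sketch,
    not the DEFS algorithm itself): each individual is a real vector
    in [0,1]^n_features, and feature j is selected when component j
    exceeds 0.5."""
    rng = random.Random(seed)

    def decode(v):
        # Threshold decoding: real vector -> feature subset (assumed mapping)
        return [j for j, x in enumerate(v) if x > 0.5]

    pop = [[rng.random() for _ in range(n_features)] for _ in range(pop_size)]
    scores = [fitness(decode(v)) for v in pop]

    for _ in range(n_gen):
        for i in range(pop_size):
            # Pick three distinct individuals other than i
            a, b, c = rng.sample([k for k in range(pop_size) if k != i], 3)
            j_rand = rng.randrange(n_features)  # guarantee one mutated gene
            trial = [
                min(1.0, max(0.0, pop[a][j] + F * (pop[b][j] - pop[c][j])))
                if (rng.random() < CR or j == j_rand) else pop[i][j]
                for j in range(n_features)
            ]
            s = fitness(decode(trial))
            if s >= scores[i]:  # greedy selection, maximizing fitness
                pop[i], scores[i] = trial, s

    best = max(range(pop_size), key=lambda i: scores[i])
    return decode(pop[best]), scores[best]

# Toy fitness: reward a hidden "relevant" subset and penalize extras
# (a stand-in for a real classifier-accuracy or distribution measure).
RELEVANT = {1, 4, 7}

def toy_fitness(subset):
    s = set(subset)
    return len(s & RELEVANT) - 0.2 * len(s - RELEVANT)

subset, score = de_feature_select(toy_fitness, n_features=10)
```

The key point the abstract makes is that DE natively optimizes real vectors, so applying it to a discrete subset-selection problem requires some mapping between the continuous search space and feature subsets; the thresholding above is one simple such mapping.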