Feature subset selection using differential evolution

Publication Type: Chapter
Citation: 2009, LNCS 5506, pp. 103–110
Issue Date: 2009-09-21
One of the fundamental motivations for feature selection is to overcome the curse of dimensionality. This chapter develops a novel feature selection algorithm that combines the Differential Evolution (DE) optimization technique with statistical measures of feature distribution. The new algorithm, referred to as DEFS, adapts DE's floating-point optimizer to a combinatorial optimization problem, namely feature selection. The proposed DEFS greatly reduces computational cost while delivering strong performance. DEFS is tested as a search procedure on datasets of varying dimensionality. Practical results demonstrate the significance of the proposed DEFS in terms of solution optimality and memory requirements. © 2009 Springer Berlin Heidelberg.
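The abstract only sketches DEFS at a high level. The toy example below is a hedged illustration, not the chapter's exact method: it shows one common way to drive a floating-point DE optimizer on the combinatorial subset-selection problem, by rounding each real-valued gene in [0, 1] to a 0/1 feature-inclusion flag before scoring. The synthetic data, the correlation-based fitness, and all parameter values (`pop_size`, `F`, `CR`, the size penalty) are assumptions chosen for illustration.

```python
import random

random.seed(0)

# Synthetic data: only features 0 and 2 carry signal (an assumption for
# illustration; the chapter's DEFS scores subsets with statistical
# feature-distribution measures instead).
D, N = 6, 200
X = [[random.gauss(0, 1) for _ in range(D)] for _ in range(N)]
y = [row[0] + row[2] + random.gauss(0, 0.1) for row in X]

def corr(a, b):
    # Pearson correlation between two equal-length sequences.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a) ** 0.5
    vb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def decode(vec):
    # Round each float gene to a boolean inclusion flag.
    return [g > 0.5 for g in vec]

def fitness(vec):
    mask = decode(vec)
    if not any(mask):
        return -1.0  # empty subsets are worthless
    proj = [sum(x for x, m in zip(row, mask) if m) for row in X]
    # Reward correlation with the target, lightly penalize subset size.
    return abs(corr(proj, y)) - 0.01 * sum(mask)

def defs_sketch(pop_size=20, gens=60, F=0.5, CR=0.9):
    # Classic DE/rand/1/bin over float genes, clamped to [0, 1].
    pop = [[random.random() for _ in range(D)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(D)  # ensure at least one mutated gene
            trial = [
                min(1.0, max(0.0, a[k] + F * (b[k] - c[k])))
                if (random.random() < CR or k == j_rand) else pop[i][k]
                for k in range(D)
            ]
            if fitness(trial) >= fitness(pop[i]):
                pop[i] = trial  # greedy one-to-one replacement
    return max(pop, key=fitness)

best = defs_sketch()
selected = [i for i, on in enumerate(decode(best)) if on]
print(selected)
```

With the seeded run above, the penalized-correlation fitness strongly favors keeping only the two informative features, so the search should converge to a subset containing features 0 and 2.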