Unsupervised feature selection using nonnegative spectral analysis

Publication Type: Conference Proceeding
Citation: Proceedings of the National Conference on Artificial Intelligence, 2012, vol. 2, pp. 1026-1032
Issue Date: 2012-11-07
File: unsupervised.pdf (Published version, Adobe PDF, 846.61 kB)
Abstract:
In this paper, a new unsupervised learning algorithm, namely Nonnegative Discriminative Feature Selection (NDFS), is proposed. To exploit the discriminative information in unsupervised scenarios, we perform spectral clustering to learn the cluster labels of the input samples and carry out feature selection simultaneously. The joint learning of the cluster labels and the feature selection matrix enables NDFS to select the most discriminative features. To learn more accurate cluster labels, a nonnegative constraint is explicitly imposed on the class indicators. To reduce redundant or even noisy features, an ℓ2,1-norm minimization constraint is added to the objective function, which guarantees that the feature selection matrix is sparse in rows. Our algorithm exploits the discriminative information and feature correlation simultaneously to select a better feature subset. A simple yet efficient iterative algorithm is designed to optimize the proposed objective function. Experimental results on different real-world datasets demonstrate the encouraging performance of our algorithm over state-of-the-art methods. Copyright © 2012, Association for the Advancement of Artificial Intelligence. All rights reserved.
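
Formulation sketch (not part of the published record; the trade-off weights α and β and the exact matrix shapes are assumptions introduced here for illustration): the joint objective described in the abstract can be written, in one common form, as

\[
\min_{F,\, W} \; \operatorname{Tr}\!\left(F^{\top} L F\right) \;+\; \beta \left( \lVert X^{\top} W - F \rVert_F^2 \;+\; \alpha \lVert W \rVert_{2,1} \right)
\quad \text{s.t.} \quad F^{\top} F = I, \;\; F \ge 0,
\]

where X ∈ R^{d×n} is the data matrix, L is a graph Laplacian built over the n samples, F ∈ R^{n×c} is the nonnegative scaled cluster indicator matrix learned by spectral clustering, and W ∈ R^{d×c} is the feature selection matrix. The trace term with the orthogonality constraint is the spectral clustering objective, the nonnegativity constraint F ≥ 0 keeps F close to a genuine cluster indicator, and the ℓ2,1-norm penalty drives entire rows of W to zero, so features can be ranked by their row norms and only the top-ranked ones retained. Alternating updates of F and W (for example, a multiplicative update on F and a reweighted least-squares step on W) is one natural way to realize the iterative optimization mentioned in the abstract.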