Parametric Subspace Analysis for dimensionality reduction and classification

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
2009 IEEE Symposium on Computational Intelligence and Data Mining (CIDM '09) Proceedings, 2009, pp. 363 - 366
Issue Date:
2009-01
Principal Components Analysis (PCA) and Linear Discriminant Analysis (LDA) are two popular techniques for dimensionality reduction and classification. By extracting discriminant features, LDA is optimal when the class-conditional feature distributions are unimodal and separated by the scatter of their means. PCA, by contrast, extracts descriptive features, which allows it to outperform LDA in some classification tasks and makes it less sensitive to the choice of training set. The Parametric Subspace Analysis (PSA) proposed in this paper introduces a parameter that regulates the combination of PCA and LDA. By combining the descriptive features of PCA with the discriminant features of LDA, PSA achieves better performance on dimensionality reduction and classification tasks, as demonstrated by our experimental results.
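The abstract does not give the exact PSA formulation, but the idea of a single parameter blending the PCA and LDA criteria can be illustrated with a minimal sketch. The code below assumes one plausible interpolation: a generalized eigenproblem whose scatter matrices reduce to the PCA total scatter at alpha = 0 and to the LDA between/within scatter ratio at alpha = 1. The function name parametric_subspace, the parameter alpha, and the regularizer reg are illustrative choices, not the authors' definitions.

import numpy as np

def parametric_subspace(X, y, n_components, alpha=0.5, reg=1e-6):
    """Hypothetical PCA/LDA blend: alpha=0 recovers PCA directions
    (eigenvectors of the total scatter), alpha=1 recovers LDA directions
    (generalized eigenvectors of Sw^{-1} Sb). Not the paper's exact PSA."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n, d = X.shape
    mean = X.mean(axis=0)

    # Total scatter (PCA criterion) and between/within scatter (LDA criterion).
    St = (X - mean).T @ (X - mean)
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
        Sw += (Xc - mc).T @ (Xc - mc)

    # Blend the numerator and denominator matrices with the parameter alpha;
    # a small ridge term keeps the denominator invertible.
    A = (1 - alpha) * St + alpha * Sb
    B = (1 - alpha) * np.eye(d) + alpha * Sw + reg * np.eye(d)

    # Leading eigenvectors of B^{-1} A span the projected subspace.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(-eigvals.real)
    W = eigvecs[:, order[:n_components]].real
    return X @ W, W

In use, alpha would be tuned on validation data, trading off the descriptive (PCA-like) and discriminant (LDA-like) behavior of the resulting subspace.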