Parametric Subspace Analysis for dimensionality reduction and classification
- Publication Type: Conference Proceeding
- Published in: 2009 IEEE Symposium on Computational Intelligence and Data Mining (CIDM '09) Proceedings, 2009, pp. 363-366
Principal Components Analysis (PCA) and Linear Discriminant Analysis (LDA) are two popular techniques for dimensionality reduction and classification. By extracting discriminant features, LDA is optimal when the class-conditional feature distributions are unimodal and separated by the scatter of their means. PCA, on the other hand, extracts descriptive features, which allows it to outperform LDA in some classification tasks and makes it less sensitive to the choice of training set. The Parametric Subspace Analysis (PSA) proposed in this paper introduces a parameter that regulates the combination of PCA and LDA. By combining the descriptive features of PCA with the discriminant features of LDA, PSA achieves better performance on dimensionality reduction and classification tasks, as demonstrated by our experimental results.
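Since the paper itself is not openly available, the exact PSA formulation is unknown; the following is only a minimal sketch of *one* way a parameter could regulate a mix of the PCA and LDA objectives. It assumes a blending parameter `alpha` (hypothetical name) where `alpha = 0` reduces to plain PCA (eigenvectors of the total scatter) and `alpha = 1` approaches an LDA-style generalized eigenproblem on between- and within-class scatter. The function name and the specific blend of scatter matrices are illustrative assumptions, not the authors' method.

```python
import numpy as np

def parametric_subspace(X, y, alpha, n_components):
    """Hypothetical PCA/LDA blend controlled by alpha in [0, 1].

    alpha = 0 -> pure PCA (total scatter St against the identity);
    alpha = 1 -> LDA-like problem (between-class Sb against within-class Sw).
    This is an illustrative sketch, not the paper's actual PSA formulation.
    """
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    mean = X.mean(axis=0)
    St = (X - mean).T @ (X - mean)  # total scatter (PCA objective)

    Sb = np.zeros((d, d))           # between-class scatter
    Sw = np.zeros((d, d))           # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
        Sw += (Xc - mc).T @ (Xc - mc)

    # Blend numerator and denominator of the generalized eigenproblem.
    # The identity term keeps the denominator invertible and makes
    # alpha = 0 coincide with ordinary PCA.
    A = (1.0 - alpha) * St + alpha * Sb
    B = (1.0 - alpha) * np.eye(d) + alpha * Sw

    # Solve B^{-1} A w = lambda w; keep the top eigenvectors.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_components]].real  # (d, n_components)
```

A projection is then obtained as `X @ W`, and the classification stage (e.g. nearest-neighbor in the reduced space) proceeds as with PCA or LDA; the value of `alpha` would be tuned on validation data.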