General Averaged Divergence Analysis

Publisher: IEEE Computer Society
Publication Type: Conference Proceeding
Citation: Proceedings of the Seventh IEEE International Conference on Data Mining, 2007, pp. 302-311
Issue Date: 2007-01
Files in This Item:
2011001849OK.pdf (Adobe PDF, 984.38 kB)
Abstract:
Subspace selection is a powerful tool in data mining. An important subspace method is the Fisher-Rao linear discriminant analysis (LDA), which has been successfully applied in many fields such as biometrics, bioinformatics, and multimedia retrieval. However, LDA has a critical drawback: the projection to a subspace tends to merge classes that are close together in the original feature space. If the classes are sampled from Gaussian distributions that share a common covariance matrix, then LDA maximizes the arithmetic mean of the Kullback-Leibler (KL) divergences between the different class pairs. We generalize this point of view to obtain a framework for subspace selection by 1) generalizing the KL divergence to the Bregman divergence and 2) generalizing the arithmetic mean to a general mean. The framework is named general averaged divergence analysis (GADA). Under this framework, a geometric mean divergence analysis (GMDA) method based on the geometric mean is studied. Extensive experiments on synthetic data show that our method significantly outperforms LDA and several representative LDA extensions.
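
To make the averaging idea concrete, the sketch below is a minimal illustration, not the authors' implementation; the function names projected_kl and averaged_divergence are invented for this example. For Gaussian classes with a shared covariance, the KL divergence after a projection W has a closed form, and the subspace is scored by either the arithmetic mean of the pairwise divergences (the LDA-style criterion) or the geometric mean (the GMDA-style criterion):

import numpy as np

def projected_kl(mu_i, mu_j, Sigma, W):
    # For N(mu_i, Sigma) vs N(mu_j, Sigma) projected by W (d x k), the KL
    # divergence reduces to 0.5 * d^T (W^T Sigma W)^{-1} d with
    # d = W^T (mu_i - mu_j); the trace and log-det terms cancel because
    # both projected classes share the covariance W^T Sigma W.
    d = W.T @ (mu_i - mu_j)
    S = W.T @ Sigma @ W
    return 0.5 * d @ np.linalg.solve(S, d)

def averaged_divergence(means, Sigma, W, mean_type="arithmetic"):
    # Average the pairwise divergences over all class pairs.
    # "arithmetic" matches the LDA-style criterion described above;
    # "geometric" is the GMDA-style criterion.
    divs = np.array([projected_kl(mi, mj, Sigma, W)
                     for a, mi in enumerate(means)
                     for mj in means[a + 1:]])
    if mean_type == "arithmetic":
        return divs.mean()
    # Geometric mean via exp(mean(log)); small epsilon for stability.
    return np.exp(np.log(divs + 1e-12).mean())

# Toy usage: three Gaussian classes in 3-D, two of them nearly merged,
# projected to a 1-D subspace.
rng = np.random.default_rng(0)
means = [np.array([0.0, 0.0, 0.0]),
         np.array([4.0, 0.0, 0.0]),
         np.array([4.2, 0.1, 0.0])]
Sigma = np.eye(3)
W = rng.standard_normal((3, 1))
print(averaged_divergence(means, Sigma, W, "arithmetic"))
print(averaged_divergence(means, Sigma, W, "geometric"))

Because the geometric mean collapses to zero whenever any single pairwise divergence approaches zero, maximizing it discourages exactly the class-merging behavior that the abstract attributes to LDA, whereas the arithmetic mean can remain large even when one pair of classes is sacrificed.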