Kernel learning and optimization with Hilbert–Schmidt independence criterion

Publication Type:
Journal Article
Citation:
International Journal of Machine Learning and Cybernetics, 2018, 9 (10), pp. 1707-1717
Issue Date:
2018-10-01
File:
Wang-Li2018_Article_KernelLearningAndOptimizationW.pdf (Published Version, Adobe PDF, 1.28 MB)
Abstract:
© 2017, Springer-Verlag Berlin Heidelberg. Measures of statistical dependence between random variables have been successfully applied in many machine learning tasks, such as independent component analysis, feature selection, clustering and dimensionality reduction. The success is based on the fact that many existing learning tasks can be cast as problems of dependence maximization (or minimization). Motivated by this, we present a unifying view of kernel learning via statistical dependence estimation. The key idea is that good kernels should maximize the statistical dependence between the kernels and the class labels. The dependence is measured by the Hilbert–Schmidt independence criterion (HSIC), which is based on computing the Hilbert–Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces and is traditionally used to measure the statistical dependence between random variables. As a special case of kernel learning, we propose a Gaussian kernel optimization method for classification by maximizing the HSIC, where two forms of Gaussian kernels (spherical kernel and ellipsoidal kernel) are considered. Extensive experiments on real-world data sets from the UCI benchmark repository validate the superiority of the proposed approach in terms of both prediction accuracy and computational efficiency.
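For readers unfamiliar with HSIC, the following minimal Python sketch illustrates the standard biased empirical estimator, HSIC(K, L) = tr(KHLH) / (n - 1)^2 with centering matrix H = I - (1/n) 1 1^T, applied to a spherical Gaussian kernel on the inputs and a label kernel. The function name, the synthetic data, and the kernel width sigma = 1.0 are illustrative assumptions and are not taken from the paper, which optimizes the Gaussian kernel parameters by maximizing this quantity.

import numpy as np

def empirical_hsic(K, L):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    # where H = I - (1/n) * 1 1^T is the centering matrix.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Illustrative example: spherical Gaussian kernel on inputs vs. label kernel.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                 # 50 samples, 3 features (synthetic)
y = (X[:, 0] > 0).astype(float)              # binary class labels

sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
sigma = 1.0                                  # kernel width; the quantity to be tuned
K = np.exp(-sq_dists / (2 * sigma ** 2))     # spherical Gaussian kernel matrix
L = np.outer(y, y)                           # kernel matrix on the labels

print(empirical_hsic(K, L))                  # larger value = stronger dependence

In this view, kernel optimization amounts to choosing sigma (or, for the ellipsoidal kernel, one width per feature) to maximize the printed HSIC value.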