A Family of Simple Non-Parametric Kernel Learning Algorithms

Publication Type:
Journal Article
Citation:
Journal of Machine Learning Research, 2011, 12, pp. 1313–1347
Issue Date:
2011-04-01
Previous studies of Non-Parametric Kernel Learning (NPKL) usually formulate the learning task as a Semi-Definite Programming (SDP) problem that is often solved by general-purpose SDP solvers. However, for N data examples, the time complexity of NPKL using a standard interior-point SDP solver can be as high as O(N^6.5), which makes NPKL methods inapplicable to real applications, even for data sets of moderate size. In this paper, we present a family of efficient NPKL algorithms, termed "SimpleNPKL", which can efficiently learn non-parametric kernels from a large set of pairwise constraints. In particular, we propose two efficient SimpleNPKL algorithms. The first is the SimpleNPKL algorithm with linear loss, which enjoys a closed-form solution that can be computed efficiently by the Lanczos sparse eigendecomposition technique. The second is the SimpleNPKL algorithm with other loss functions (including square hinge loss, hinge loss, and square loss), which can be reformulated as a saddle-point optimization problem and then solved by a fast iterative algorithm. Our empirical results show that, compared with previous NPKL approaches, the proposed technique achieves the same accuracy while being significantly more efficient and scalable. Finally, we also demonstrate that the proposed technique can be applied to speed up many kernel learning tasks, including colored maximum variance unfolding, minimum volume embedding, and structure preserving embedding. © 2011 Jinfeng Zhuang, Ivor W. Tsang and Steven C.H. Hoi.
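The abstract's key computational claim is that the linear-loss variant reduces to a closed-form solution computable via Lanczos sparse eigendecomposition. The sketch below is a minimal, hypothetical illustration of that step only, not the authors' exact update: it assumes the solution is assembled from the top-r eigenpairs of a sparse symmetric matrix A standing in for the combined pairwise-constraint and regularization terms, and it uses SciPy's eigsh (a Lanczos-type solver).

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh  # Lanczos-type sparse eigensolver


def top_r_kernel(A, r=10, scale=1.0):
    """Hypothetical sketch: build a rank-r PSD kernel matrix from the
    top eigenpairs of a sparse symmetric matrix A. A is assumed to
    encode the pairwise-constraint and regularization information;
    the exact construction in the paper may differ."""
    vals, vecs = eigsh(A, k=r, which='LA')  # r largest algebraic eigenvalues
    vals = np.maximum(vals, 0.0)            # keep only the PSD part
    # K = scale * V diag(lambda) V^T, a rank-r positive semi-definite kernel
    return scale * (vecs * vals) @ vecs.T


# Toy usage on a random sparse symmetric matrix (stand-in for A)
rng_seed = 0
B = sp.random(200, 200, density=0.02, random_state=rng_seed)
A = ((B + B.T) * 0.5).tocsc()
K = top_r_kernel(A, r=5)
print(K.shape)  # (200, 200)
```

Because eigsh only needs matrix-vector products with the sparse A, this step scales with the number of nonzeros (roughly the number of pairwise constraints) rather than with a dense N x N problem, which is consistent with the scalability argument in the abstract.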