Eigenfunction-Based Multitask Learning in a Reproducing Kernel Hilbert Space

Publication Type:
Journal Article
Citation:
IEEE Transactions on Neural Networks and Learning Systems, 2019, 30 (6), pp. 1818 - 1830
Issue Date:
2019-06-01
Abstract:
© 2018 IEEE. Multitask learning aims to improve performance on related tasks by exploiting the interdependence among them. Existing multitask learning methods explore task relatedness on the basis of input features and model parameters. In this paper, we focus on nonparametric multitask learning and propose to measure task relatedness from a novel perspective in a reproducing kernel Hilbert space (RKHS). Prior work has shown that the objective function for a given task can be approximated using the top eigenvalues and corresponding eigenfunctions of a predefined integral operator on an RKHS. In our method, we formulate the multitask learning objective as a linear combination of two sets of eigenfunctions: common eigenfunctions shared across tasks and unique eigenfunctions specific to individual tasks, so that the eigenfunctions for one task can provide additional information about another and help improve its performance. We present both theoretical and empirical validations of the proposed approach. The theoretical analysis shows that our learning algorithm is uniformly argument stable and that the convergence rate of the generalization upper bound can be improved by learning multiple tasks. Experiments on several benchmark multitask learning data sets show that our method yields promising results.
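The sketch below is not the authors' algorithm; it is a minimal illustration of the general idea described in the abstract, assuming a Nyström-style approximation of the top eigenfunctions of a kernel integral operator and a simple ridge fit over a concatenation of shared and task-specific eigenfunction features. All names (`nystrom_eigenfunctions`, `gamma`, `m`, the toy tasks, and the regularization constant) are illustrative choices, not values from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def nystrom_eigenfunctions(X, gamma, m):
    """Approximate the top-m eigenfunctions of the kernel integral operator
    via the Nystrom method: eigendecompose the empirical kernel matrix and
    return a callable that evaluates the approximate eigenfunctions at new points."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    vals, vecs = np.linalg.eigh(K)
    idx = np.argsort(vals)[::-1][:m]            # keep the top-m eigenpairs
    lam, U = vals[idx], vecs[:, idx]
    def phi(Z):
        # phi_i(z) ~ sqrt(n) / lambda_i * sum_j k(z, x_j) * U[j, i]
        return rbf_kernel(Z, X, gamma) @ U * (np.sqrt(n) / lam)
    return phi

# Toy setup: T related regression tasks (hypothetical synthetic data).
rng = np.random.default_rng(0)
T, n, gamma = 3, 60, 0.5
tasks = []
for t in range(T):
    X = rng.uniform(-3, 3, size=(n, 1))
    y = np.sin(X[:, 0]) + 0.2 * t * X[:, 0] + 0.1 * rng.standard_normal(n)
    tasks.append((X, y))

# Common eigenfunctions: estimated from the pooled data of all tasks.
X_pool = np.vstack([X for X, _ in tasks])
phi_common = nystrom_eigenfunctions(X_pool, gamma, m=8)

# Each task combines the shared eigenfunctions with its own task-specific
# eigenfunctions; the combination coefficients are fit by ridge regression.
reg = 1e-2
for t, (X, y) in enumerate(tasks):
    phi_task = nystrom_eigenfunctions(X, gamma, m=4)
    F = np.hstack([phi_common(X), phi_task(X)])   # [common | task-specific]
    w = np.linalg.solve(F.T @ F + reg * np.eye(F.shape[1]), F.T @ y)
    mse = np.mean((F @ w - y) ** 2)
    print(f"task {t}: train MSE = {mse:.4f}")
```

In this toy version, coupling between tasks enters only through the shared eigenfunctions estimated from the pooled data; the paper's actual formulation, stability analysis, and generalization bounds are developed in the full text.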