Constrained empirical risk minimization framework for distance metric learning

Publication Type: Journal Article
IEEE Transactions on Neural Networks and Learning Systems, 2012, 23(8), pp. 1194–1205
Distance metric learning (DML) has received increasing attention in recent years. In this paper, we propose a constrained empirical risk minimization framework for DML. This framework advances the state of the art in both its theoretical and algorithmic aspects. Theoretically, we comprehensively analyze the generalization ability of the framework by bounding the sample error and the approximation error with respect to the best model. Algorithmically, we derive an optimal gradient descent method using Nesterov's acceleration, and provide two example algorithms that use the logarithmic loss and the smoothed hinge loss, respectively. We evaluate the new framework on data classification and image retrieval experiments. Results show that the new framework is competitive with representative DML algorithms, including Xing's method, the large margin nearest neighbor classifier, neighborhood component analysis, and regularized metric learning. © 2012 IEEE.
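To make the algorithmic idea concrete, the following is a minimal sketch of the kind of method the abstract describes: accelerated projected gradient descent on a Mahalanobis matrix under a smoothed hinge loss, with a projection onto the positive semidefinite cone. It is an illustrative reconstruction under stated assumptions, not the paper's exact algorithm; the names (`learn_metric`, `psd_project`, the `margin` parameter) and the specific loss smoothing are hypothetical choices for the sketch.

```python
import numpy as np

def smoothed_hinge_grad(z, gamma=0.5):
    """Smoothed hinge loss l(z) and its derivative l'(z).

    l(z) = 0 for z >= 1, linear for z <= 1 - gamma, and quadratic in between,
    which makes the loss differentiable everywhere (a common smoothing; the
    paper's exact form may differ)."""
    if z >= 1.0:
        return 0.0, 0.0
    if z <= 1.0 - gamma:
        return 1.0 - z - gamma / 2.0, -1.0
    return (1.0 - z) ** 2 / (2.0 * gamma), -(1.0 - z) / gamma

def psd_project(M):
    """Project a symmetric matrix onto the PSD cone by clipping eigenvalues."""
    w, V = np.linalg.eigh((M + M.T) / 2.0)
    return (V * np.clip(w, 0.0, None)) @ V.T

def learn_metric(pairs, dim, lr=0.05, epochs=200, margin=1.0):
    """Learn a Mahalanobis matrix M from pairwise constraints.

    pairs: list of (xi, xj, y) with y = +1 for similar and y = -1 for
    dissimilar pairs. Uses Nesterov-style momentum with a PSD projection
    after each gradient step (a heuristic accelerated projected-gradient
    sketch, assumed here for illustration)."""
    M = np.eye(dim)
    M_prev = M.copy()
    for t in range(1, epochs + 1):
        # Nesterov look-ahead point
        beta = (t - 1) / (t + 2)
        Y = M + beta * (M - M_prev)
        G = np.zeros((dim, dim))
        for xi, xj, y in pairs:
            d = xi - xj
            dist = d @ Y @ d          # squared Mahalanobis distance under Y
            z = y * (margin - dist)   # positive when the constraint is met
            _, dl = smoothed_hinge_grad(z)
            # d(dist)/dM is the outer product d d^T; chain rule gives:
            G += dl * y * (-np.outer(d, d))
        G /= len(pairs)
        M_prev = M
        M = psd_project(Y - lr * G)
    return M
```

On a toy problem where dissimilar pairs differ along one coordinate and similar pairs along another, the learned matrix inflates the discriminative coordinate and shrinks the noisy one, while the eigenvalue clipping keeps M a valid (PSD) metric throughout.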