Transfer Independently Together: A Generalized Framework for Domain Adaptation
- Publication Type: Journal Article
- Citation: IEEE Transactions on Cybernetics, 2019, 49 (6), pp. 2144-2155
- Issue Date: 2019-06-01
Closed Access
Filename | Description | Size
---|---|---
08337102.pdf | Published Version | 2.46 MB
This item is closed access and not available.
© 2019 IEEE. Unsupervised heterogeneous domain adaptation in a generalized setting, the most common scenario in real-world applications, remains insufficiently explored. Existing approaches are either limited to special cases or require labeled target samples for training. This paper aims to overcome these limitations with a generalized framework, named transfer independently together (TIT). Specifically, we learn multiple transformations, one per domain (independently), that map data onto a shared latent space where the domains are well aligned. The multiple transformations are jointly optimized in a unified framework (together) through an effective formulation. In addition, to learn robust transformations, we propose a novel landmark selection algorithm that reweights samples, i.e., increases the weight of pivot samples and decreases the weight of outliers. Our landmark selection is based on graph optimization and focuses on the geometric relationships among samples rather than on sample features. As a result, by abstracting feature vectors to graph vertices, our algorithm involves only simple and fast integer arithmetic instead of the floating-point matrix operations required by existing approaches. Finally, we optimize our objective efficiently via a dimensionality reduction procedure. TIT is applicable to samples of arbitrary dimensionality and does not need labeled target samples for training. Extensive evaluations on several standard benchmarks and large-scale datasets for image classification, text categorization, and text-to-image recognition verify the superiority of our approach.
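The abstract is all this closed-access record exposes, but its central idea (one linear map per domain, parameterized independently and optimized jointly so that the projected domains align in a shared latent space) can be illustrated with a toy sketch. Everything below is hypothetical: the function name `fit_projections`, the mean-alignment loss, and the gradient scheme are illustrative stand-ins, not the paper's actual objective, which additionally includes graph-based landmark reweighting and is solved via a dimensionality reduction procedure.

```python
# Hypothetical sketch of the "transfer independently together" idea:
# each domain gets its own projection (independently parameterized),
# and the projections are fit jointly (together). The mean-alignment
# loss below is a toy stand-in for the paper's richer formulation.
import numpy as np

def fit_projections(Xs, Xt, latent_dim=10, n_iter=200, lr=1e-2, seed=0):
    """Learn one projection per domain so projected domain means coincide.

    Xs: (ns, ds) source samples; Xt: (nt, dt) target samples.
    ds and dt may differ (heterogeneous domains); no target labels used.
    """
    rng = np.random.default_rng(seed)
    Ws = rng.standard_normal((Xs.shape[1], latent_dim))
    Wt = rng.standard_normal((Xt.shape[1], latent_dim))
    mus = Xs.mean(axis=0, keepdims=True)  # (1, ds)
    mut = Xt.mean(axis=0, keepdims=True)  # (1, dt)
    for _ in range(n_iter):
        # Toy alignment loss: squared distance between projected means.
        diff = mus @ Ws - mut @ Wt   # (1, latent_dim)
        Ws -= lr * (mus.T @ diff)    # gradient step for the source map
        Wt -= lr * (-mut.T @ diff)   # gradient step for the target map
    return Ws, Wt

# Usage: a 5-D source and an 8-D target mapped into one 3-D latent space.
Xs = np.random.randn(100, 5) + 2.0
Xt = np.random.randn(80, 8) - 1.0
Ws, Wt = fit_projections(Xs, Xt, latent_dim=3)
Zs, Zt = Xs @ Ws, Xt @ Wt  # both domains now live in the shared space
```

After such an alignment, a classifier trained on `Zs` can be applied to `Zt`. In the actual TIT formulation, samples would additionally be reweighted by the graph-based landmark selection (boosting pivot samples, down-weighting outliers) before the transformations are optimized.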