One-step Domain Adaptation Approach with Partial Label
- Publisher:
- IEEE
- Publication Type:
- Conference Proceeding
- Citation:
- 2023 International Joint Conference on Neural Networks (IJCNN), June 2023, pp. 1-8
- Issue Date:
- 2023-01-01
Closed Access
Filename | Description | Size
---|---|---
One-step_Domain_Adaptation_Approach_with_Partial_Label.pdf | Accepted version | 1.21 MB
This item is closed access and not available.
Unsupervised Domain Adaptation (UDA) aims to train a target classifier using massive, accurately annotated source-domain data and unlabeled target-domain data. However, collecting massive accurate annotations can be labor-intensive and even impractical, especially when the source-domain training data exhibits label ambiguity. In this paper, we consider a novel domain adaptation setting in which the model can be trained with partially labeled source-domain data, so that the cost of data labeling can be reduced. To alleviate the ambiguity induced by partial labels, we propose a one-step domain adaptation approach trained from the partially labeled source data and the unlabeled target data. Our approach consists of two components: a feature extractor equipped with a partial-label loss that learns discriminative representations, and a domain classifier that learns domain-invariant representations across the source and target domains. Extensive experiments show that the proposed approach significantly outperforms a series of competitive baselines.
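The abstract does not spell out the partial-label loss used by the feature extractor. As a hedged illustration only, a common choice in partial-label learning is to minimize the negative log of the total probability mass the classifier assigns to a sample's candidate label set; the sketch below (in NumPy, with hypothetical names `partial_label_loss` and `candidate_mask` not taken from the paper) shows that idea, not the authors' exact formulation:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def partial_label_loss(logits, candidate_mask):
    """Negative log of the probability mass on each sample's candidate set.

    logits: (N, C) classifier scores.
    candidate_mask: (N, C) 0/1 mask; 1 marks a candidate label. The true
    label is assumed to be somewhere inside the candidate set.
    """
    probs = softmax(logits)
    mass = (probs * candidate_mask).sum(axis=1)
    return -np.log(np.clip(mass, 1e-12, None)).mean()
```

When the candidate set shrinks to a single label, this reduces to ordinary cross-entropy; when every label is a candidate, the loss is zero, reflecting that such a sample carries no supervisory signal.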