POSITIVE UNLABELED LEARNING BY SEMI-SUPERVISED LEARNING
- Publisher:
- Institute of Electrical and Electronics Engineers (IEEE)
- Publication Type:
- Conference Proceeding
- Citation:
- Proceedings - International Conference on Image Processing, ICIP, 2022, 00, pp. 2976-2980
- Issue Date:
- 2022-01-01
Closed Access
| Filename | Description | Size |
|---|---|---|
| Positive_Unlabeled_Learning_by_Semi-Supervised_Learning.pdf | Published version | 1.07 MB |
This item is closed access and not available.
Positive and Unlabeled learning (PU learning) trains a binary classifier from only positive (P) and unlabeled (U) data, where the unlabeled set contains both positive and negative samples. Previous importance-reweighting approaches treat all unlabeled samples as weighted negative samples and achieve state-of-the-art performance. In this paper, however, we find, surprisingly, that under such weight adjustment the classifier can misclassify negative samples in the U data as positive ones at the late training stage. Motivated by this discovery, we leverage Semi-Supervised Learning (SSL) to address this performance-degradation problem and propose a novel SSL-based framework for PU learning. First, we introduce a dynamic increasing sampling strategy that progressively selects both negative and positive samples from the U data. Second, we adopt MixMatch to take full advantage of the unchosen samples in the U data. Finally, we propose a Co-learning strategy that iteratively trains two independent networks on the selected samples to avoid confirmation bias. Experimental results on four benchmark datasets demonstrate the effectiveness and superiority of our approach compared with other state-of-the-art methods.
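The dynamic increasing sampling step described in the abstract can be illustrated with a minimal sketch. The paper itself is closed access, so the function below is an assumption about how such a strategy might look, not the authors' implementation: given a model's positive-class scores on the U data, it selects the k most confident pseudo-positives and k most confident pseudo-negatives, with k growing linearly as training proceeds. All names (`select_from_unlabeled`, `max_frac`) are illustrative.

```python
import numpy as np

def select_from_unlabeled(scores, epoch, max_epochs, max_frac=0.5):
    """Sketch of a dynamic increasing sampling strategy (assumed, not the
    paper's exact rule): pick the k lowest-scoring U samples as
    pseudo-negatives and the k highest-scoring as pseudo-positives,
    where k grows linearly from ~0 up to max_frac of |U| over training.

    scores: array of model probabilities P(y = 1 | x) for each U sample.
    Returns (pseudo-positive indices, pseudo-negative indices).
    """
    n = len(scores)
    # Budget per class grows with the epoch; max_frac caps the total
    # fraction of U ever selected (split between the two classes).
    k = int(np.ceil(max_frac * n * (epoch + 1) / max_epochs / 2))
    order = np.argsort(scores)
    neg_idx = order[:k]    # lowest scores -> pseudo-negatives
    pos_idx = order[-k:]   # highest scores -> pseudo-positives
    return pos_idx, neg_idx
```

In a co-learning setup along the lines the abstract describes, the samples selected using one network's scores would be used to train the other network (and vice versa), so that neither network reinforces its own mistakes; the remaining unchosen U samples would feed the MixMatch branch.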