Collaborative Contrastive Refining for Weakly Supervised Person Search
- Publisher:
- IEEE - Institute of Electrical and Electronics Engineers Inc.
- Publication Type:
- Journal Article
- Citation:
- IEEE Transactions on Image Processing, vol. 32, pp. 4951-4963, 2023
- Issue Date:
- 2023
Closed Access
Filename | Description | Size
---|---|---
Collaborative Contrastive Refining for Weakly Supervised Person Search_published.pdf | Published version | 2.25 MB
This item is closed access and not available.
Weakly supervised person search involves training a model with only bounding-box annotations, without human-annotated identities. Clustering algorithms are commonly used to assign pseudo-labels to facilitate this task. However, inaccurate pseudo-labels and imbalanced identity distributions can result in severe label and sample noise. In this work, we propose Collaborative Contrastive Refining (CCR), a novel weakly supervised framework for person search that jointly refines the pseudo-labels and the sample-learning process with different contrastive strategies. Specifically, we adopt a hybrid contrastive strategy that leverages both visual and context clues to refine pseudo-labels, and a sample-mining and noise-contrastive strategy that reduces the negative impact of imbalanced distributions by distinguishing positive samples from noise samples. Our method brings two main advantages: 1) it produces better clustering results for refining pseudo-labels by exploring the hybrid similarity; 2) it better distinguishes query samples from noise samples when refining the sample-learning process. Extensive experiments demonstrate that our approach outperforms state-of-the-art weakly supervised methods by a large margin (more than 3% mAP on CUHK-SYSU). Moreover, by leveraging more diverse unlabeled data, our method achieves comparable or even better performance than state-of-the-art supervised methods.
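As a rough, non-authoritative illustration of the two ideas the abstract describes (blending visual and context similarity before clustering pseudo-labels, and excluding mined noise samples from a contrastive loss), the sketch below uses PyTorch and scikit-learn. The function names, the mixing weight `alpha`, the clustering radius `eps`, the temperature `tau`, and the `noise_mask` input are all assumptions made for illustration; they are not taken from the CCR paper.

```python
# Minimal sketch only: alpha, eps, tau, noise_mask and the loss form are assumptions,
# not the CCR paper's actual definitions.
import numpy as np
import torch
import torch.nn.functional as F
from sklearn.cluster import DBSCAN


def hybrid_similarity(visual_feats, context_feats, alpha=0.5):
    """Blend visual and context (scene-level) cosine similarities with a hypothetical weight alpha."""
    v = F.normalize(visual_feats, dim=1)
    c = F.normalize(context_feats, dim=1)
    return alpha * (v @ v.t()) + (1.0 - alpha) * (c @ c.t())


def refine_pseudo_labels(visual_feats, context_feats, eps=0.5):
    """Cluster on the hybrid similarity to reassign pseudo-identities."""
    sim = hybrid_similarity(visual_feats, context_feats).detach().cpu().numpy()
    dist = np.clip(1.0 - sim, 0.0, None)  # turn similarity into a distance
    labels = DBSCAN(eps=eps, min_samples=2, metric="precomputed").fit_predict(dist)
    return labels  # -1 marks unclustered / likely-noise samples


def noise_aware_contrastive_loss(query, memory, pos_idx, noise_mask, tau=0.07):
    """InfoNCE-style loss that drops memory entries flagged as noise by a mining step."""
    q = F.normalize(query, dim=0)    # (D,)
    m = F.normalize(memory, dim=1)   # (N, D)
    logits = (m @ q) / tau           # (N,)
    logits = logits.masked_fill(noise_mask, float("-inf"))  # noise entries get zero weight
    return F.cross_entropy(logits.unsqueeze(0), torch.tensor([pos_idx]))


if __name__ == "__main__":
    # Toy usage with random features standing in for detector/ReID embeddings.
    vis = torch.randn(8, 16)
    ctx = torch.randn(8, 16)
    print(refine_pseudo_labels(vis, ctx))
    noise = torch.zeros(8, dtype=torch.bool)
    noise[5] = True
    print(noise_aware_contrastive_loss(vis[0], vis, pos_idx=0, noise_mask=noise))
```

The design choice shown here is only one plausible reading: the hybrid similarity is a weighted sum of two cosine-similarity matrices, and "sample mining" is reduced to a boolean mask that removes suspected noise from the contrastive denominator.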