Counterfactual Generation Framework for Few-Shot Learning
- Publisher: Institute of Electrical and Electronics Engineers (IEEE)
- Publication Type: Journal Article
- Citation: IEEE Transactions on Circuits and Systems for Video Technology, vol. PP, no. 99, pp. 1-1, 2023
- Issue Date: 2023-01-01
Closed Access
Filename | Description | Size
---|---|---
Counterfactual_Generation_Framework_for_Few-Shot_Learning.pdf | Published version | 4.17 MB
This item is closed access and not available.
Few-shot learning (FSL) aims to recognize novel classes from only a few labeled samples and is therefore hampered by data scarcity. Although recent work tackles FSL with data-augmentation-based methods, these models fail to maintain the discrimination and diversity of the generated samples because of the distribution shift and intra-class bias caused by data scarcity, which greatly undermines performance. To this end, we exploit causal mechanisms, which remain invariant across data distributions, to alleviate these effects. We decompose image information into two independent components, sample-specific and class-agnostic information, and propose a novel Counterfactual Generation Framework (CGF) that learns the underlying causal mechanisms to synthesize faithful samples for FSL. Specifically, based on counterfactual inference, we design a class-agnostic feature extractor to capture the sample-specific information, together with a counterfactual generation network that simulates the data generation process from a causal perspective. Moreover, to leverage the power of CGF in counterfactual inference, we develop a novel classifier that classifies samples according to the distributions of their counterfactual generations. Extensive experiments on four FSL benchmarks demonstrate the effectiveness of CGF, e.g., 80.12%/86.13% accuracy on 5-way 1-shot/5-shot miniImageNet tasks, significantly improving performance. Our code and models are available at https://github.com/eric-hang/CGF.
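As a rough illustration of the idea in the abstract, the sketch below pairs a class-agnostic feature pathway (capturing sample-specific information) with a generator conditioned on a class prototype, and scores each class by how well its counterfactual generation reconstructs the query. It is a minimal sketch under assumed shapes and module names: `CounterfactualGenerator`, `counterfactual_classify`, the MLP architectures, and the distance-based scoring rule are all illustrative assumptions, not the paper's actual implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class CounterfactualGenerator(nn.Module):
    """Hypothetical sketch: fuse sample-specific information with a class
    prototype to synthesize a counterfactual feature for that class."""
    def __init__(self, feat_dim: int, hidden_dim: int = 512):
        super().__init__()
        # Class-agnostic extractor: strips class evidence from a feature,
        # keeping only sample-specific information (an assumption here).
        self.class_agnostic = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, feat_dim))
        # Generator: simulates the data-generation mechanism from the
        # (sample-specific, class) pair.
        self.generator = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, feat_dim))

    def forward(self, query_feat: torch.Tensor, class_proto: torch.Tensor) -> torch.Tensor:
        specific = self.class_agnostic(query_feat)  # sample-specific part
        return self.generator(torch.cat([specific, class_proto], dim=-1))

def counterfactual_classify(gen: CounterfactualGenerator,
                            query_feat: torch.Tensor,
                            prototypes: torch.Tensor) -> torch.Tensor:
    """Score each class by how closely its counterfactual generation matches
    the query (a simple stand-in for the paper's distribution-based classifier).

    query_feat: [batch, feat_dim]; prototypes: [n_way, feat_dim].
    Returns class scores of shape [batch, n_way].
    """
    scores = []
    for proto in prototypes:  # one prototype per way
        cf = gen(query_feat, proto.expand_as(query_feat))
        scores.append(-torch.norm(cf - query_feat, dim=-1))  # closer => higher
    return torch.stack(scores, dim=-1)
```

The negative-distance scoring above is just one plausible reading of "classifying samples based on their distributions of counterfactual generations"; the released code may instead compare whole distributions of generated samples per class.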