Counterfactual Generation Framework for Few-Shot Learning

Institute of Electrical and Electronics Engineers (IEEE)
Publication Type: Journal Article
IEEE Transactions on Circuits and Systems for Video Technology, 2023, PP, (99), pp. 1-1
Few-shot learning (FSL), which aims to recognize novel classes from only a few labeled samples, is hampered by data scarcity. Although recent works tackle FSL with data augmentation-based methods, these models fail to maintain the discrimination and diversity of the generated samples because of the distribution shift and intra-class bias caused by that scarcity, thereby greatly undermining performance. To this end, we use causal mechanisms, which remain invariant across data distributions, to alleviate such effects. In this sense, we decompose the image information into two independent components, sample-specific and class-agnostic information, and further propose a novel Counterfactual Generation Framework (CGF) that learns the underlying causal mechanisms to synthesize faithful samples for FSL. Specifically, based on counterfactual inference, we design a class-agnostic feature extractor to capture the sample-specific information, together with a counterfactual generation network that simulates the data generation process from a causal perspective. Moreover, to leverage the power of CGF in counterfactual inference, we further develop a novel classifier that classifies samples based on the distributions of their counterfactual generations. Extensive experiments demonstrate the effectiveness of CGF on four FSL benchmarks, e.g., 80.12%/86.13% accuracy on 5-way 1-shot/5-shot miniImageNet FSL tasks, significantly improving performance. Our code and models are available at
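The core idea described above can be illustrated with a minimal, hypothetical sketch: treat a sample's deviation from its class prototype as the "sample-specific" information, transplant that deviation onto other class prototypes to synthesize counterfactual samples, and classify a query by comparing it against each class's counterfactual generations. All names, the synthetic features, and the linear prototype-based decomposition are illustrative assumptions, not the paper's actual learned networks.

```python
import numpy as np

rng = np.random.default_rng(0)
N_WAY, K_SHOT, DIM = 5, 5, 16  # a 5-way 5-shot toy episode

# Synthetic backbone features: class means plus per-sample variation
# (stands in for features from a pretrained feature extractor).
class_means = rng.normal(0.0, 3.0, size=(N_WAY, DIM))
support = (np.repeat(class_means, K_SHOT, axis=0)
           + rng.normal(0.0, 1.0, size=(N_WAY * K_SHOT, DIM)))
labels = np.repeat(np.arange(N_WAY), K_SHOT)

# Class prototypes: per-class mean of the support features.
protos = np.stack([support[labels == c].mean(axis=0) for c in range(N_WAY)])

def counterfactual_augment(support, labels, protos):
    # Sample-specific information = deviation from the sample's own
    # class prototype; recombining it with another prototype yields a
    # counterfactual sample of that other class.
    feats, ys = [], []
    for feat, y in zip(support, labels):
        variation = feat - protos[y]
        for c in range(len(protos)):
            feats.append(protos[c] + variation)
            ys.append(c)
    return np.array(feats), np.array(ys)

aug_feats, aug_ys = counterfactual_augment(support, labels, protos)

def classify(query, aug_feats, aug_ys, n_way):
    # Score each class by the mean distance from the query to that
    # class's counterfactual generations (a crude distribution match).
    dists = np.linalg.norm(aug_feats - query, axis=1)
    scores = np.array([dists[aug_ys == c].mean() for c in range(n_way)])
    return int(scores.argmin())

query = class_means[2] + rng.normal(0.0, 1.0, size=DIM)
pred = classify(query, aug_feats, aug_ys, N_WAY)
```

Each of the 25 support samples yields 5 counterfactuals (one per class), so the augmented set has 125 samples; on these well-separated synthetic classes the query drawn from class 2 is recovered by the distribution-based scoring.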