Voting from Nearest Tasks: Meta-Vote Pruning of Pre-trained Models for Downstream Tasks

Publisher:
Springer Nature
Publication Type:
Chapter
Citation:
Machine Learning and Knowledge Discovery in Databases: Research Track, 2023, 14170 LNAI, pp. 52-68
Issue Date:
2023-01-01
As large-scale pre-trained models have become the major choice for various applications, new challenges arise for model pruning, e.g., can we avoid pruning the same model from scratch for every downstream task? How can we reuse the pruning results of previous tasks to accelerate pruning for new tasks? To address these challenges, we create a small model for a new task from the pruned models of similar tasks. We show that a few fine-tuning steps on this model suffice to produce a promising pruned model for the new task. We study this “meta-pruning” from nearest tasks on two major classes of pre-trained models, convolutional neural networks and vision transformers, under a limited budget of pruning iterations. Our study begins by investigating the overlap of pruned models for similar tasks and how this overlap changes across layers and blocks. Inspired by these discoveries, we develop a simple but effective “Meta-Vote Pruning” (MVP) method that significantly reduces the pruning iterations for a new task by initializing a sub-network from the pruned models of its nearest tasks. In experiments, we demonstrate MVP’s accuracy, efficiency, and generalization advantages through extensive empirical studies and comparisons with popular pruning methods over several datasets.
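The voting idea described in the abstract can be illustrated with a minimal sketch: given the binary pruning masks produced for a layer by the nearest tasks, keep the channels retained by the most neighbors. This is an assumption-laden toy (the function `meta_vote_mask`, the per-channel mask representation, and the `keep_ratio` parameter are illustrative, not the paper's exact implementation):

```python
import numpy as np

def meta_vote_mask(neighbor_masks, keep_ratio):
    """Vote-based initialization (illustrative sketch of MVP's idea):
    keep the channels most often retained by the pruned models of the
    nearest tasks, then fine-tune the resulting sub-network."""
    votes = np.sum(neighbor_masks, axis=0)   # per-channel vote count
    k = int(keep_ratio * votes.size)         # number of channels to keep
    keep_idx = np.argsort(votes)[-k:]        # indices of top-voted channels
    mask = np.zeros(votes.size, dtype=bool)
    mask[keep_idx] = True
    return mask

# Toy example: pruning masks of 3 nearest tasks over an 8-channel layer.
masks = np.array([
    [1, 1, 0, 1, 0, 0, 1, 0],
    [1, 0, 0, 1, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 0, 0, 1],
], dtype=bool)
init_mask = meta_vote_mask(masks, keep_ratio=0.5)
# Channels 0 and 3 (kept by all 3 neighbors) and 1 and 6 (kept by 2)
# survive the vote; the rest are pruned before fine-tuning begins.
```

In the full method, such an initialization replaces the early, expensive pruning iterations; only a few fine-tuning steps on the voted sub-network are then needed for the new task.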