Fine-Grained Distillation for Long Document Retrieval
- Publisher:
- Association for the Advancement of Artificial Intelligence
- Publication Type:
- Conference Proceeding
- Citation:
- Proceedings of the AAAI Conference on Artificial Intelligence, 2024, 38(17), pp. 19732-19740
- Issue Date:
- 2024-03-25
- Open Access:
- This item is open access.
- Abstract:
- Long document retrieval aims to fetch query-relevant documents from a large-scale collection, where knowledge distillation has become the de facto approach to improving a retriever by mimicking a heterogeneous yet powerful cross-encoder. However, in contrast to passages or sentences, retrieval over long documents suffers from the scope hypothesis: a long document may cover multiple topics. This magnifies their structural heterogeneity and poses a granular-mismatch issue, leading to inferior distillation efficacy. In this work, we propose a new learning framework, fine-grained distillation (FGD), for long-document retrievers. While preserving the conventional dense retrieval paradigm, it first produces globally consistent representations across different levels of granularity and then applies multi-granular aligned distillation only during training. In experiments, we evaluate our framework on two long-document retrieval benchmarks, on which it achieves state-of-the-art performance.
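The abstract gives no implementation details, but the general recipe it names (distilling a dense retriever toward a cross-encoder teacher, with scores aligned at more than one granularity) can be sketched. Below is a minimal, hypothetical PyTorch sketch of such multi-granular score distillation; the shapes, the mean-pooling of segment representations into document representations, and the max-aggregation of segment scores are illustrative assumptions, not the paper's FGD implementation.

```python
import torch
import torch.nn.functional as F

def distill_loss(student_scores, teacher_scores, temperature=2.0):
    """KL divergence pushing the student's score distribution over
    candidates toward the teacher's (standard distillation objective)."""
    s = F.log_softmax(student_scores / temperature, dim=-1)
    t = F.softmax(teacher_scores / temperature, dim=-1)
    return F.kl_div(s, t, reduction="batchmean") * temperature ** 2

# Hypothetical shapes: batch of B queries, N candidate documents each,
# every document split into K segments of dimension D.
B, N, K, D = 2, 8, 4, 128
query = torch.randn(B, D)
seg_reps = torch.randn(B, N, K, D, requires_grad=True)  # fine-grained reps
doc_reps = seg_reps.mean(dim=2)  # coarse reps pooled from the fine ones,
                                 # keeping the two granularities consistent

# Student relevance scores at both granularities.
doc_scores = torch.einsum("bd,bnd->bn", query, doc_reps)
seg_scores = torch.einsum("bd,bnkd->bnk", query, seg_reps).amax(dim=-1)

# Cross-encoder teacher scores, assumed precomputed offline per level.
teacher_doc = torch.randn(B, N)
teacher_seg = torch.randn(B, N)

loss = distill_loss(doc_scores, teacher_doc) + distill_loss(seg_scores, teacher_seg)
loss.backward()  # training-time only; inference keeps plain dense retrieval
```

Because the coarse representations are pooled from the fine-grained ones, both distillation terms shape the same underlying parameters, which is one plausible reading of "globally consistent representations across different levels of granularity."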