How Do Large Language Models Capture the Ever-changing World Knowledge? A Review of Recent Advances

Publication Type:
Conference Proceeding
Citation:
EMNLP 2023 - 2023 Conference on Empirical Methods in Natural Language Processing, Proceedings, 2023, pp. 8289-8311
Issue Date:
2023-01-01
Abstract:
Although large language models (LLMs) are impressive in solving various tasks, they can quickly become outdated after deployment. Keeping their knowledge up-to-date has therefore become a pressing concern. This paper provides a comprehensive review of recent advances in aligning LLMs with the ever-changing world knowledge without re-training from scratch. We systematically categorize research works and provide in-depth comparisons and discussion. We also discuss existing challenges and highlight future directions to facilitate research in this field.