Continual Learning for Temporal-Sensitive Question Answering

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
2024 International Joint Conference on Neural Networks (IJCNN), 2024, pp. 1-9
Issue Date:
2024-09-09
Abstract:
In this study, we explore an emerging research area: Continual Learning for Temporal-Sensitive Question Answering (CLTSQA). Previous research has primarily focused on Temporal-Sensitive Question Answering (TSQA), often overlooking the unpredictable nature of future events. In real-world applications, it is crucial for models to continually acquire knowledge over time rather than relying on a static, complete dataset. Our paper investigates strategies that enable models to adapt to the ever-evolving information landscape, thereby addressing the challenges inherent in CLTSQA. To support our research, we first create a novel dataset divided into five subsets, designed specifically for the various stages of continual learning. We then propose a training framework for CLTSQA that integrates temporal memory replay and temporal contrastive learning. Our experimental results highlight two significant insights: first, the CLTSQA task introduces unique challenges for existing models; second, our proposed framework effectively navigates these challenges, resulting in improved performance.
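The abstract names temporal memory replay as one component of the framework but gives no implementation details. As a rough illustration only (all class and function names below are hypothetical, not the authors' code), replay over a time-stamped memory can be sketched as a reservoir-sampled buffer of past-period QA examples that is mixed into each new training stage:

```python
import random


class TemporalMemoryBuffer:
    """Fixed-size buffer of (question, answer, timestamp) examples from
    earlier time periods, sampled again during later training stages.
    Reservoir sampling keeps a uniform sample over everything seen."""

    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.seen = 0
        self.buffer = []

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a random slot with probability capacity / seen.
            i = random.randrange(self.seen)
            if i < self.capacity:
                self.buffer[i] = example

    def sample(self, k):
        return random.sample(self.buffer, min(k, len(self.buffer)))


def train_stage(model_update, new_examples, memory, replay_k=1):
    """One continual-learning stage: interleave each new-period example
    with replayed old-period examples, then store it in memory."""
    for example in new_examples:
        batch = [example] + memory.sample(replay_k)
        model_update(batch)  # one training step on the mixed batch
        memory.add(example)
```

The temporal contrastive component described in the paper would sit inside `model_update` as an additional loss term; its exact form is not specified in this record, so it is omitted here.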