Soft Prompt Transfer for Zero-Shot and Few-Shot Learning in EHR Understanding

Publisher:
Springer Nature
Publication Type:
Chapter
Citation:
Advanced Data Mining and Applications, 2023, 14178 LNAI, pp. 18-32
Issue Date:
2023-01-01
Abstract:
Electronic Health Records (EHRs) are a rich source of information that can be leveraged for various medical applications, such as disease inference, treatment recommendation, and outcome analysis. However, the complexity and heterogeneity of EHR data, together with the limited availability of well-labeled samples, pose significant challenges to developing efficient and adaptable models for EHR tasks such as rare or novel disease prediction. In this paper, we propose Soft prompt transfer for Electronic Health Records (SptEHR), a novel pipeline designed to address these challenges. SptEHR consists of three main stages: (1) self-supervised pre-training on raw EHR data to obtain an EHR-centric transformer-based foundation model, (2) supervised multi-task continual learning on existing well-labeled tasks to further refine the foundation model and learn transferable task-specific soft prompts, and (3) prompt transfer to improve zero-shot and few-shot performance on new tasks. The foundation model learned in stage one captures domain-specific knowledge; the multi-task continual training in stage two improves model adaptability and performance on EHR tasks; and stage three transfers soft prompts, selected by the similarity between the new task and existing tasks, to address new tasks effectively without requiring additional or extensive training. The effectiveness of SptEHR has been validated on the benchmark MIMIC-III dataset.
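To make the stage-three idea concrete, here is a minimal, illustrative sketch of similarity-based soft prompt transfer. All names, the cosine similarity measure, and the use of a mean few-shot embedding as the task representation are assumptions for illustration, not the paper's actual implementation:

```python
import math

def transfer_soft_prompt(task_prompts, new_task_embedding):
    """Pick the stored soft prompt whose source task is most similar
    to a new task, to initialize the new task's prompt.

    task_prompts: dict of task name -> (task_embedding, soft_prompt),
      where soft_prompt is a list of prompt-token embedding vectors
      learned during multi-task training (stage two).
    new_task_embedding: a vector summarizing the new task, e.g. the
      mean encoder embedding of its few-shot examples (an assumption).
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    # Rank existing tasks by similarity to the new task.
    best = max(task_prompts,
               key=lambda t: cosine(task_prompts[t][0], new_task_embedding))
    # Zero-shot: reuse the prompt directly; few-shot: fine-tune from it.
    return best, [vec[:] for vec in task_prompts[best][1]]

# Toy example: two hypothetical source tasks with 2-token prompts
# in a 3-dimensional embedding space.
prompts = {
    "mortality":   ([1.0, 0.0, 0.0], [[0.1] * 3, [0.2] * 3]),
    "readmission": ([0.0, 1.0, 0.0], [[0.5] * 3, [0.6] * 3]),
}
task, init_prompt = transfer_soft_prompt(prompts, [0.1, 0.9, 0.0])
# → task is "readmission", whose tuned prompt seeds the new task
```

In practice the frozen foundation model would prepend `init_prompt` to the input embeddings, so a new task inherits a well-placed starting point in prompt space rather than training from scratch.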