A Determinantal Point Process Based Novel Sampling Method of Abstractive Text Summarization

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
2023 International Joint Conference on Neural Networks (IJCNN), June 2023
Issue Date:
2023-01-01
In recent years, abstractive text summarization (ATS) research has made considerable progress, attributable to two key improvements: deep neural modeling and likelihood-estimation-based sampling in end-to-end optimization training. While modeling has been grounded in a few de facto highly capable base models within the encoder-decoder architecture, novel sampling ideas, such as random masking, classification, and generative prediction by unsupervised learning, have also been explored. These aim at improving prior knowledge, particularly of language modeling, for downstream tasks, and have led to notable performance gains in ATS. However, several challenges remain, for example, undesirable word repeats. In this paper, we propose a determinantal point process (DPP) based novel sampling method to address this issue. It can be easily integrated with existing ATS models. Our experiments and subsequent analysis reveal that models trained with our sampling method reduce undesirable word repeats and improve word coverage while achieving competitive ROUGE scores.
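To illustrate why a DPP favors diverse selections, the following is a minimal sketch of greedy DPP subset selection (a standard greedy MAP approximation), not the paper's actual training integration. The kernel values and the `greedy_dpp_select` helper are illustrative assumptions: a DPP assigns a subset a probability proportional to the determinant of the corresponding kernel submatrix, so near-duplicate items (high pairwise similarity) shrink the determinant and are jointly disfavored.

```python
import numpy as np

def greedy_dpp_select(kernel: np.ndarray, k: int) -> list:
    """Greedily pick k items maximizing det(L[S, S]) for a DPP kernel L.

    Illustrative greedy MAP approximation only; it does not reproduce
    the paper's sampling method within ATS training.
    """
    n = kernel.shape[0]
    selected = []
    for _ in range(k):
        best_i, best_det = -1, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            # Determinant of the submatrix indexed by the candidate set.
            det = np.linalg.det(kernel[np.ix_(idx, idx)])
            if det > best_det:
                best_det, best_i = det, i
        selected.append(best_i)
    return selected

# Toy similarity kernel: items 0 and 1 are near-duplicates; item 2 is distinct.
L = np.array([
    [1.0, 0.9, 0.1],
    [0.9, 1.0, 0.1],
    [0.1, 0.1, 1.0],
])
print(greedy_dpp_select(L, 2))  # selects the diverse pair [0, 2]
```

In a summarization setting, the kernel would encode similarity between candidate words or tokens, so repeated words form near-duplicate rows and are suppressed by the determinant, which matches the paper's stated goal of reducing word repeats and improving coverage.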