Correlation Encoder-Decoder Model for Text Generation
- Publisher: IEEE
- Publication Type: Conference Proceeding
- Citation: Proceedings of the International Joint Conference on Neural Networks (IJCNN), July 2022, pp. 1-7
- Issue Date: 2022-01-01
This item is open access.
Text generation is crucial for many applications in natural language processing. With the prevalence of deep learning, the encoder-decoder architecture is the dominant choice for this task. Accurately encoding the source information is of key importance to text generation, because the target text can be generated only when accurate and complete source information is captured by the encoder and fed into the decoder. However, most existing approaches fail to encode and learn the entire source information effectively, as some features are easily lost during the encoder's layer-by-layer encoding procedure. Similar information loss also affects the decoder. Reducing information loss in the encoder-decoder model is therefore critical for text generation. To address this issue, we propose a novel correlation encoder-decoder model, which optimizes both the encoder and the decoder to reduce information loss by enforcing them to minimize the differences between hierarchical layers through maximizing their mutual information. Experimental results on two benchmark datasets demonstrate that the proposed model substantially outperforms existing state-of-the-art methods. Our source code is publicly available on GitHub.
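The abstract does not specify how the mutual information between hierarchical layers is estimated. One common way to maximize a mutual-information objective in practice is an InfoNCE-style contrastive lower bound; the sketch below illustrates that idea under this assumption, not the authors' released implementation. All names here (`InfoNCELayerMI`, `h_layer2`, `lambda_mi`, the pooling choice) are hypothetical.

```python
# A minimal sketch (assuming an InfoNCE-style bound, not the paper's code) of
# maximizing mutual information between two hierarchical layer representations
# as an auxiliary loss alongside the usual generation loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class InfoNCELayerMI(nn.Module):
    """Return an InfoNCE loss whose minimization maximizes a lower bound
    on the mutual information between two pooled layer representations."""

    def __init__(self, dim: int, temperature: float = 0.1):
        super().__init__()
        # Project both layers into a shared space before scoring.
        self.proj_low = nn.Linear(dim, dim)
        self.proj_high = nn.Linear(dim, dim)
        self.temperature = temperature

    def forward(self, h_low: torch.Tensor, h_high: torch.Tensor) -> torch.Tensor:
        # h_low, h_high: (batch, dim) pooled representations of two layers.
        z_low = F.normalize(self.proj_low(h_low), dim=-1)
        z_high = F.normalize(self.proj_high(h_high), dim=-1)
        # Similarity of every low-layer sample with every high-layer sample;
        # the diagonal holds the positive (same-example) pairs.
        logits = z_low @ z_high.t() / self.temperature
        targets = torch.arange(logits.size(0), device=logits.device)
        # Cross-entropy against the diagonal is the InfoNCE objective.
        return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    batch, dim = 8, 256
    h_layer2 = torch.randn(batch, dim)  # e.g. mean-pooled encoder layer 2
    h_layer6 = torch.randn(batch, dim)  # e.g. mean-pooled encoder layer 6
    mi_loss = InfoNCELayerMI(dim)(h_layer2, h_layer6)
    # In training one would combine it with the generation loss, e.g.:
    # total_loss = generation_loss + lambda_mi * mi_loss
    print(float(mi_loss))
```

In this formulation, pulling a layer's representation toward its same-example counterpart in another layer (while pushing it away from other examples in the batch) discourages later layers from discarding information present in earlier ones, which matches the paper's stated goal of reducing information loss across layers.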