A relation-specific attention network for joint entity and relation extraction

Publisher:
International Joint Conference on Artificial Intelligence
Publication Type:
Conference Proceeding
Citation:
IJCAI International Joint Conference on Artificial Intelligence, 2021, 2021-January, pp. 4054-4060
Issue Date:
2021-01-01
Abstract:
Joint extraction of entities and relations is an important task in natural language processing (NLP) that aims to capture all relational triplets from plain text. The task is challenging because some triplets extracted from a single sentence may share overlapping entities. Most existing methods perform entity recognition followed by relation detection between every possible entity pair, which usually incurs numerous redundant operations. In this paper, we propose a relation-specific attention network (RSAN) to address this issue. RSAN uses a relation-aware attention mechanism to construct a specific sentence representation for each relation, and then performs sequence labeling to extract the corresponding head and tail entities. Experiments on two public datasets show that our model can effectively extract overlapping triplets and achieve state-of-the-art performance.
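The abstract describes two steps: relation-aware attention that builds one sentence representation per relation, and per-relation sequence labeling over tokens to mark head and tail entities. The following is a minimal sketch of that idea in PyTorch; the layer sizes, tag set, and module structure are illustrative assumptions and not the authors' published implementation.

```python
# Sketch of relation-specific attention + per-relation sequence labeling.
# All hyperparameters and the fusion scheme are assumptions for illustration.
import torch
import torch.nn as nn


class RelationSpecificAttention(nn.Module):
    def __init__(self, hidden_dim, num_relations, rel_dim, num_tags):
        super().__init__()
        # One learned embedding per relation type.
        self.rel_emb = nn.Embedding(num_relations, rel_dim)
        # Scores how relevant each token is to a given relation.
        self.attn = nn.Linear(hidden_dim + rel_dim, 1)
        # Tags each token (e.g. BIO tags for head/tail entities) under that relation.
        self.tagger = nn.Linear(hidden_dim * 2, num_tags)

    def forward(self, token_states):
        # token_states: (batch, seq_len, hidden_dim), e.g. encoder outputs.
        batch, seq_len, _ = token_states.size()
        num_rel = self.rel_emb.num_embeddings
        # Pair every relation with every token: (batch, num_rel, seq_len, ...).
        rel = self.rel_emb.weight.view(1, num_rel, 1, -1).expand(batch, -1, seq_len, -1)
        tok = token_states.unsqueeze(1).expand(-1, num_rel, -1, -1)
        # Relation-aware attention weights over the tokens of the sentence.
        scores = self.attn(torch.cat([tok, rel], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)                      # (batch, num_rel, seq_len)
        # Relation-specific sentence representation (attention-weighted sum).
        sent = torch.einsum("brs,bsh->brh", weights, token_states)   # (batch, num_rel, hidden)
        # Fuse each token with its relation-specific sentence vector, then tag it.
        fused = torch.cat([tok, sent.unsqueeze(2).expand(-1, -1, seq_len, -1)], dim=-1)
        return self.tagger(fused)                                    # (batch, num_rel, seq_len, num_tags)


# Usage: tag logits for every (relation, token) pair of a toy batch.
model = RelationSpecificAttention(hidden_dim=128, num_relations=24, rel_dim=64, num_tags=5)
logits = model(torch.randn(2, 20, 128))
print(logits.shape)  # torch.Size([2, 24, 20, 5])
```

Decoding the per-relation tag sequences would then yield the head and tail entities of each triplet, which is how overlapping entities across different relations can coexist.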