Multi-task learning for relation extraction

Publisher: IEEE
Publication Type: Conference Proceeding
Citation: Proceedings - International Conference on Tools with Artificial Intelligence, ICTAI, 2020, 2019-November, pp. 1480-1487
Issue Date: 2020
File: 08995371.pdf (Adobe PDF, 1.3 MB)
Abstract: © 2019 IEEE. Distantly supervised relation extraction leverages knowledge bases to label training data automatically. However, distant supervision may introduce incorrect labels, which harm performance. Many efforts have been devoted to tackling this problem, but most of them treat relation extraction as a simple classification task and therefore ignore useful information from related tasks, namely dependency parsing and entity type classification. In this paper, we propose a novel Multi-Task learning framework for Relation Extraction (MTRE). We employ dependency parsing and entity type classification as auxiliary tasks and relation extraction as the target task, and we learn these tasks simultaneously from the training instances to exploit inductive transfer between the auxiliary tasks and the target task. We then construct a hierarchical neural network that incorporates dependency and entity representations from the auxiliary tasks into a relation representation that is more robust to noisy labels. The experimental results demonstrate that our model substantially improves predictive performance over single-task learning baselines.
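The abstract does not spell out the architecture, but the general multi-task setup it describes, a shared encoder feeding a relation-classification head (target task) alongside dependency-parsing and entity-typing heads (auxiliary tasks) trained jointly, can be sketched as below. This PyTorch sketch is an illustration under assumptions: the BiLSTM encoder, mean pooling, the simplified pairwise dependency scorer, the layer sizes, and the auxiliary loss weight are placeholders, not the exact MTRE design from the paper.

```python
# Minimal multi-task sketch: a shared encoder with one target head (relation
# classification) and two auxiliary heads (entity typing, dependency-head
# prediction), trained with a joint loss. Sizes and weights are illustrative
# placeholders, not the paper's configuration.
import torch
import torch.nn as nn


class MultiTaskRE(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=128,
                 num_relations=53, num_entity_types=18):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Shared sentence encoder used by all three tasks.
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True,
                               bidirectional=True)
        # Target task: sentence-level relation classification.
        self.rel_head = nn.Linear(2 * hidden, num_relations)
        # Auxiliary task 1: per-token entity type classification.
        self.ent_head = nn.Linear(2 * hidden, num_entity_types)
        # Auxiliary task 2: per-token dependency-head scoring
        # (a simplified bilinear scorer over token pairs).
        self.dep_scorer = nn.Bilinear(2 * hidden, 2 * hidden, 1)

    def forward(self, token_ids):
        h, _ = self.encoder(self.embed(token_ids))      # (B, T, 2H)
        rel_logits = self.rel_head(h.mean(dim=1))       # (B, R)
        ent_logits = self.ent_head(h)                   # (B, T, E)
        # dep_scores[b, i, j] = score of token j being the head of token i.
        B, T, H2 = h.shape
        dep_scores = self.dep_scorer(
            h.unsqueeze(2).expand(B, T, T, H2).reshape(-1, H2),
            h.unsqueeze(1).expand(B, T, T, H2).reshape(-1, H2),
        ).view(B, T, T)
        return rel_logits, ent_logits, dep_scores


def joint_loss(outputs, rel_y, ent_y, dep_y, aux_weight=0.5):
    """Weighted sum of the target-task loss and the two auxiliary losses."""
    rel_logits, ent_logits, dep_scores = outputs
    ce = nn.CrossEntropyLoss()
    loss_rel = ce(rel_logits, rel_y)
    loss_ent = ce(ent_logits.flatten(0, 1), ent_y.flatten())
    loss_dep = ce(dep_scores.flatten(0, 1), dep_y.flatten())
    return loss_rel + aux_weight * (loss_ent + loss_dep)
```

Because all three losses back-propagate through the same encoder, gradients from the auxiliary tasks shape the shared representation used for relation classification, which is the inductive-transfer effect the abstract refers to; the actual MTRE model additionally feeds the auxiliary-task representations into a hierarchical relation representation.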