SK2: Integrating Implicit Sentiment Knowledge and Explicit Syntax Knowledge for Aspect-Based Sentiment Analysis

Publisher:
ACM
Publication Type:
Conference Proceeding
Citation:
International Conference on Information and Knowledge Management, Proceedings, 2022, pp. 1114-1123
Issue Date:
2022-10-17
File:
3511808.3557452.pdf (Published version, Adobe PDF, 1.23 MB)
Abstract:
Aspect-based sentiment analysis (ABSA) plays an indispensable role in web mining and retrieval systems, as it involves a wide range of tasks, including aspect term extraction, opinion term extraction, and aspect sentiment classification. Early works are applicable to only a subset of these tasks, leading to computation-unfriendly models and pipeline frameworks. Recently, a unified framework has been proposed to learn all these ABSA tasks in an end-to-end fashion. Despite its versatility, its performance is still sub-optimal, since ABSA tasks depend heavily on both sentiment and syntax knowledge, yet existing task-specific knowledge integration methods are hardly applicable to such a unified framework. Therefore, in this work we propose a brand-new unified framework for ABSA that incorporates both implicit sentiment knowledge and explicit syntax knowledge to better complete all ABSA tasks. To effectively incorporate implicit sentiment knowledge, we first design a self-supervised pre-training procedure that is general enough to cover all ABSA tasks. It consists of a conjunctive words prediction (CWP) task, a sentiment-word polarity prediction (SPP) task, an attribute nouns prediction (ANP) task, and a sentiment-oriented masked language modeling (SMLM) task. Empowered by this pre-training procedure, our framework acquires strong abilities in sentiment representation and sentiment understanding. Meanwhile, considering that a subtle syntactic variation can significantly affect ABSA, we further explore a sparse relational graph attention network (SR-GAT) to introduce explicit aspect-oriented syntax knowledge. By combining both kinds of knowledge, our unified model can better represent and understand input texts across all ABSA tasks. Extensive experiments show that our proposed framework achieves consistent and significant improvements on all ABSA tasks.
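To give a concrete sense of the sentiment-oriented masked language modeling (SMLM) pre-training task mentioned in the abstract, the following is a minimal, hypothetical sketch, assuming that SMLM masks sentiment-bearing words (identified here by a toy lexicon) at a higher rate than ordinary tokens, so the model must recover sentiment words from context. The function name, lexicon, and masking probabilities are illustrative assumptions, not the authors' implementation.

```python
import random

# Hypothetical sketch of sentiment-oriented masking (not the paper's exact SMLM):
# tokens found in a small sentiment lexicon are masked with a higher probability
# than other tokens, biasing the MLM objective toward sentiment words.

SENTIMENT_LEXICON = {"great", "terrible", "delicious", "slow", "friendly"}  # toy lexicon

def sentiment_oriented_mask(tokens, mask_token="[MASK]",
                            p_sentiment=0.5, p_other=0.1, seed=None):
    """Return (masked_tokens, labels); labels keep the original token at masked positions."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        p = p_sentiment if tok.lower() in SENTIMENT_LEXICON else p_other
        if rng.random() < p:
            masked.append(mask_token)
            labels.append(tok)       # the model is trained to predict this token
        else:
            masked.append(tok)
            labels.append(None)      # position excluded from the MLM loss
    return masked, labels

# Example: sentiment words such as "delicious" and "slow" are masked more often.
tokens = "The pasta was delicious but the service was slow".split()
print(sentiment_oriented_mask(tokens, seed=0))
```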