Knowledge Graph enhanced Neural Collaborative Filtering with Residual Recurrent Network

Publisher:
ELSEVIER
Publication Type:
Journal Article
Citation:
Neurocomputing, 2021, 454, pp. 417-429
Issue Date:
2021-09-24
Filename:
1-s2.0-S0925231221004276-main.pdf (Published version, 2.29 MB, Adobe PDF)
Abstract:
A Knowledge Graph (KG), which typically consists of rich, interconnected facts about items, presents an unprecedented opportunity to alleviate the sparsity problem in recommender systems. However, existing KG-based recommendation methods mainly rely on handcrafted meta-path features or simple triple-level entity embeddings, which cannot automatically capture entities’ long-term relational dependencies for recommendation. In particular, entity embedding learning is not designed to combine user-item interaction information with KG context information. In this paper, a two-channel neural interaction method named Knowledge Graph enhanced Neural Collaborative Filtering with Residual Recurrent Network (KGNCF-RRN) is proposed, which leverages both the long-term relational dependencies of the KG context and user-item interactions for recommendation. (1) For the KG context interaction channel, we propose a Residual Recurrent Network (RRN) to construct context-based path embeddings, which incorporates residual learning into traditional recurrent neural networks (RNNs) to efficiently encode the long-term relational dependencies of the KG. A self-attention network is then applied to the path embeddings to capture the polysemy of various user interaction behaviours. (2) For the user-item interaction channel, the user and item embeddings are fed into a newly designed two-dimensional interaction map. (3) Finally, on top of the two-channel neural interaction matrix, we employ a convolutional neural network to learn complex correlations between users and items. Extensive experimental results on three benchmark datasets show that the proposed approach outperforms existing state-of-the-art approaches for knowledge graph based recommendation.
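
To make the two-channel design more concrete, the following is a minimal PyTorch sketch, not the authors' released code. The module names (ResidualGRUEncoder, TwoChannelKGNCF), embedding sizes, the exact construction of each channel's interaction map, and the CNN configuration are all hypothetical assumptions for illustration; the paper's actual RRN, self-attention, and interaction-map designs may differ in detail.

import torch
import torch.nn as nn


class ResidualGRUEncoder(nn.Module):
    # Sketch of the RRN idea: a GRU over KG path entity embeddings with a
    # residual connection from the input to the recurrent output, followed by
    # self-attention over path positions.
    def __init__(self, dim):
        super().__init__()
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.attn = nn.MultiheadAttention(dim, num_heads=2, batch_first=True)

    def forward(self, path_emb):                       # (batch, path_len, dim)
        h, _ = self.gru(path_emb)
        h = h + path_emb                               # residual learning over the RNN
        ctx, _ = self.attn(h, h, h)                    # self-attention on the path embedding
        return ctx.mean(dim=1)                         # (batch, dim) KG-context vector


class TwoChannelKGNCF(nn.Module):
    # Sketch of the two-channel design: a KG-context map and a user-item
    # interaction map are stacked and passed to a small CNN, then a scorer.
    def __init__(self, n_users, n_items, n_entities, dim=16):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)
        self.entity_emb = nn.Embedding(n_entities, dim)
        self.path_encoder = ResidualGRUEncoder(dim)
        self.cnn = nn.Sequential(
            nn.Conv2d(2, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.scorer = nn.Sequential(nn.Linear(8 * 4 * 4, 1), nn.Sigmoid())

    def forward(self, users, items, paths):            # paths: (batch, path_len) entity ids
        u = self.user_emb(users)                       # (batch, dim)
        v = self.item_emb(items)                       # (batch, dim)
        kg_ctx = self.path_encoder(self.entity_emb(paths))   # (batch, dim)

        # Channel 1: interaction map between the user embedding and the KG context.
        kg_map = torch.einsum("bi,bj->bij", u, kg_ctx)
        # Channel 2: outer-product interaction map of user and item embeddings.
        ui_map = torch.einsum("bi,bj->bij", u, v)

        x = torch.stack([kg_map, ui_map], dim=1)       # (batch, 2, dim, dim)
        feats = self.cnn(x).flatten(1)
        return self.scorer(feats).squeeze(-1)          # predicted interaction probability


# Toy usage with random inputs, just to show tensor shapes.
model = TwoChannelKGNCF(n_users=100, n_items=200, n_entities=500)
users = torch.randint(0, 100, (8,))
items = torch.randint(0, 200, (8,))
paths = torch.randint(0, 500, (8, 5))
print(model(users, items, paths).shape)                # torch.Size([8])
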