Adaptive knowledge subgraph ensemble for robust and trustworthy knowledge graph completion
- Publisher: Springer (part of Springer Nature)
- Publication Type: Journal Article
- Citation: World Wide Web, 2020, 23(1), pp. 471-490
- Issue Date: 2020
Closed Access
Filename | Description | Size
---|---|---
Wan2020_Article_AdaptiveKnowledgeSubgraphEnsem.pdf | Published version | 1.21 MB
This item is closed access and not available.
© 2019, Springer Science+Business Media, LLC, part of Springer Nature.

Knowledge graph (KG) embedding approaches are widely used to infer underlying missing facts based on intrinsic structure information. However, the presence of noisy facts in automatically extracted or crowdsourced KGs significantly reduces the reliability of various embedding learners. In this paper, we thoroughly study the underlying reasons for the performance drop in dealing with noisy knowledge graphs, and we propose an ensemble framework, Adaptive Knowledge Subgraph Ensemble (AKSE), to enhance the robustness and trustworthiness of knowledge graph completion. By employing an effective knowledge subgraph extraction approach to re-sample sub-components from the original knowledge graph, AKSE generates different representations for learning diversified base learners (e.g., TransE and DistMult), which substantially alleviates the noise effect of KG embedding. All embedding learners are integrated into a unified framework to reduce generalization errors via our simple or adaptive weighting schemes, where each weight is allocated based on the individual learner's prediction capacity. Experimental results show that our ensemble framework is more robust than existing knowledge graph embedding approaches on KGs with manually injected noise as well as on inherently noisy extracted KGs.
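To make the ensemble idea in the abstract concrete, the sketch below illustrates one plausible reading of it: re-sample subgraphs from a noisy triple set, score triples with standard TransE and DistMult scoring functions, and combine the base scorers with weights derived from their validation performance. This is a minimal illustration under stated assumptions, not the authors' implementation; all function and variable names (e.g., `sample_subgraph`, `AdaptiveEnsemble`, the accuracy-proportional weighting) are hypothetical.

```python
# Minimal sketch (assumed design, not the paper's code): an ensemble of KG
# embedding scorers trained on re-sampled knowledge subgraphs, combined with
# weights proportional to each base learner's validation accuracy.
import numpy as np

rng = np.random.default_rng(0)

def sample_subgraph(triples, keep_ratio=0.8):
    """Re-sample a knowledge subgraph by keeping a random subset of triples."""
    mask = rng.random(len(triples)) < keep_ratio
    return [t for t, keep in zip(triples, mask) if keep]

def transe_score(h, r, t):
    """TransE: a triple is more plausible when h + r is close to t."""
    return -float(np.linalg.norm(h + r - t))

def distmult_score(h, r, t):
    """DistMult: trilinear product of head, relation, and tail embeddings."""
    return float(np.sum(h * r * t))

class AdaptiveEnsemble:
    """Combine base scorers with weights based on their prediction capacity."""

    def __init__(self, scorers):
        # scorers: list of (score_fn, embeddings) pairs, one per base learner,
        # where embeddings = {"ent": {name: vector}, "rel": {name: vector}}.
        self.scorers = scorers
        self.weights = np.ones(len(scorers)) / len(scorers)  # simple weighting

    def fit_weights(self, val_accuracies):
        # Adaptive weighting (assumed scheme): better learners count more.
        acc = np.asarray(val_accuracies, dtype=float)
        self.weights = acc / acc.sum()

    def score(self, head, rel, tail):
        # Weighted sum of base-learner scores for a candidate triple.
        return sum(
            w * fn(emb["ent"][head], emb["rel"][rel], emb["ent"][tail])
            for w, (fn, emb) in zip(self.weights, self.scorers)
        )
```

In this reading, each base learner is trained on a different re-sampled subgraph, so noisy facts are unlikely to appear in every learner's training data, and the weighted aggregation further discounts learners whose predictions degrade. The actual subgraph extraction and weighting schemes used by AKSE are described in the paper itself.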