Graph-based Semi-Supervised Learning by Strengthening Local Label Consistency
- Publisher:
- ACM
- Publication Type:
- Conference Proceeding
- Citation:
- International Conference on Information and Knowledge Management, Proceedings, 2021, pp. 3201-3205
- Issue Date:
- 2021-10-26
Closed Access
| Filename | Description | Size |
|---|---|---|
| 3459637.3482114.pdf | Published version | 2.69 MB |
This item is closed access and not available.
Graph-based algorithms have drawn much attention thanks to their impressive success in semi-supervised setups. For better model performance, previous studies have learned to transform the topology of the input graph. However, these works focus only on optimizing the original nodes and edges, leaving the direction of augmenting existing data insufficiently explored. In this paper, we propose a novel heuristic pre-processing technique, namely Local Label Consistency Strengthening (LLCS), which automatically expands new nodes and edges to refine the label consistency within a dense subgraph. Our framework can effectively benefit downstream models by substantially enlarging the original training set with high-quality generated labeled data and refining the original graph topology. To justify the generality and practicality of LLCS, we couple it with the popular graph convolutional network and graph attention network to perform extensive evaluations on three standard datasets. In all setups tested, our method boosts the average accuracy by a large margin of 4.7% and consistently outperforms the state-of-the-art.
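The abstract describes expanding the labeled set by exploiting label agreement within dense neighborhoods. As a rough illustration of that general idea (not the paper's actual LLCS algorithm — the function name, adjacency representation, and agreement threshold below are all illustrative assumptions), one could sketch a pseudo-labeling pass that assigns a label to an unlabeled node when its labeled neighbors agree strongly enough on a single class:

```python
# Hedged sketch of neighborhood-consistency pseudo-labeling.
# NOT the paper's LLCS method; threshold and names are illustrative.
from collections import Counter

def expand_labels(adj, labels, threshold=0.8):
    """Pseudo-label unlabeled nodes whose labeled neighbors agree on
    one class with relative frequency >= threshold."""
    new_labels = dict(labels)
    for node, neighbors in adj.items():
        if node in new_labels:
            continue
        votes = Counter(labels[n] for n in neighbors if n in labels)
        if not votes:
            continue  # no labeled neighbors to vote
        cls, count = votes.most_common(1)[0]
        if count / sum(votes.values()) >= threshold:
            new_labels[node] = cls
    return new_labels

# Toy graph: node 2 is unlabeled, but both of its labeled
# neighbors (0 and 1) carry class "A", so it gets pseudo-labeled.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
labels = {0: "A", 1: "A"}
print(expand_labels(adj, labels))  # {0: 'A', 1: 'A', 2: 'A'}
```

The enlarged label set could then be fed to a downstream classifier such as a GCN, in the spirit of the pipeline the abstract outlines.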