Enhancing Locally Adaptive Smoothing of Graph Neural Networks Via Laplacian Node Disagreement

Publisher:
IEEE COMPUTER SOC
Publication Type:
Journal Article
Citation:
IEEE Transactions on Knowledge and Data Engineering, 2024, 36(3), pp. 1099-1112
Issue Date:
2024-03-01
Filename:
1662105.pdf (Published version, 2.33 MB, Adobe PDF)
Abstract:
Graph neural networks (GNNs) are designed to perform inference on data described by graph-structured node features and topology information. From the perspective of graph signal denoising, the typical message passing schemes of GNNs act as a globally uniform smoothing that minimizes disagreements between the embeddings of connected nodes. However, the level of smoothing should differ across regions of the graph, especially in inter-class regions. This mismatch limits the expressiveness of GNNs and renders them fragile to over-smoothing, long-range dependencies, and non-homophilous settings. In this paper, we find that the node disagreements of the initial graph features provide more trustworthy constraints on node embeddings, thereby enhancing the locally adaptive smoothing of GNNs. To propagate the inherent disagreements of nodes, we propose the Laplacian node disagreement, which jointly measures the initial features and the output embeddings. With this measurement, we present a new graph signal denoising objective that derives a more effective message passing scheme, and we incorporate it into a GNN architecture named the Laplacian node disagreement-based GNN (LND-GNN). Based on its output node representations, we further integrate an auxiliary disagreement constraint into the overall classification loss. Experiments demonstrate the expressive ability of LND-GNN on the downstream semi-supervised node classification task.
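The abstract frames message passing as graph signal denoising, with smoothing that should weaken where the initial node features already disagree. The following is a minimal, illustrative NumPy sketch of that general idea; the function adaptive_propagate, its parameters alpha and beta, and the exponential edge-weighting rule are assumptions made for illustration, not the paper's actual LND-GNN propagation scheme or its Laplacian node disagreement measure.

# Hedged sketch (not the paper's LND-GNN): message passing as graph signal
# denoising, where each edge's smoothing weight is attenuated by how strongly
# the *initial* features of its endpoints disagree. The weighting rule and all
# names here are illustrative assumptions.
import numpy as np

def adaptive_propagate(X, edges, num_steps=10, alpha=0.5, beta=1.0):
    """Iteratively smooth node features X over the graph, down-weighting each
    edge by the disagreement of its endpoints' initial features.

    X      : (n, d) initial node feature matrix
    edges  : list of undirected edges (i, j)
    alpha  : trade-off between fitting X and smoothing over the graph
    beta   : sensitivity to initial-feature disagreement
    """
    n, _ = X.shape
    # Per-edge weights: edges whose endpoints disagree in the initial
    # features are smoothed less (assumed, illustrative rule).
    W = np.zeros((n, n))
    for i, j in edges:
        disagreement = np.linalg.norm(X[i] - X[j]) ** 2
        w = np.exp(-beta * disagreement)
        W[i, j] = W[j, i] = w
    deg = W.sum(axis=1) + 1e-12
    A_norm = W / np.sqrt(np.outer(deg, deg))  # symmetrically normalized adjacency

    # Fixed-point iteration for the denoising objective
    #   min_H ||H - X||_F^2 + lam * tr(H^T (I - A_norm) H),  lam = alpha / (1 - alpha),
    # whose stationarity condition gives H <- (1 - alpha) * X + alpha * A_norm @ H.
    H = X.copy()
    for _ in range(num_steps):
        H = (1 - alpha) * X + alpha * (A_norm @ H)
    return H

# Toy usage: two tight clusters of nodes joined by a single inter-class edge.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
edges = [(0, 1), (2, 3), (1, 2)]  # edge (1, 2) bridges the two clusters
H = adaptive_propagate(X, edges)
print(np.round(H, 3))

In this sketch, the bridging edge (1, 2) receives a small weight because its endpoints' initial features differ strongly, so smoothing acts mostly within each cluster and the two groups of embeddings stay separated.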