Temporal Network Embedding for Link Prediction via VAE Joint Attention Mechanism.

Publisher:
IEEE (Institute of Electrical and Electronics Engineers, Inc.)
Publication Type:
Journal Article
Citation:
IEEE Transactions on Neural Networks and Learning Systems, vol. 33, no. 12, pp. 7400-7413, Dec. 2022
Issue Date:
2022-12
Abstract:
Network representation learning, or embedding, aims to project a network into a low-dimensional space that can then be used for various network analysis tasks. Temporal networks are an important class of networks whose topological structure changes over time. Compared with methods for static networks, temporal network embedding (TNE) methods face three challenges: 1) describing the temporal dependence across network snapshots; 2) making node embeddings in the latent space reflect changes in the network topology; and 3) avoiding a large amount of redundant computation when embedding a series of snapshots. To overcome these problems, we propose a novel TNE method, TVAE, built on the variational autoencoder (VAE) framework, which captures the evolution of temporal networks for link prediction. It not only generates low-dimensional embedding vectors for nodes but also preserves the dynamic nonlinear features of temporal networks. By combining a self-attention mechanism with recurrent neural networks, TVAE updates node representations while preserving their temporal dependence over time. We use parameter inheritance, rather than explicit regularization, to keep each new embedding close to the previous one, which makes the method efficient for large-scale networks. We evaluate our model against several baselines on synthetic datasets and real-world networks. The experimental results demonstrate that TVAE achieves superior performance at a lower time cost than the baselines.
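The abstract outlines three building blocks: a VAE that maps each snapshot to low-dimensional node embeddings, a self-attention plus recurrent update that carries temporal dependence across snapshots, and parameter inheritance in place of an explicit smoothness regularizer. The following is a minimal PyTorch sketch of how these pieces could fit together; it is not the authors' implementation, and the class name SnapshotVAE, the layer sizes, and the single-linear encoder and decoder are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SnapshotVAE(nn.Module):
    """Encode one adjacency snapshot into latent node embeddings (hypothetical sketch)."""

    def __init__(self, num_nodes, hidden_dim=64, latent_dim=16):
        super().__init__()
        self.encoder = nn.Linear(num_nodes, hidden_dim)        # each node's adjacency row -> hidden vector
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)          # one recurrent step per snapshot
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        self.decoder = nn.Linear(latent_dim, num_nodes)        # reconstruct adjacency rows for link prediction

    def forward(self, adj, h_prev=None):
        # adj: (N, N) snapshot; h_prev: (N, hidden) state carried over from the previous snapshot.
        h = F.relu(self.encoder(adj))                          # (N, hidden)
        h, _ = self.attn(h.unsqueeze(0), h.unsqueeze(0), h.unsqueeze(0))
        h = h.squeeze(0)                                       # self-attention over nodes
        if h_prev is None:
            h_prev = torch.zeros_like(h)
        h = self.gru(h, h_prev)                                # temporal dependence across snapshots
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # VAE reparameterization
        recon = torch.sigmoid(self.decoder(z))                 # predicted link probabilities
        return recon, mu, logvar, h


def vae_loss(recon, adj, mu, logvar):
    # Reconstruction (link) term plus KL regularizer on the latent space.
    bce = F.binary_cross_entropy(recon, adj)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kl


# Parameter inheritance: the same model (and its recurrent state) is carried from
# snapshot to snapshot, so each step starts from the previous weights instead of
# retraining from scratch or adding an explicit smoothness regularizer.
snapshots = [torch.bernoulli(torch.full((50, 50), 0.1)) for _ in range(3)]  # toy snapshots
model, h_state = SnapshotVAE(num_nodes=50), None
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for adj in snapshots:
    recon, mu, logvar, h_state = model(adj, h_state)
    loss = vae_loss(recon, adj, mu, logvar)
    opt.zero_grad()
    loss.backward()
    opt.step()
    h_state = h_state.detach()                                 # keep temporal state, cut the gradient
```

Detaching the recurrent state between snapshots is one simple way to inherit temporal context without backpropagating through the full history; the paper's actual inheritance scheme may differ in detail.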