Abstract
We formalize networks with evolving structures as temporal networks and propose a generative link prediction model, Generative Link Sequence Modeling (GLSM), to predict future links for temporal networks. GLSM captures the temporal link formation patterns from the observed links with a sequence modeling framework and can generate emerging links by sampling from the inferred probability distribution over potential future links. To avoid the overfitting caused by treating each link as a unique token, we propose a self-tokenization mechanism that automatically transforms each raw link in the network into an abstract aggregation token. The self-tokenization is seamlessly integrated into the sequence modeling framework, which gives GLSM the generalization capability to discover link formation patterns beyond raw link sequences. We compare GLSM with existing state-of-the-art methods on five real-world datasets. The experimental results demonstrate that GLSM obtains future positive links effectively in a generative fashion while achieving the best performance (2-10% improvements on AUC) among the alternatives. Copyright © 2020 Springer Science+Business Media, LLC, part of Springer Nature.
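To make the self-tokenization idea concrete, here is a minimal toy sketch (not the authors' implementation): raw temporal links are mapped to abstract aggregation tokens so a sequence model sees a small vocabulary instead of one unique token per link. The degree-bucketing rule below is a hypothetical stand-in for whatever aggregation GLSM learns; in the actual model the tokenization is integrated with an RNN over the resulting token sequence.

```python
from collections import Counter

def self_tokenize(links):
    """Map each (u, v, t) link to an abstract token from endpoint degree buckets.

    Illustrative only: GLSM learns its tokenization; here we hand-craft one
    to show how distinct raw links can collapse onto shared tokens.
    """
    deg = Counter()
    for u, v, _ in links:
        deg[u] += 1
        deg[v] += 1
    bucket = lambda d: "hi" if d >= 3 else "lo"  # crude bucketing (assumption)
    # Sort by timestamp so the token sequence reflects temporal order,
    # ready to feed into a sequence model such as an RNN.
    return [f"{bucket(deg[u])}-{bucket(deg[v])}"
            for u, v, _ in sorted(links, key=lambda x: x[2])]

links = [("a", "b", 1), ("a", "c", 2), ("a", "d", 3), ("b", "c", 4)]
print(self_tokenize(links))  # → ['hi-lo', 'hi-lo', 'hi-lo', 'lo-lo']
```

Note how four distinct raw links collapse to a two-token vocabulary, which is the generalization effect the abstract describes: the sequence model learns patterns over token classes rather than memorizing individual links.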
| Original language | English |
| --- | --- |
| Pages (from-to) | 2471-2488 |
| Journal | World Wide Web |
| Volume | 23 |
| Early online date | May 2020 |
| DOIs | 10.1007/s11280-020-00821-y |
| Publication status | Published - Jul 2020 |
Citation
Wang, Y., Zhang, C., Wang, S., Yu, P. S., Bai, L., Cui, L., & Xu, G. (2020). Generative temporal link prediction via self-tokenized sequence modeling. World Wide Web, 23, 2471-2488. https://doi.org/10.1007/s11280-020-00821-y

Keywords
- Temporal link prediction
- Sequence modeling
- Recurrent neural network
- Self-tokenization mechanism