Generative temporal link prediction via self-tokenized sequence modeling

Yue WANG, Chenwei ZHANG, Shen WANG, Philip S. YU, Lu BAI, Lixin CUI, Guandong XU

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

We formalize networks with evolving structures as temporal networks and propose a generative link prediction model, Generative Link Sequence Modeling (GLSM), to predict future links for temporal networks. GLSM captures temporal link formation patterns from the observed links with a sequence modeling framework and can generate emerging links by inferring a probability distribution over potential future links. To avoid the overfitting caused by treating each link as a unique token, we propose a self-tokenization mechanism that automatically transforms each raw link in the network into an abstract aggregation token. The self-tokenization is seamlessly integrated into the sequence modeling framework, which gives GLSM the generalization capability to discover link formation patterns beyond raw link sequences. We compare GLSM with existing state-of-the-art methods on five real-world datasets. The experimental results demonstrate that GLSM effectively generates future positive links while achieving the best performance among the compared methods (2-10% improvement in AUC). Copyright © 2020 Springer Science+Business Media, LLC, part of Springer Nature.
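The abstract outlines two components: a self-tokenization step that aggregates raw links into a compact token vocabulary, and a recurrent sequence model that learns link formation patterns and generates future links by sampling. The sketch below is a minimal, hypothetical illustration of that pipeline, not the authors' GLSM implementation: the degree-bucket tokenizer, the GRU architecture, and all hyperparameters are assumptions chosen only to make the idea concrete.

```python
# Illustrative sketch (assumptions, not the paper's code):
# (1) map raw links to coarser "aggregation" tokens so the vocabulary stays small,
# (2) fit a recurrent sequence model over the token stream and sample a future token.

import torch
import torch.nn as nn
from collections import defaultdict

def self_tokenize(links, num_buckets=4):
    """Map each raw link (u, v) to an abstract token.

    Here endpoints are bucketed by degree and the (bucket_u, bucket_v) pair is the
    token -- a simple stand-in for the paper's self-tokenization mechanism.
    """
    degree = defaultdict(int)
    for u, v in links:
        degree[u] += 1
        degree[v] += 1
    max_deg = max(degree.values())
    bucket = {n: min(num_buckets - 1, d * num_buckets // (max_deg + 1))
              for n, d in degree.items()}
    token_ids, vocab = [], {}
    for u, v in links:
        tok = (bucket[u], bucket[v])
        if tok not in vocab:
            vocab[tok] = len(vocab)
        token_ids.append(vocab[tok])
    return token_ids, vocab

class LinkSequenceModel(nn.Module):
    """GRU language model over link tokens; predicts the next token's distribution."""
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):                  # tokens: (batch, seq_len)
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h)                      # (batch, seq_len, vocab_size)

# Toy usage: train on an observed link sequence, then sample a future link token.
links = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3), (3, 4), (0, 4)]
token_ids, vocab = self_tokenize(links)
seq = torch.tensor(token_ids).unsqueeze(0)      # shape (1, seq_len)

model = LinkSequenceModel(vocab_size=len(vocab))
optim = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for _ in range(200):                            # next-token prediction objective
    logits = model(seq[:, :-1])
    loss = loss_fn(logits.reshape(-1, len(vocab)), seq[:, 1:].reshape(-1))
    optim.zero_grad()
    loss.backward()
    optim.step()

with torch.no_grad():                           # generative step: sample next token
    probs = torch.softmax(model(seq)[0, -1], dim=-1)
    next_token = torch.multinomial(probs, 1).item()
    print("sampled next link token id:", next_token)
```

In this toy version the sampled token identifies a group of candidate links rather than a single edge; how the actual GLSM decodes tokens back to concrete links is described in the paper itself.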

Original language: English
Pages (from-to): 2471-2488
Journal: World Wide Web
Volume: 23
Early online date: May 2020
DOIs: https://doi.org/10.1007/s11280-020-00821-y
Publication status: Published - Jul 2020

Citation

Wang, Y., Zhang, C., Wang, S., Yu, P. S., Bai, L., Cui, L., & Xu, G. (2020). Generative temporal link prediction via self-tokenized sequence modeling. World Wide Web, 23, 2471-2488. https://doi.org/10.1007/s11280-020-00821-y

Keywords

  • Temporal link prediction
  • Sequence modeling
  • Recurrent neural network
  • Self-tokenization mechanism
