Multi-level graph knowledge contrastive learning

Haoran YANG, Yuhao WANG, Xiangyu ZHAO, Hongxu CHEN, Hongzhi YIN, Qing LI, Guandong XU

Research output: Contribution to journal › Article › peer-review

Abstract

Graph Contrastive Learning (GCL) stands as a potent framework for unsupervised graph representation learning that has gained traction across numerous graph learning applications. The effectiveness of GCL relies on generating high-quality contrasting samples, which enhance the model's ability to discern graph semantics. However, the prevailing GCL methods face two key challenges: 1) introducing noise during graph augmentations and 2) requiring additional storage for generated samples, both of which degrade model performance. In this paper, we propose novel approaches, GKCL (i.e., Graph Knowledge Contrastive Learning) and DGKCL (i.e., Distilled Graph Knowledge Contrastive Learning), that leverage multi-level graph knowledge to create noise-free contrasting pairs. This framework not only addresses the noise-related challenges but also circumvents excessive storage demands. Furthermore, our method incorporates a knowledge distillation component to optimize the trained embedding tables, reducing the model's scale while ensuring superior performance, particularly in scenarios with smaller embedding sizes. Copyright © 2024 IEEE.
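The abstract names two generic building blocks: an InfoNCE-style contrastive objective over pairs of node embeddings, and a distillation step that compresses a trained embedding table into a smaller one. The snippet below is a minimal, illustrative sketch of those two standard components only, not the authors' GKCL/DGKCL implementation; the tensor names, dimensions, and the use of a random "knowledge-derived" positive view are assumptions made purely for demonstration.

```python
# Illustrative sketch (not the paper's code): (1) an InfoNCE contrastive loss
# where the positive view stands in for a knowledge-derived, noise-free pair,
# and (2) fitting a smaller student embedding table to a trained teacher table.
import torch
import torch.nn.functional as F


def info_nce(anchor: torch.Tensor, positive: torch.Tensor, tau: float = 0.2) -> torch.Tensor:
    """Contrastive loss: row i of `anchor` should match row i of `positive`."""
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    logits = a @ p.t() / tau            # [N, N] cosine-similarity matrix
    labels = torch.arange(a.size(0))    # positives lie on the diagonal
    return F.cross_entropy(logits, labels)


def distill_embeddings(teacher: torch.Tensor, student_dim: int, steps: int = 200) -> torch.nn.Embedding:
    """Fit a smaller student table by regressing a projection of it onto the teacher."""
    n, d = teacher.shape
    student = torch.nn.Embedding(n, student_dim)
    proj = torch.nn.Linear(student_dim, d)   # maps student space back to teacher space
    opt = torch.optim.Adam(list(student.parameters()) + list(proj.parameters()), lr=1e-2)
    ids = torch.arange(n)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.mse_loss(proj(student(ids)), teacher.detach())
        loss.backward()
        opt.step()
    return student


if __name__ == "__main__":
    torch.manual_seed(0)
    anchor = torch.randn(64, 128)                    # e.g., node embeddings from a GNN encoder
    positive = anchor + 0.05 * torch.randn(64, 128)  # stand-in for a knowledge-derived view
    print("contrastive loss:", info_nce(anchor, positive).item())
    student = distill_embeddings(torch.randn(100, 128), student_dim=32)
    print("student table shape:", tuple(student.weight.shape))
```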

Original language: English
Pages (from-to): 8829-8841
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 36
Issue number: 12
Early online date: Sept 2024
DOIs: https://doi.org/10.1109/TKDE.2024.3466530
Publication status: Published - Dec 2024

Citation

Yang, H., Wang, Y., Zhao, X., Chen, H., Yin, H., Li, Q., & Xu, G. (2024). Multi-level graph knowledge contrastive learning. IEEE Transactions on Knowledge and Data Engineering, 36(12), 8829-8841. https://doi.org/10.1109/TKDE.2024.3466530

Keywords

  • Graph representation learning
  • Graph contrastive learning
  • Knowledge distillation
