Abstract
Graph Contrastive Learning (GCL) is a potent framework for unsupervised graph representation learning that has gained traction across numerous graph learning applications. The effectiveness of GCL relies on generating high-quality contrasting samples, which enhance the model's ability to discern graph semantics. However, prevailing GCL methods face two key challenges that degrade model performance: 1) they introduce noise during graph augmentation, and 2) they require additional storage for the generated samples. In this paper, we propose two novel approaches, GKCL (Graph Knowledge Contrastive Learning) and DGKCL (Distilled Graph Knowledge Contrastive Learning), which leverage multi-level graph knowledge to create noise-free contrasting pairs. This framework not only addresses the noise-related challenges but also avoids excessive storage demands. Furthermore, our method incorporates a knowledge distillation component to optimize the trained embedding tables, reducing the model's scale while ensuring superior performance, particularly in scenarios with smaller embedding sizes. Copyright © 2024 IEEE.
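To make the two ingredients mentioned in the abstract more concrete (contrasting pairs over node embeddings, plus a distillation term that compresses a trained embedding table into a smaller student), the following is a minimal, hypothetical sketch in PyTorch. It is not the paper's GKCL/DGKCL implementation: the InfoNCE objective, the MSE distillation term, and all names such as `info_nce`, `distillation_loss`, and the 0.1 weighting are illustrative assumptions.

```python
# Hedged sketch of a contrastive objective combined with a distillation term,
# in the general spirit of the abstract. All names and choices here are
# illustrative assumptions, not the paper's actual GKCL/DGKCL method.
import torch
import torch.nn.functional as F


def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Standard InfoNCE loss between two views of the same node embeddings.

    z1, z2: (num_nodes, dim) embeddings of the same nodes under two views.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                  # pairwise cosine similarities
    labels = torch.arange(z1.size(0))           # positives sit on the diagonal
    return F.cross_entropy(logits, labels)


def distillation_loss(student: torch.Tensor, teacher: torch.Tensor) -> torch.Tensor:
    """Match a smaller student embedding table to a frozen teacher.

    Assumes student and teacher were already projected to the same dimension.
    """
    return F.mse_loss(student, teacher.detach())


if __name__ == "__main__":
    # Usage sketch: combine the two objectives with an assumed weighting factor.
    n, d = 128, 64
    view_a, view_b = torch.randn(n, d), torch.randn(n, d)
    teacher_emb = torch.randn(n, d)
    student_emb = torch.randn(n, d, requires_grad=True)
    loss = info_nce(view_a, view_b) + 0.1 * distillation_loss(student_emb, teacher_emb)
    loss.backward()
```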
| Original language | English |
| --- | --- |
| Pages (from-to) | 8829-8841 |
| Journal | IEEE Transactions on Knowledge and Data Engineering |
| Volume | 36 |
| Issue number | 12 |
| Early online date | Sept 2024 |
| DOIs | https://doi.org/10.1109/TKDE.2024.3466530 |
| Publication status | Published - Dec 2024 |
Citation
Yang, H., Wang, Y., Zhao, X., Chen, H., Yin, H., Li, Q., & Xu, G. (2024). Multi-level graph knowledge contrastive learning. IEEE Transactions on Knowledge and Data Engineering, 36(12), 8829-8841. https://doi.org/10.1109/TKDE.2024.3466530

Keywords
- Graph representation learning
- Graph contrastive learning
- Knowledge distillation