An empirical study towards prompt-tuning for graph contrastive pre-training in recommendations

Haoran YANG, Xiangyu ZHAO, Yicong LI, Hongxu CHEN, Guandong XU

Research output: Chapter in Book/Report/Conference proceeding › Chapter

3 Citations (Scopus)

Abstract

Graph contrastive learning (GCL) has emerged as an effective technique for various graph learning tasks. It has been successfully applied in real-world recommender systems, where the contrastive loss and the downstream recommendation objectives are combined to form the overall objective function. However, this approach deviates from the original GCL paradigm, which pre-trains graph embeddings without involving downstream training objectives. In this paper, we propose a novel framework, CPTPP, which enhances GCL-based recommender systems via prompt tuning, allowing us to fully exploit the advantages of the original GCL protocol. Specifically, we first summarize user profiles in graph recommender systems to automatically generate personalized user prompts. These prompts are then combined with pre-trained user embeddings for prompt tuning in downstream tasks, helping bridge the gap between pre-training and downstream tasks. Our extensive experiments on three benchmark datasets confirm the effectiveness of CPTPP against state-of-the-art baselines. Additionally, a visualization experiment shows that user embeddings generated by CPTPP have a more uniform distribution, indicating improved modeling capability for user preferences. The implementation code is available online for reproducibility. Copyright © 2023 Neural Information Processing Systems Foundation.
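The abstract's pipeline (frozen GCL-pre-trained user embeddings, a generator that turns a user-profile vector into a personalized prompt, and a fusion layer tuned on the downstream task) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation: all names (`PromptedUserEncoder`, `prompt_gen`, `fuse`) and dimensions are assumptions.

```python
import torch
import torch.nn as nn

class PromptedUserEncoder(nn.Module):
    """Hypothetical sketch of prompt tuning over frozen pre-trained embeddings."""

    def __init__(self, num_users, embed_dim=64, prompt_dim=64):
        super().__init__()
        # Stand-in for GCL pre-trained user embeddings; frozen during prompt tuning.
        self.user_emb = nn.Embedding(num_users, embed_dim)
        self.user_emb.weight.requires_grad = False
        # Prompt generator: maps a summarized user-profile vector to a personal prompt.
        self.prompt_gen = nn.Sequential(
            nn.Linear(embed_dim, prompt_dim),
            nn.ReLU(),
            nn.Linear(prompt_dim, prompt_dim),
        )
        # Fusion layer: combines the prompt with the frozen embedding; only the
        # prompt generator and this layer receive gradients downstream.
        self.fuse = nn.Linear(embed_dim + prompt_dim, embed_dim)

    def forward(self, user_ids, profile_vecs):
        frozen = self.user_emb(user_ids)        # (B, embed_dim), no gradient
        prompt = self.prompt_gen(profile_vecs)  # (B, prompt_dim)
        return self.fuse(torch.cat([frozen, prompt], dim=-1))

enc = PromptedUserEncoder(num_users=100)
out = enc(torch.tensor([0, 1]), torch.randn(2, 64))
```

The key design choice mirrored here is that pre-training and downstream tuning stay decoupled: the contrastive pre-training produces `user_emb` once, and only the small prompt/fusion parameters are optimized against the recommendation objective.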

Original language: English
Title of host publication: Advances in neural information processing systems
Place of publication: New Orleans, USA
Publisher: Neural Information Processing Systems Foundation
Volume: 36
ISBN (Print): 9781713899921
Publication status: Published - 2023

Citation

Yang, H., Zhao, X., Li, Y., Chen, H., & Xu, G. (2023). An empirical study towards prompt-tuning for graph contrastive pre-training in recommendations. In Advances in neural information processing systems (Vol. 36). Neural Information Processing Systems Foundation. https://proceedings.neurips.cc/paper_files/paper/2023/hash/c6af791af7ef0f3e02bccef011211ca5-Abstract-Conference.html
