UPRec: User-Aware Pre-training for Recommender Systems

02/22/2021
by   Chaojun Xiao, et al.

Existing sequential recommendation methods rely on large amounts of training data and usually suffer from the data sparsity problem. To tackle this, the pre-training mechanism has been widely adopted: it leverages large-scale data to perform self-supervised learning and transfers the pre-trained parameters to downstream tasks. However, previous pre-trained models for recommendation focus on leveraging universal sequence patterns from user behaviour sequences and item information, while ignoring personalized interests captured by heterogeneous user information, which has been shown to contribute effectively to personalized recommendation. In this paper, we propose a method to enhance pre-trained models with heterogeneous user information, called User-aware Pre-training for Recommendation (UPRec). Specifically, UPRec leverages user attributes and structured social graphs to construct self-supervised objectives in the pre-training stage, and proposes two user-aware pre-training tasks. Comprehensive experimental results on several real-world large-scale recommendation datasets demonstrate that UPRec can effectively integrate user information into pre-trained models and thus provide more appropriate recommendations for users.
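The abstract does not spell out the two user-aware objectives, but the general recipe it describes (self-supervised losses built from user attributes and social links, trained alongside the sequence encoder) can be sketched minimally. The sketch below is illustrative only, not the paper's implementation: the mean-pooled encoder, the attribute head `attr_w`, and both loss functions are stand-ins chosen for brevity.

```python
import math
import random

random.seed(0)

DIM = 8       # embedding dimension (illustrative)
N_ITEMS = 20  # toy item vocabulary
N_ATTR = 3    # e.g. 3 age buckets (hypothetical user attribute)

# Toy item embedding table. In a real model these come from a trainable
# sequence encoder (e.g. a Transformer); a random table stands in here.
item_emb = [[random.gauss(0, 0.1) for _ in range(DIM)] for _ in range(N_ITEMS)]

def encode(seq):
    """Mean-pool item embeddings as a stand-in user representation."""
    vecs = [item_emb[i] for i in seq]
    return [sum(v[d] for v in vecs) / len(vecs) for d in range(DIM)]

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

# Hypothetical linear head mapping a user representation to attribute logits.
attr_w = [[random.gauss(0, 0.1) for _ in range(DIM)] for _ in range(N_ATTR)]

def attribute_loss(user_vec, attr_label):
    """Task 1 (sketch): predict a user attribute from the sequence
    representation; cross-entropy against the true attribute label."""
    logits = [sum(w[d] * user_vec[d] for d in range(DIM)) for w in attr_w]
    probs = softmax(logits)
    return -math.log(probs[attr_label])

def social_loss(vec_a, vec_b, connected):
    """Task 2 (sketch): predict whether two users are linked in the social
    graph from the similarity of their representations (binary CE)."""
    score = sum(a * b for a, b in zip(vec_a, vec_b))
    p = 1.0 / (1.0 + math.exp(-score))
    return -math.log(p) if connected else -math.log(1.0 - p)

# Two toy users: behaviour sequences, one attribute label, one social edge.
u1, u2 = encode([1, 4, 7]), encode([2, 4, 9])
total_loss = attribute_loss(u1, attr_label=1) + social_loss(u1, u2, connected=True)
print(round(total_loss, 4))
```

In training, both losses would be summed with the standard sequence-modelling objective and backpropagated through the shared encoder, so the user representations absorb attribute and social-graph signal during pre-training.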


research
09/19/2020

Knowledge Transfer via Pre-training for Recommendation: A Review and Prospect

Recommender systems aim to provide item recommendations for users, and a...
research
05/06/2023

Attacking Pre-trained Recommendation

Recently, a series of pioneer studies have shown the potency of pre-trai...
research
10/27/2020

Contrastive Pre-training for Sequential Recommendation

Sequential recommendation methods play a crucial role in modern recommen...
research
06/08/2023

COURIER: Contrastive User Intention Reconstruction for Large-Scale Pre-Train of Image Features

With the development of the multi-media internet, visual characteristics...
research
06/12/2021

Curriculum Pre-Training Heterogeneous Subgraph Transformer for Top-N Recommendation

Due to the flexibility in modelling data heterogeneity, heterogeneous in...
research
08/20/2023

Enhancing Transformers without Self-supervised Learning: A Loss Landscape Perspective in Sequential Recommendation

Transformer and its variants are a powerful class of architectures for s...
research
02/18/2023

Ensemble Ranking Model with Multiple Pretraining Strategies for Web Search

An effective ranking model usually requires a large amount of training d...
