Parameter-Efficient Transfer from Sequential Behaviors for User Modeling and Recommendation

by Fajie Yuan, et al.

Inductive transfer learning has greatly impacted computer vision and NLP, but it has seen little use in recommender systems. Although a large body of research generates recommendations by modeling user-item interaction sequences, few of these works attempt to represent and transfer such models to serve downstream tasks where only limited data exists. In this paper, we focus on effectively learning a single user representation that can be applied to a diversity of tasks, from cross-domain recommendations to user profile predictions. Fine-tuning a large pre-trained network and adapting it to downstream tasks is an effective way to solve such tasks. However, fine-tuning is parameter-inefficient, since an entire copy of the model has to be re-trained for every new task. To overcome this issue, we develop a parameter-efficient transfer learning architecture, termed PeterRec, which can be configured on the fly for various downstream tasks. Specifically, PeterRec leaves the pre-trained parameters unaltered during fine-tuning by injecting a series of re-learned neural networks, which are small yet as expressive as learning the entire network. We perform extensive ablation experiments to show the effectiveness of the learned user representation on five downstream tasks. Moreover, we show that PeterRec performs efficient transfer learning across multiple domains, achieving performance comparable to, and sometimes better than, fine-tuning all model parameters.
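To make the core idea concrete, here is a minimal sketch (not the paper's implementation; dimensions, initialization, and the bottleneck shape are illustrative assumptions) of injecting a small re-learned module into a frozen pre-trained layer. The backbone weights stay fixed; only the tiny bottleneck "patch", wired in with a residual connection, would be trained per downstream task:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions chosen for illustration only.
d_model, d_bottleneck = 256, 16

# "Pre-trained" backbone layer: these weights stay frozen during fine-tuning.
W_frozen = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)

# Small injected patch (bottleneck with a residual connection):
# only these parameters are re-learned for each new downstream task.
W_down = rng.standard_normal((d_model, d_bottleneck)) / np.sqrt(d_model)
W_up = np.zeros((d_bottleneck, d_model))  # zero init: patch starts as a no-op

def layer_with_patch(x):
    """Frozen backbone layer followed by the small trainable patch."""
    h = x @ W_frozen
    return h + np.maximum(h @ W_down, 0.0) @ W_up  # residual bottleneck

x = rng.standard_normal((4, d_model))
y = layer_with_patch(x)

frozen_params = W_frozen.size
patch_params = W_down.size + W_up.size
print(y.shape)                       # (4, 256)
print(patch_params / frozen_params)  # the patch is a small fraction of the backbone
```

Because `W_up` starts at zero, the patched layer initially reproduces the frozen backbone exactly, so fine-tuning begins from the pre-trained model's behavior while updating only a small fraction of the parameters per task.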

