Attention Calibration for Transformer-based Sequential Recommendation

08/18/2023
by Peilin Zhou, et al.

Transformer-based sequential recommendation (SR) has flourished in recent years, with the self-attention mechanism as its key component. Self-attention is widely believed to effectively select the informative and relevant items from a sequence of interacted items for next-item prediction by learning larger attention weights for those items. However, this may not always hold in practice. Our empirical analysis of several representative Transformer-based SR models reveals that it is not uncommon for large attention weights to be assigned to less relevant items, which can result in inaccurate recommendations. Through further in-depth analysis, we identify two factors that may contribute to such inaccurate assignment of attention weights: sub-optimal position encoding and noisy input. In this paper, we aim to address this significant yet challenging gap in existing work. Specifically, we propose a simple yet effective framework called Attention Calibration for Transformer-based Sequential Recommendation (AC-TSR). In AC-TSR, a novel spatial calibrator and an adversarial calibrator are designed to directly calibrate incorrectly assigned attention weights. The former explicitly captures the spatial relationships (i.e., order and distance) among items for a more precise calculation of attention weights. The latter redistributes the attention weights based on each item's contribution to the next-item prediction. AC-TSR is readily adaptable and can be seamlessly integrated into various existing Transformer-based SR models. Extensive experimental results on four benchmark real-world datasets demonstrate the superiority of the proposed AC-TSR via significant recommendation performance improvements. The source code is available at https://github.com/AIM-SE/AC-TSR.
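
The abstract does not give the calibrators' exact formulations, but the spatial calibrator's core idea, injecting the order and distance relationships among items directly into the attention computation, can be sketched as learned biases added to the attention logits before the softmax. The PyTorch snippet below is a minimal, hypothetical illustration under that assumption; the module name SpatialCalibratedAttention and the distance_bias/order_bias parameterization are invented for exposition and are not the authors' implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialCalibratedAttention(nn.Module):
    """Single-head self-attention with a spatial bias on the attention
    logits. Hypothetical sketch of the spatial-calibration idea, not the
    AC-TSR implementation."""

    def __init__(self, hidden_dim: int, max_len: int):
        super().__init__()
        self.q_proj = nn.Linear(hidden_dim, hidden_dim)
        self.k_proj = nn.Linear(hidden_dim, hidden_dim)
        self.v_proj = nn.Linear(hidden_dim, hidden_dim)
        # Assumed parameterization: one learned scalar bias per pairwise
        # distance |i - j| and per order relation (j before/at/after i).
        self.distance_bias = nn.Embedding(max_len, 1)
        self.order_bias = nn.Embedding(3, 1)
        self.scale = hidden_dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_dim)
        seq_len = x.size(1)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        logits = torch.matmul(q, k.transpose(-2, -1)) * self.scale

        pos = torch.arange(seq_len, device=x.device)
        rel = pos.unsqueeze(1) - pos.unsqueeze(0)            # entry [i, j] = i - j
        dist = rel.abs().clamp(max=self.distance_bias.num_embeddings - 1)
        order = (torch.sign(rel) + 1).long()                 # 0, 1, 2
        logits = (logits
                  + self.distance_bias(dist).squeeze(-1)
                  + self.order_bias(order).squeeze(-1))

        # A causal mask (as in SASRec-style SR models) would normally be
        # applied here; omitted for brevity.
        weights = F.softmax(logits, dim=-1)  # calibrated attention weights
        return torch.matmul(weights, v)
```

For instance, `SpatialCalibratedAttention(hidden_dim=64, max_len=50)` maps a `(batch, 50, 64)` input to an output of the same shape; the adversarial calibrator would then further redistribute `weights` according to each item's estimated contribution to the next-item prediction.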

Related research

04/23/2022 · Decoupled Side Information Fusion for Sequential Recommendation
Side information fusion for sequential recommendation (SR) aims to effec...

10/04/2021 · HyperTeNet: Hypergraph and Transformer-based Neural Network for Personalized List Continuation
The personalized list continuation (PLC) task is to curate the next item...

01/16/2022 · Sequential Recommendation via Stochastic Self-Attention
Sequential recommendation models the dynamics of a user's previous behav...

05/18/2022 · AdaMCT: Adaptive Mixture of CNN-Transformer for Sequential Recommendation
Sequential recommendation (SR) aims to model users' dynamic preferences ...

12/08/2022 · Denoising Self-attentive Sequential Recommendation
Transformer-based sequential recommenders are very powerful for capturin...

07/30/2020 · Interpretable Contextual Team-aware Item Recommendation: Application in Multiplayer Online Battle Arena Games
The video game industry has adopted recommendation systems to boost user...

12/12/2022 · Tensor-based Sequential Learning via Hankel Matrix Representation for Next Item Recommendations
Self-attentive transformer models have recently been shown to solve the ...
