Paper Title


Denoising Self-attentive Sequential Recommendation

Authors

Huiyuan Chen, Yusan Lin, Menghai Pan, Lan Wang, Chin-Chia Michael Yeh, Xiaoting Li, Yan Zheng, Fei Wang, Hao Yang

Abstract

Transformer-based sequential recommenders are very powerful for capturing both short-term and long-term sequential item dependencies. This is mainly attributed to their unique self-attention networks, which exploit pairwise item-item interactions within the sequence. However, real-world item sequences are often noisy, which is particularly true for implicit feedback. For example, a large portion of clicks do not align well with user preferences, and many products end up with negative reviews or being returned. As such, the current user action only depends on a subset of items, not on the entire sequence. Many existing Transformer-based models use full attention distributions, which inevitably assign some credit to irrelevant items. This may lead to sub-optimal performance if the Transformer is not regularized properly.
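The issue the abstract describes can be seen directly in the softmax used by standard attention: because softmax is strictly positive, every item in the sequence receives a nonzero attention weight. Below is a minimal illustrative sketch (not the paper's denoising method); the scores are hypothetical dot-product attention scores from the current query item to five items in a sequence, with index 2 standing in for a noisy click.

```python
import math

# Hypothetical attention scores from the query to 5 sequence items;
# the low score at index 2 represents an irrelevant/noisy item.
scores = [2.0, 1.5, -3.0, 1.0, 2.5]

# Full softmax attention distribution, as in a standard Transformer.
m = max(scores)
exps = [math.exp(s - m) for s in scores]
total = sum(exps)
weights = [e / total for e in exps]

# Softmax is strictly positive: even the noisy item gets nonzero credit.
assert all(w > 0 for w in weights)
assert abs(sum(weights) - 1.0) < 1e-9
```

Even with a very negative score, `weights[2]` is small but never exactly zero, which is why unregularized full attention inevitably leaks some credit to irrelevant items.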
