Paper Title
A Unified Model for Recommendation with Selective Neighborhood Modeling
Authors
Abstract
Neighborhood-based recommenders are a major class of Collaborative Filtering (CF) models. The intuition is to exploit neighbors with similar preferences to bridge unseen user-item pairs and alleviate data sparsity. Many existing works propose neural attention networks that aggregate neighbors and place higher weights on specific subsets of users for recommendation. However, neighborhood information is not always informative, and noise in the neighborhood can negatively affect model performance. To address this issue, we propose a novel neighborhood-based recommender in which a hybrid gated network is designed to automatically separate similar neighbors from dissimilar (noisy) ones and aggregate the similar neighbors into neighborhood representations. Confidence in the neighborhood is also modeled: higher weights are placed on the neighborhood representations when we are confident in the neighborhood information, and vice versa. In addition, a user-neighbor component is proposed to explicitly regularize user-neighbor proximity in the latent space. These two components are combined into a unified model that complements each part for the recommendation task. Extensive experiments on three publicly available datasets show that the proposed model consistently outperforms state-of-the-art neighborhood-based recommenders. We also study different variants of the proposed model to justify the underlying intuition of the proposed hybrid gated network and user-neighbor modeling components.
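The gating-and-confidence idea described in the abstract can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual architecture: a sigmoid of the user-neighbor dot product serves as a soft gate that downweights dissimilar (noisy) neighbors, and the mean gate acts as a confidence score that blends the neighborhood representation with the user's own embedding.

```python
import numpy as np

def gated_neighbor_aggregation(user_vec, neighbor_vecs):
    """Sketch of selective neighborhood aggregation (assumed, simplified form).

    user_vec:      (d,)   user embedding
    neighbor_vecs: (n, d) neighbor embeddings
    Returns a (d,) representation combining the user and a gated
    aggregate of the neighbors.
    """
    # Soft gate per neighbor from dot-product similarity to the user;
    # dissimilar neighbors receive gates near 0 and contribute little.
    gates = 1.0 / (1.0 + np.exp(-(neighbor_vecs @ user_vec)))
    # Gate-weighted average forms the neighborhood representation.
    neighborhood = (gates[:, None] * neighbor_vecs).sum(axis=0) / (gates.sum() + 1e-8)
    # Confidence in the neighborhood: mean gate. Low confidence
    # falls back toward the user's own embedding.
    conf = gates.mean()
    return conf * neighborhood + (1.0 - conf) * user_vec

rng = np.random.default_rng(0)
user = rng.normal(size=8)
neighbors = rng.normal(size=(5, 8))
rep = gated_neighbor_aggregation(user, neighbors)
print(rep.shape)  # (8,)
```

In the paper's actual model, the gates and confidence would be produced by a learned hybrid gated network and trained end-to-end with the user-neighbor proximity regularizer; the sketch above only conveys the separation-then-weighting intuition.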