Paper Title
Recursive Attentive Methods with Reused Item Representations for Sequential Recommendation
Paper Authors
Paper Abstract
Sequential recommendation aims to recommend the next item of users' interest based on their historical interactions. Recently, the self-attention mechanism has been adapted for sequential recommendation and has demonstrated state-of-the-art performance. However, in this manuscript, we show that self-attention-based sequential recommendation methods could suffer from the localization-deficit issue. As a consequence, in these methods, over the first few blocks, the item representations may quickly diverge from their original representations and thus impair the learning in the following blocks. To mitigate this issue, we develop a recursive attentive method with reused item representations (RAM) for sequential recommendation. We compare RAM with five state-of-the-art baseline methods on six public benchmark datasets. Our experimental results demonstrate that RAM significantly outperforms the baseline methods on the benchmark datasets, with an improvement of as much as 11.3%. Our stability analysis shows that RAM could enable deeper and wider models for better performance. Our run-time comparison indicates that RAM could also be more efficient on the benchmark datasets.
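To make the core idea concrete, the following is a minimal NumPy sketch of "recursive attention with reused item representations": a single shared attention block is applied recursively, and the original item embeddings are re-injected as keys and values at every step so that intermediate representations stay anchored to the initial ones. This is an illustrative assumption about the general mechanism, not the paper's actual RAM architecture; all function names, the residual connection, and the parameter sharing are assumptions for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_block(h, e, W):
    # Single-head attention over the sequence. The ORIGINAL item
    # embeddings `e` are reused as keys/values, so each recursion
    # step attends back to the initial representations rather than
    # to representations that have already drifted.
    q, k, v = h @ W["q"], e @ W["k"], e @ W["v"]
    scores = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return scores @ v + h  # residual connection (an assumption)

def recursive_attention(e, W, depth=3):
    # Apply ONE shared block recursively `depth` times (parameter
    # sharing across steps), always reusing the original embeddings.
    h = e
    for _ in range(depth):
        h = attention_block(h, e, W)
    return h

rng = np.random.default_rng(0)
d = 8                          # embedding dimension
e = rng.normal(size=(5, d))    # 5 items in an interaction sequence
W = {n: rng.normal(size=(d, d)) / np.sqrt(d) for n in ("q", "k", "v")}
out = recursive_attention(e, W)
print(out.shape)  # → (5, 8)
```

Contrast this with a standard stacked self-attention encoder, where block *t* attends only over the output of block *t − 1*; there, nothing ties deep-layer representations back to the original item embeddings, which is the drift the abstract describes.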