BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer
F. Sun, J. Liu, J. Wu, C. Pei, X. Lin, W. Ou, and P. Jiang. Proceedings of the 28th ACM International Conference on Information and Knowledge Management, pages 1441–1450. New York, NY, USA, Association for Computing Machinery, (2019)
DOI: 10.1145/3357384.3357895
Abstract
Modeling users' dynamic preferences from their historical behaviors is challenging and crucial for recommendation systems. Previous methods employ sequential neural networks to encode users' historical interactions from left to right into hidden representations for making recommendations. Despite their effectiveness, we argue that such left-to-right unidirectional models are sub-optimal due to the limitations including: (a) unidirectional architectures restrict the power of hidden representation in users' behavior sequences; and (b) they often assume a rigidly ordered sequence, which is not always practical. To address these limitations, we proposed a sequential recommendation model called BERT4Rec, which employs the deep bidirectional self-attention to model user behavior sequences. To avoid the information leakage and efficiently train the bidirectional model, we adopt the Cloze objective to sequential recommendation, predicting the random masked items in the sequence by jointly conditioning on their left and right context. In this way, we learn a bidirectional representation model to make recommendations by allowing each item in user historical behaviors to fuse information from both left and right sides. Extensive experiments on four benchmark datasets show that our model outperforms various state-of-the-art sequential models consistently.
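The Cloze objective described in the abstract (randomly masking items in a behavior sequence and predicting them from both left and right context) can be sketched as follows. This is a minimal illustrative masking routine, not the paper's implementation; the `MASK` id and `mask_prob` default are assumptions for the sketch.

```python
import random

MASK = 0  # hypothetical item id reserved for the [mask] token

def cloze_mask(sequence, mask_prob=0.15, rng=random):
    """Randomly replace items with MASK; return the masked sequence and targets.

    Targets hold the original item id at each masked position (None elsewhere),
    so a bidirectional model can be trained to predict each masked item by
    jointly conditioning on its left and right context.
    """
    masked, targets = [], []
    for item in sequence:
        if rng.random() < mask_prob:
            masked.append(MASK)
            targets.append(item)
        else:
            masked.append(item)
            targets.append(None)
    return masked, targets
```

Because masking (unlike left-to-right next-item prediction) exposes future items as context, the model must never see the true item at a masked position, which is why the id is replaced rather than merely flagged.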
Description
Really interesting, peer-reviewed, and possibly one of the main papers in this area
%0 Conference Paper
%1 10.1145/3357384.3357895
%A Sun, Fei
%A Liu, Jun
%A Wu, Jian
%A Pei, Changhua
%A Lin, Xiao
%A Ou, Wenwu
%A Jiang, Peng
%B Proceedings of the 28th ACM International Conference on Information and Knowledge Management
%C New York, NY, USA
%D 2019
%I Association for Computing Machinery
%K thema:Visualbert(Sequential)
%P 1441–1450
%R 10.1145/3357384.3357895
%T BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer
%U https://doi.org/10.1145/3357384.3357895
%X Modeling users' dynamic preferences from their historical behaviors is challenging and crucial for recommendation systems. Previous methods employ sequential neural networks to encode users' historical interactions from left to right into hidden representations for making recommendations. Despite their effectiveness, we argue that such left-to-right unidirectional models are sub-optimal due to the limitations including: (a) unidirectional architectures restrict the power of hidden representation in users' behavior sequences; and (b) they often assume a rigidly ordered sequence, which is not always practical. To address these limitations, we proposed a sequential recommendation model called BERT4Rec, which employs the deep bidirectional self-attention to model user behavior sequences. To avoid the information leakage and efficiently train the bidirectional model, we adopt the Cloze objective to sequential recommendation, predicting the random masked items in the sequence by jointly conditioning on their left and right context. In this way, we learn a bidirectional representation model to make recommendations by allowing each item in user historical behaviors to fuse information from both left and right sides. Extensive experiments on four benchmark datasets show that our model outperforms various state-of-the-art sequential models consistently.
%@ 9781450369763
@inproceedings{10.1145/3357384.3357895,
abstract = {Modeling users' dynamic preferences from their historical behaviors is challenging and crucial for recommendation systems. Previous methods employ sequential neural networks to encode users' historical interactions from left to right into hidden representations for making recommendations. Despite their effectiveness, we argue that such left-to-right unidirectional models are sub-optimal due to the limitations including: (a) unidirectional architectures restrict the power of hidden representation in users' behavior sequences; and (b) they often assume a rigidly ordered sequence, which is not always practical. To address these limitations, we proposed a sequential recommendation model called BERT4Rec, which employs the deep bidirectional self-attention to model user behavior sequences. To avoid the information leakage and efficiently train the bidirectional model, we adopt the Cloze objective to sequential recommendation, predicting the random masked items in the sequence by jointly conditioning on their left and right context. In this way, we learn a bidirectional representation model to make recommendations by allowing each item in user historical behaviors to fuse information from both left and right sides. Extensive experiments on four benchmark datasets show that our model outperforms various state-of-the-art sequential models consistently.},
added-at = {2021-06-05T13:38:49.000+0200},
address = {New York, NY, USA},
author = {Sun, Fei and Liu, Jun and Wu, Jian and Pei, Changhua and Lin, Xiao and Ou, Wenwu and Jiang, Peng},
biburl = {https://www.bibsonomy.org/bibtex/23eaef9e11ee63c7c5a1876c6a33e29d9/jan_seidlein},
booktitle = {Proceedings of the 28th ACM International Conference on Information and Knowledge Management},
description = {Really interesting, peer reviewed and possibly one of the main papers},
doi = {10.1145/3357384.3357895},
interhash = {aa54f875d7c91f45115f84114560f8cd},
intrahash = {3eaef9e11ee63c7c5a1876c6a33e29d9},
isbn = {9781450369763},
keywords = {thema:Visualbert(Sequential)},
location = {Beijing, China},
numpages = {10},
pages = {1441--1450},
publisher = {Association for Computing Machinery},
series = {CIKM '19},
timestamp = {2021-06-05T13:38:49.000+0200},
title = {BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer},
url = {https://doi.org/10.1145/3357384.3357895},
year = 2019
}