Engineer friends often ask me: Graph Deep Learning sounds great, but are there any big commercial success stories? Is it being deployed in practical applications? Besides the obvious ones, such as the recommendation systems at Pinterest, Alibaba, and Twitter, a slightly more nuanced success story is the Transformer architecture, which has taken the NLP industry by storm. Through this post, I want to establish links between Graph Neural Networks (GNNs) and Transformers. I'll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how the two communities could work together to drive progress.