
Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation.

Mozhdeh Gheini, Xiang Ren, and Jonathan May. EMNLP (1), pages 1754-1765. Association for Computational Linguistics, 2021.
