Inproceedings

Cross-Attention is All You Need: Adapting Pretrained Transformers for Machine Translation.

Mozhdeh Gheini, Xiang Ren, and Jonathan May.
EMNLP (1), pages 1754–1765. Association for Computational Linguistics, 2021.
