Edinburgh's Submission to the WMT 2022 Efficiency Task.

, , , , , , and . WMT, page 661-667. Association for Computational Linguistics, (2022)


Other publications of authors with the same name

- Efficient Machine Translation with Model Pruning and Quantization., , , , , , , , , and 1 other author(s). WMT@EMNLP, page 775-780. Association for Computational Linguistics, (2021)
- The Llama 3 Herd of Models., , , , , , , , , and 91 other author(s). CoRR, (2024)
- A New Massive Multilingual Dataset for High-Performance Language Technologies., , , , , , , , , and 3 other author(s). LREC/COLING, page 1116-1128. ELRA and ICCL, (2024)
- Findings of the WMT 2022 Shared Task on Efficient Translation., , , , and . WMT, page 100-108. Association for Computational Linguistics, (2022)
- Edinburgh's Submission to the WMT 2022 Efficiency Task., , , , , , and . WMT, page 661-667. Association for Computational Linguistics, (2022)
- The Llama 3 Herd of Models, , , , , , , , , and 523 other author(s). (2024)
- OpusCleaner and OpusTrainer, open source toolkits for training Machine Translation and Large language models., , , , , , , , , and . CoRR, (2023)
- HPLT: High Performance Language Technologies., , , , , , , and . EAMT, page 517-518. European Association for Machine Translation, (2023)