Rakuten's Participation in WAT 2021: Examining the Effectiveness of Pre-trained Models for Multilingual and Multimodal Machine Translation.

, , , , and . WAT@ACL/IJCNLP, pp. 96-105. Association for Computational Linguistics, (2021)

Other publications by persons with the same name

Goku's Participation in WAT 2020., and . WAT@AACL/IJCNLP, pp. 135-141. Association for Computational Linguistics, (2020)

The NICT translation system for IWSLT 2012., , and . IWSLT, pp. 121-125. ISCA, (2012)

Rakuten's Participation in WAT 2022: Parallel Dataset Filtering by Leveraging Vocabulary Heterogeneity., , , , , and . WAT@COLING, pp. 68-72. International Conference on Computational Linguistics, (2022)

Sakura at SemEval-2023 Task 2: Data Augmentation via Translation., , and . SemEval@ACL, pp. 1718-1722. Association for Computational Linguistics, (2023)

Sarah's Participation in WAT 2019., , and . WAT@EMNLP-IJCNLP, pp. 152-158. Association for Computational Linguistics, (2019)