Author of the publication

Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters.

CoRR, (2021)

Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication are displayed.

Other publications of authors with the same name

Taken out of context: On measuring situational awareness in LLMs. CoRR, (2023)
Robustification of Multilingual Language Models to Real-world Noise in Crosslingual Zero-shot Settings with Robust Contrastive Pretraining. EACL, page 1367-1383. Association for Computational Linguistics, (2023)
Deep Transformers with Latent Depth. NeurIPS, (2020)
Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation. CoRR, (2020)
BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning. ICML, volume 97 of Proceedings of Machine Learning Research, page 5986-5995. PMLR, (2019)
Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters. WMT@EMNLP, page 578-598. Association for Computational Linguistics, (2021)
Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation. EACL, page 3440-3453. Association for Computational Linguistics, (2021)
The Reversal Curse: LLMs trained on "A is B" fail to learn "B is A". CoRR, (2023)
Robustification of Multilingual Language Models to Real-world Noise with Robust Contrastive Pretraining. CoRR, (2022)
Diverse Ensembles Improve Calibration. CoRR, (2020)