Please choose a person to relate this publication to.

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.


Other publications by persons with the same name

- Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation. CoRR (2020)
- BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning. ICML, volume 97 of Proceedings of Machine Learning Research, pp. 5986-5995. PMLR (2019)
- Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters. WMT@EMNLP, pp. 578-598. Association for Computational Linguistics (2021)
- Taken out of context: On measuring situational awareness in LLMs. CoRR (2023)
- Robustification of Multilingual Language Models to Real-world Noise in Crosslingual Zero-shot Settings with Robust Contrastive Pretraining. EACL, pp. 1367-1383. Association for Computational Linguistics (2023)
- Deep Transformers with Latent Depth. NeurIPS (2020)
- The Reversal Curse: LLMs trained on "A is B" fail to learn "B is A". CoRR (2023)
- Robustification of Multilingual Language Models to Real-world Noise with Robust Contrastive Pretraining. CoRR (2022)
- Recipes for Adapting Pre-trained Monolingual and Multilingual Models to Machine Translation. EACL, pp. 3440-3453. Association for Computational Linguistics (2021)
- Diverse Ensembles Improve Calibration. CoRR (2020)