Author of the publication

Please choose the person to link this publication to.

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed. You can also use the button next to the name to show some publications already assigned to that person.


Other publications of authors with the same name

- XLM-E: Cross-lingual Language Model Pre-training via ELECTRA. CoRR, (2021)
- On the Representation Collapse of Sparse Mixture of Experts. CoRR, (2022)
- Multilingual Machine Translation Systems from Microsoft for WMT21 Shared Task. CoRR, (2021)
- Dispersion Based Similarity for Mining Similar Papers in Citation Network. ICDM Workshops, pages 524-531. IEEE Computer Society, (2015)
- Language Is Not All You Need: Aligning Perception with Language Models. NeurIPS, (2023)
- Bootstrapping a high quality multilingual multimodal dataset for Bletchley. ACML, volume 189 of Proceedings of Machine Learning Research, pages 738-753. PMLR, (2022)
- DeltaLM: Encoder-Decoder Pre-training for Language Generation and Translation by Augmenting Pretrained Multilingual Encoders. CoRR, (2021)
- mT6: Multilingual Pretrained Text-to-Text Transformer with Translation Pairs. EMNLP (1), pages 1671-1683. Association for Computational Linguistics, (2021)
- On the Representation Collapse of Sparse Mixture of Experts. NeurIPS, (2022)
- XLM-E: Cross-lingual Language Model Pre-training via ELECTRA. ACL (1), pages 6170-6182. Association for Computational Linguistics, (2022)