Other publications of authors with the same name

- Discovering Representation Sprachbund For Multilingual Pre-Training. EMNLP (Findings), pages 881-894. Association for Computational Linguistics, 2021.
- Resource Central: Understanding and Predicting Workloads for Improved Resource Management in Large Cloud Platforms. SOSP, pages 153-167. ACM, 2017.
- Multilingual Machine Translation Systems from Microsoft for WMT21 Shared Task. CoRR, 2021.
- Scalable and Efficient MoE Training for Multitask Multilingual Models. CoRR, 2021.
- Gating Dropout: Communication-efficient Regularization for Sparsely Activated Transformers. CoRR, 2022.
- DeltaLM: Encoder-Decoder Pre-training for Language Generation and Translation by Augmenting Pretrained Multilingual Encoders. CoRR, 2021.
- Toward ML-centric cloud platforms. Commun. ACM, 63(2): 50-59, 2020.
- Improving Multilingual Translation by Representation and Gradient Regularization. EMNLP (1), pages 7266-7279. Association for Computational Linguistics, 2021.
- Gating Dropout: Communication-efficient Regularization for Sparsely Activated Transformers. ICML, volume 162 of Proceedings of Machine Learning Research, pages 13782-13792. PMLR, 2022.
- XLM-T: Scaling up Multilingual Machine Translation with Pretrained Cross-lingual Transformer Encoders. CoRR, 2020.