
Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.


Other publications by persons with the same name

Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation. NGT@ACL, pp. 110-118. Association for Computational Linguistics, (2020)

Compressing BERT: Studying the Effects of Weight Pruning on Transfer Learning. RepL4NLP@ACL, pp. 143-155. Association for Computational Linguistics, (2020)

Explaining Sequence-Level Knowledge Distillation as Data-Augmentation for Neural Machine Translation. CoRR, (2019)

Data and Parameter Scaling Laws for Neural Machine Translation. EMNLP (1), pp. 5915-5922. Association for Computational Linguistics, (2021)