
Other publications by persons with the same name

Training with Quantization Noise for Extreme Model Compression, , , , , , и . (2020)cite arxiv:2004.07320.Improving Wikipedia verifiability with AI., , , , , , , , , и 4 other автор(ы). Nat. Mac. Intell., 5 (10): 1142-1148 (октября 2023)Looking for ELMo's friends: Sentence-Level Pretraining Beyond Language Modeling., , , , , , , , , и 6 other автор(ы). CoRR, (2018)PaSS: Parallel Speculative Sampling., , и . CoRR, (2023)Self-training Improves Pre-training for Natural Language Understanding., , , , , , , и . NAACL-HLT, стр. 5408-5418. Association for Computational Linguistics, (2021)Adaptive Attention Span in Transformers., , , и . ACL (1), стр. 331-335. Association for Computational Linguistics, (2019)Flashlight: Enabling Innovation in Tools for Machine Learning., , , , , , , , , и 4 other автор(ы). ICML, том 162 из Proceedings of Machine Learning Research, стр. 10557-10574. PMLR, (2022)Training with Quantization Noise for Extreme Model Compression., , , , , , и . ICLR, OpenReview.net, (2021)End-to-end ASR: from Supervised to Semi-Supervised Learning with Modern Architectures., , , , , , , , и . CoRR, (2019)Self-training Improves Pre-training for Natural Language Understanding., , , , , , , и . CoRR, (2020)