Author of the publication

Self-training Improves Pre-training for Natural Language Understanding.

NAACL-HLT, pages 5408-5418. Association for Computational Linguistics, 2021.


Other publications of authors with the same name

Multilingual Speech Translation from Efficient Finetuning of Pretrained Models. ACL/IJCNLP (1), pages 827-838. Association for Computational Linguistics, 2021.
CCNet: Extracting High Quality Monolingual Datasets from Web Crawl Data. LREC, pages 4003-4012. European Language Resources Association, 2020.
SentEval: An Evaluation Toolkit for Universal Sentence Representations. LREC, European Language Resources Association (ELRA), 2018.
Supervised Contrastive Learning for Pre-trained Language Model Fine-tuning. arXiv:2011.01403, 2020.
Textually Pretrained Speech Language Models. CoRR, 2023.
Unsupervised Speech Recognition. NeurIPS, pages 27826-27839, 2021.
XLS-R: Self-supervised Cross-lingual Speech Representation Learning at Scale. INTERSPEECH, pages 2278-2282. ISCA, 2022.
Unsupervised Cross-Lingual Representation Learning for Speech Recognition. Interspeech, pages 2426-2430. ISCA, 2021.
Learning distributed representations of sentences using neural networks (Apprentissage et applications de représentations multilingues distribuées). Le Mans University, France, 2019.
Self-training Improves Pre-training for Natural Language Understanding. NAACL-HLT, pages 5408-5418. Association for Computational Linguistics, 2021.