
Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.


Other publications by persons with the same name

Efficient Self-supervised Learning with Contextualized Target Representations for Vision, Speech and Language, , , and . https://ai.facebook.com/blog/ai-self-supervised-learning-data2vec/, (2022). cite arxiv:2212.07525.

Facebook FAIR's WMT19 News Translation Task Submission., , , , , and . WMT (2), pp. 314-319. Association for Computational Linguistics, (2019)

Neural Network-based Word Alignment through Score Aggregation., , and . WMT, pp. 66-73. The Association for Computer Linguistics, (2016)

Cloze-driven Pretraining of Self-attention Networks., , , , and . EMNLP/IJCNLP (1), pp. 5359-5368. Association for Computational Linguistics, (2019)

Simple and Effective Noisy Channel Modeling for Neural Machine Translation., , and . EMNLP/IJCNLP (1), pp. 5695-5700. Association for Computational Linguistics, (2019)

wav2vec 2.0: A Framework for Self-Supervised Learning of Speech Representations., , , and . NeurIPS, (2020)

Multilingual Speech Translation from Efficient Finetuning of Pretrained Models., , , , , , , , and . ACL/IJCNLP (1), pp. 827-838. Association for Computational Linguistics, (2021)

A Comparison of Loopy Belief Propagation and Dual Decomposition for Integrated CCG Supertagging and Parsing., and . ACL, pp. 470-480. The Association for Computer Linguistics, (2011)

Efficient Self-supervised Learning with Contextualized Target Representations for Vision, Speech and Language., , , and . ICML, volume 202 of Proceedings of Machine Learning Research, pp. 1416-1429. PMLR, (2023)

Expected F-Measure Training for Shift-Reduce Parsing with Recurrent Neural Networks., , and . HLT-NAACL, pp. 210-220. The Association for Computational Linguistics, (2016)