
Other publications by persons with the same name

Quand être absent de mBERT n'est que le commencement : Gérer de nouvelles langues à l'aide de modèles de langues multilingues (When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models)., , , and . TALN-RECITAL, pp. 450-451. ATALA, (2022)

Modélisation et analyse des coordinations elliptiques par l'exploitation dynamique des forêts de dérivation., and . TALN (Posters), pp. 609-618. ATALA, (2006)

The French Social Media Bank: a Treebank of Noisy User Generated Content., , , , and . COLING, pp. 2441-2458. Indian Institute of Technology Bombay, (2012)

Contextualized Diachronic Word Representations., and . LChange@ACL, pp. 35-47. Association for Computational Linguistics, (2019)

Enhancing BERT for Lexical Normalization., , and . W-NUT@EMNLP, pp. 297-306. Association for Computational Linguistics, (2019)

From Noisy Questions to Minecraft Texts: Annotation Challenges in Extreme Syntax Scenario., , and . NUT@COLING, pp. 13-23. The COLING 2016 Organizing Committee, (2016)

Treebanking User-Generated Content: A Proposal for a Unified Representation in Universal Dependencies., , , , , , , , , and . LREC, pp. 5240-5250. European Language Resources Association, (2020)

Cloaked Classifiers: Pseudonymization Strategies on Sensitive Classification Tasks., , , and . CoRR, (2024)

Phonetic Normalization for Machine Translation of User Generated Content., , and . W-NUT@EMNLP, pp. 407-416. Association for Computational Linguistics, (2019)

Can Character-based Language Models Improve Downstream Task Performances In Low-Resource And Noisy Language Scenarios?, , and . W-NUT, pp. 423-436. Association for Computational Linguistics, (2021)