Author of the publication

Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT.

, , , and . EACL, pages 2522-2532. Association for Computational Linguistics, (2021)


Other publications of authors with the same name

Separating the Wheat from the Chaff with BREAD: An open-source benchmark and metrics to detect redundancy in text., , and . CoRR, (2023)

Multilingual BERT has an accent: Evaluating English influences on fluency in multilingual models., , and . EACL (Findings), pages 1164-1170. Association for Computational Linguistics, (2023)

Learning Music Helps You Read: Using Transfer to Study Linguistic Structure in Language Models., and . EMNLP (1), pages 6829-6839. Association for Computational Linguistics, (2020)

Pretraining on Non-linguistic Structure as a Tool for Analyzing Learning Bias in Language Models., and . CoRR, (2020)

When classifying grammatical role, BERT doesn't care about word order... except when it matters., , and . ACL (2), pages 636-643. Association for Computational Linguistics, (2022)

Oolong: Investigating What Makes Transfer Learning Hard with Controlled Studies., , and . EMNLP, pages 3280-3289. Association for Computational Linguistics, (2023)

Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets., , , , , , , , , and 42 other author(s). Trans. Assoc. Comput. Linguistics, (2022)

Injecting structural hints: Using language models to study inductive biases in language learning., and . EMNLP (Findings), pages 8402-8413. Association for Computational Linguistics, (2023)