Can Pretrained Language Models (Yet) Reason Deductively?

, , , , and . EACL, page 1439-1454. Association for Computational Linguistics, (2023)

Other publications of authors with the same name

Text Mining for Literature Review and Knowledge Discovery in Cancer Risk Assessment and Research., , , , , and . PLoS ONE, 7 (4): e33427 (April 2012)
Analyzing and Adapting Large Language Models for Few-Shot Multilingual NLU: Are We There Yet?, , and . CoRR, (2024)
Semantic Data Set Construction from Human Clustering and Spatial Arrangement., , , , , and . Comput. Linguistics, 47 (1): 69-116 (2021)
SemEval-2020 Task 2: Predicting Multilingual and Cross-Lingual (Graded) Lexical Entailment., , , and . SemEval@COLING, page 24-35. International Committee for Computational Linguistics, (2020)
Show Some Love to Your n-grams: A Bit of Progress and Stronger n-gram Language Modeling Baselines., , , and . NAACL-HLT (1), page 4113-4118. Association for Computational Linguistics, (2019)
A Tensor-based Factorization Model of Semantic Compositionality., , and . HLT-NAACL, page 1142-1151. The Association for Computational Linguistics, (2013)
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning., , , , , and . EMNLP (1), page 2362-2376. Association for Computational Linguistics, (2020)
Learning Abstract Concept Embeddings from Multi-Modal Data: Since You Probably Can't See What I Mean., and . EMNLP, page 255-265. ACL, (2014)
Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders., , , and . EMNLP (1), page 1442-1459. Association for Computational Linguistics, (2021)
How to Train good Word Embeddings for Biomedical NLP., , , and . BioNLP@ACL, page 166-174. Association for Computational Linguistics, (2016)