CK-Transformer: Commonsense Knowledge Enhanced Transformers for Referring Expression Comprehension.

EACL (Findings), pp. 2541-2551. Association for Computational Linguistics, (2023)

Other publications by persons with the same name

Scientific and Creative Analogies in Pretrained Language Models. EMNLP (Findings), pp. 2094-2100. Association for Computational Linguistics, (2022)

Stepmothers are mean and academics are pretentious: What do pretrained language models learn about you? EMNLP (1), pp. 1477-1491. Association for Computational Linguistics, (2021)

How Metaphors Impact Political Discourse: A Large-Scale Topic-Agnostic Study Using Neural Metaphor Detection. ICWSM, pp. 503-512. AAAI Press, (2021)

Meta-Learning for Fast Cross-Lingual Adaptation in Dependency Parsing. ACL (1), pp. 8503-8520. Association for Computational Linguistics, (2022)

Metaphor Understanding Challenge Dataset for LLMs. CoRR, (2024)

Probing LLMs for Joint Encoding of Linguistic Categories. EMNLP (Findings), pp. 7158-7179. Association for Computational Linguistics, (2023)

Beyond the Imitation Game: Quantifying and extrapolating the capabilities of language models (with 440 other authors). Trans. Mach. Learn. Res., (2023)

A framework for annotating and modelling intentions behind metaphor use. CoRR, (2024)

A Multimodal Framework for the Detection of Hateful Memes. CoRR, (2020)

How do languages influence each other? Studying cross-lingual data sharing during LLM fine-tuning. CoRR, (2023)