
HalOmi: A Manually Annotated Benchmark for Multilingual Hallucination and Omission Detection in Machine Translation. EMNLP, pages 638-653. Association for Computational Linguistics, (2023)


Other publications of authors with the same name

Information-Theoretic Probing with Minimum Description Length. EMNLP (1), pages 183-196. Association for Computational Linguistics, (2020)

Detecting and Mitigating Hallucinations in Machine Translation: Model Internal Workings Alone Do Well, Sentence Similarity Even Better. ACL (1), pages 36-50. Association for Computational Linguistics, (2023)

LM Transparency Tool: Interactive Tool for Analyzing Transformer Language Models. CoRR, (2024)

BPE-Dropout: Simple and Effective Subword Regularization. ACL, pages 1882-1892. Association for Computational Linguistics, (2020)

Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation. ACL/IJCNLP (1), pages 1126-1140. Association for Computational Linguistics, (2021)

HalOmi: A Manually Annotated Benchmark for Multilingual Hallucination and Omission Detection in Machine Translation. EMNLP, pages 638-653. Association for Computational Linguistics, (2023)

Know When To Stop: A Study of Semantic Drift in Text Generation. CoRR, (2024)

Looking for a Needle in a Haystack: A Comprehensive Study of Hallucinations in Neural Machine Translation. EACL, pages 1059-1075. Association for Computational Linguistics, (2023)

Context-Aware Monolingual Repair for Neural Machine Translation. EMNLP/IJCNLP (1), pages 877-886. Association for Computational Linguistics, (2019)

Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT. EMNLP (1), pages 8478-8491. Association for Computational Linguistics, (2021)