Don't Stop Fine-Tuning: On Training Regimes for Few-Shot Cross-Lingual Transfer with Multilingual Language Models

, , and . Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, page 10725--10742. Abu Dhabi, United Arab Emirates, Association for Computational Linguistics, (December 2022)


Other publications of authors with the same name

Knowledge Distillation vs. Pretraining from Scratch under a Fixed (Computation) Budget., , , and . CoRR, (2024)

Self-Distillation for Model Stacking Unlocks Cross-Lingual NLU in 200+ Languages., , , and . CoRR, (2024)

Don't Stop Fine-Tuning: On Training Regimes for Few-Shot Cross-Lingual Transfer with Multilingual Language Models., , and . Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, page 10725--10742. Abu Dhabi, United Arab Emirates, Association for Computational Linguistics, (December 2022)

Don't Stop Fine-Tuning: On Training Regimes for Few-Shot Cross-Lingual Transfer with Multilingual Language Models., , and . EMNLP, page 10725--10742. Association for Computational Linguistics, (2022)

Free Lunch: Robust Cross-Lingual Transfer via Model Checkpoint Averaging., , and . ACL (1), page 5712--5730. Association for Computational Linguistics, (2023)

Evaluating the Ability of LLMs to Solve Semantics-Aware Process Mining Tasks., , , and . ICPM, page 9--16. IEEE, (2024)

SLICER: Sliced Fine-Tuning for Low-Resource Cross-Lingual Transfer for Named Entity Recognition., , and . EMNLP, page 10775--10785. Association for Computational Linguistics, (2022)

News Without Borders: Domain Adaptation of Multilingual Sentence Embeddings for Cross-lingual News Recommendation., , , and . CoRR, (2024)

SEAGLE: A Platform for Comparative Evaluation of Semantic Encoders for Information Retrieval., , , and . EMNLP/IJCNLP (3), page 199--204. Association for Computational Linguistics, (2019)

One For All & All For One: Bypassing Hyperparameter Tuning with Model Averaging for Cross-Lingual Transfer., , and . EMNLP (Findings), page 12186--12193. Association for Computational Linguistics, (2023)