Author of the publication

bert2BERT: Towards Reusable Pretrained Language Models.

ACL (1), pages 2134-2148. Association for Computational Linguistics, (2022)

Other publications of authors with the same name

Comparing Three Methods of Selecting Training Samples in Supervised Classification of Multispectral Remote Sensing Images. Sensors, 23(20): 8530 (October 2023)

ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning. ACL/IJCNLP (1), pages 3350-3363. Association for Computational Linguistics, (2021)

FPT: Improving Prompt Tuning Efficiency via Progressive Training. EMNLP (Findings), pages 6877-6887. Association for Computational Linguistics, (2022)

StableToolBench: Towards Stable Large-Scale Benchmarking on Tool Learning of Large Language Models. CoRR, (2024)

AgentVerse: Facilitating Multi-Agent Collaboration and Exploring Emergent Behaviors in Agents. CoRR, (2023)

On Transferability of Prompt Tuning for Natural Language Processing. NAACL-HLT, pages 3949-3969. Association for Computational Linguistics, (2022)

Knowledge Inheritance for Pre-trained Language Models. NAACL-HLT, pages 3921-3937. Association for Computational Linguistics, (2022)

Human Emotion Knowledge Representation Emerges in Large Language Model and Supports Discrete Emotion Inference. CoRR, (2023)

ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning. CoRR, (2020)

Recyclable Tuning for Continual Pre-training. ACL (Findings), pages 11403-11426. Association for Computational Linguistics, (2023)