Author of the publication

Please choose the person to associate this publication with.

To distinguish between persons with the same name, the academic degree and the title of an important publication are displayed. You can also use the button next to a name to display some publications already assigned to that person.


Other publications of authors with the same name

EVA2.0: Investigating Open-domain Chinese Dialogue Systems with Large-scale Pre-training., , , , , , , , , and 1 other author(s). Mach. Intell. Res., 20 (2): 207-219 (April 2023)

Pre-Training to Learn in Context., , , and . ACL (1), page 4849-4870. Association for Computational Linguistics, (2023)

MiniLLM: Knowledge Distillation of Large Language Models., , , and . ICLR, OpenReview.net, (2024)

CPM: A large-scale generative Chinese Pre-trained language model., , , , , , , , , and 15 other author(s). AI Open, (2021)

EVA: An Open-Domain Chinese Dialogue System with Large-Scale Generative Pre-Training., , , , , , , , , and 4 other author(s). CoRR, (2021)

CPM: A Large-scale Generative Chinese Pre-trained Language Model., , , , , , , , , and 15 other author(s). CoRR, (2020)

Pre-Trained Models: Past, Present and Future., , , , , , , , , and 12 other author(s). CoRR, (2021)

Knowledge Distillation of Large Language Models, , , and . (2023) cite arxiv:2306.08543 Comment: 20 pages, 12 figures.

Instruction Pre-Training: Language Models are Supervised Multitask Learners., , , , , and . CoRR, (2024)

CPM-2: Large-scale Cost-effective Pre-trained Language Models., , , , , , , , , and 9 other author(s). CoRR, (2021)