Author of the publication

Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication are displayed. You can also use the button next to the name to display some publications already assigned to the person.


Other publications of authors with the same name

EVA2.0: Investigating Open-domain Chinese Dialogue Systems with Large-scale Pre-training., , , , , , , , , and 1 other author(s). Mach. Intell. Res., 20 (2): 207-219 (April 2023)

CPM: A Large-scale Generative Chinese Pre-trained Language Model., , , , , , , , , and 15 other author(s). CoRR, (2020)

Pre-Training to Learn in Context., , , and . ACL (1), page 4849-4870. Association for Computational Linguistics, (2023)

Pre-trained models: Past, present and future., , , , , , , , , and 14 other author(s). AI Open, (2021)

Knowledge Distillation of Large Language Models, , , and . (2023) cite arxiv:2306.08543. Comment: 20 pages, 12 figures.

Pre-Trained Models: Past, Present and Future., , , , , , , , , and 12 other author(s). CoRR, (2021)

Instruction Pre-Training: Language Models are Supervised Multitask Learners., , , , , and . CoRR, (2024)

CPM-2: Large-scale Cost-effective Pre-trained Language Models., , , , , , , , , and 9 other author(s). CoRR, (2021)

Train No Evil: Selective Masking for Task-Guided Pre-Training., , , , and . EMNLP (1), page 6966-6974. Association for Computational Linguistics, (2020)

Learning Instructions with Unlabeled Data for Zero-Shot Cross-Task Generalization., , , and . EMNLP, page 1617-1634. Association for Computational Linguistics, (2022)