
Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.

 

Other publications by persons with the same name

EVA2.0: Investigating Open-domain Chinese Dialogue Systems with Large-scale Pre-training., , , , , , , , , и 1 other автор(ы). Mach. Intell. Res., 20 (2): 207-219 (апреля 2023)Pre-Training to Learn in Context., , , и . ACL (1), стр. 4849-4870. Association for Computational Linguistics, (2023)CPM: A large-scale generative Chinese Pre-trained language model., , , , , , , , , и 15 other автор(ы). AI Open, (2021)CPM: A Large-scale Generative Chinese Pre-trained Language Model., , , , , , , , , и 15 other автор(ы). CoRR, (2020)EVA: An Open-Domain Chinese Dialogue System with Large-Scale Generative Pre-Training., , , , , , , , , и 4 other автор(ы). CoRR, (2021)MiniLLM: Knowledge Distillation of Large Language Models., , , и . ICLR, OpenReview.net, (2024)Knowledge Distillation of Large Language Models, , , и . (2023)cite arxiv:2306.08543Comment: 20 pages, 12 figures.Pre-Trained Models: Past, Present and Future., , , , , , , , , и 12 other автор(ы). CoRR, (2021)Instruction Pre-Training: Language Models are Supervised Multitask Learners., , , , , и . CoRR, (2024)CPM-2: Large-scale Cost-effective Pre-trained Language Models., , , , , , , , , и 9 other автор(ы). CoRR, (2021)