Other publications of authors with the same name

Meta-KD: A Meta Knowledge Distillation Framework for Language Model Compression across Domains., , , , , and . CoRR, (2020)

M6: Multi-Modality-to-Multi-Modality Multitask Mega-transformer for Unified Pretraining., , , , , , , , and . KDD, page 3251-3261. ACM, (2021)

Towards Knowledge-Based Personalized Product Description Generation in E-commerce., , , , , and . KDD, page 3040-3050. ACM, (2019)

Sketch and Refine: Towards Faithful and Informative Table-to-Text Generation., , , , , , and . ACL/IJCNLP (Findings), volume ACL/IJCNLP 2021 of Findings of ACL, page 4831-4843. Association for Computational Linguistics, (2021)

Can Large Language Models Always Solve Easy Problems if They Can Solve Harder Ones?, , , , , , and . CoRR, (2024)

Towards Knowledge-Based Recommender Dialog System., , , , , , and . EMNLP/IJCNLP (1), page 1803-1813. Association for Computational Linguistics, (2019)

PRIS at Knowledge Base Population 2013., , , , , , , , , and . TAC, NIST, (2013)

Meta-KD: A Meta Knowledge Distillation Framework for Language Model Compression across Domains., , , , , and . ACL/IJCNLP (1), page 3026-3036. Association for Computational Linguistics, (2021)

Transferring General Multimodal Pretrained Models to Text Recognition., , , , , , and . ACL (Findings), page 588-597. Association for Computational Linguistics, (2023)

M6: A Chinese Multimodal Pretrainer., , , , , , , , , and 15 other author(s). CoRR, (2021)