TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities.

ACL (demo), page 217-225. Association for Computational Linguistics, (2023)

Other publications of authors with the same name

TextNAS: A Neural Architecture Search Space Tailored for Text Representation. AAAI, page 9242-9249. AAAI Press, (2020)

A Simple and Effective Method to Improve Zero-Shot Cross-Lingual Transfer Learning. COLING, page 4372-4380. International Committee on Computational Linguistics, (2022)

Parameter-efficient Continual Learning Framework in Industrial Real-time Text Classification System. NAACL-HLT (Industry Papers), page 315-323. Association for Computational Linguistics, (2022)

Create and Find Flatness: Building Flat Training Spaces in Advance for Continual Learning. ECAI, volume 372 of Frontiers in Artificial Intelligence and Applications, page 2138-2145. IOS Press, (2023)

AutoADR: Automatic Model Design for Ad Relevance. CIKM, page 2365-2372. ACM, (2020)

Multiple UCAVs mission assignment based on modified Gravitational Search. ICCA, page 540-545. IEEE, (2014)

Improving BERT with Self-Supervised Attention. CoRR, (2020)

Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees. EACL, page 3011-3020. Association for Computational Linguistics, (2021)

TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities. ACL (demo), page 217-225. Association for Computational Linguistics, (2023)

TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities. CoRR, (2022)