
Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.

 

Other publications by persons with the same name

MoEC: Mixture of Expert Clusters., , , and . CoRR, (2022)

LayoutLM: Pre-training of Text and Layout for Document Image Understanding., , , , , and . CoRR, (2019)

MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers., , , , and . CoRR, (2020)

MoEC: Mixture of Expert Clusters., , , and . AAAI, pp. 13807-13815. AAAI Press, (2023)

Transfer Log-based Anomaly Detection with Pseudo Labels., , , , , , and . CNSM, pp. 1-5. IEEE, (2020)

Black-box Attacks to Log-based Anomaly Detection., , , , and . CNSM, pp. 310-316. IEEE, (2022)

Neural Document Summarization by Jointly Learning to Score and Select Sentences., , , , , and . ACL (1), pp. 654-663. Association for Computational Linguistics, (2018)

Beyond English-Centric Bitexts for Better Multilingual Language Representation Learning., , , , , , , and . ACL (1), pp. 15354-15373. Association for Computational Linguistics, (2023)

The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits., , , , , , , , , and . CoRR, (2024)

Pre-training Language Model as a Multi-perspective Course Learner., , , , , , , , and . ACL (Findings), pp. 114-128. Association for Computational Linguistics, (2023)