Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.

Other publications by persons with the same name

XLM-K: Improving Cross-Lingual Language Model Pre-training with Multilingual Knowledge., , , and . AAAI, pp. 10840-10848. AAAI Press, (2022)
BANG: Bridging Autoregressive and Non-autoregressive Generation with Large Scale Pretraining., , , , , , , , , and 2 other author(s). CoRR, (2020)
LoftQ: LoRA-Fine-Tuning-Aware Quantization for Large Language Models., , , , , , and . CoRR, (2023)
Reasoning Like Program Executors., , , , , , , , and . EMNLP, pp. 761-779. Association for Computational Linguistics, (2022)
Soft-Labeled Contrastive Pre-Training for Function-Level Code Representation., , , , , , , , and . EMNLP (Findings), pp. 118-129. Association for Computational Linguistics, (2022)
Scalable Learning to Optimize: A Learned Optimizer Can Train Big Models., , , , , and . ECCV (23), volume 13683 of Lecture Notes in Computer Science, pp. 389-405. Springer, (2022)
Large-scale L-BFGS using MapReduce., , and . NIPS, pp. 1332-1340. (2014)
Adversarial Retriever-Ranker for Dense Text Retrieval., , , , , and . ICLR, OpenReview.net, (2022)
Text Generation with Diffusion Language Models: A Pre-training Approach with Continuous Paragraph Denoise., , , , , , , and . ICML, volume 202 of Proceedings of Machine Learning Research, pp. 21051-21064. PMLR, (2023)
MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation., , , , , and . NAACL-HLT, pp. 1610-1623. Association for Computational Linguistics, (2022)