Other publications by persons with the same name

Attention Temperature Matters in Abstractive Summarization Distillation., , , and . CoRR, (2021)
THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption., , , , , , , , and . CoRR, (2022)
Image as a Foreign Language: BEIT Pretraining for Vision and Vision-Language Tasks., , , , , , , , , and 1 other author. CVPR, pp. 19175-19186. IEEE, (2023)
BEiT v2: Masked Image Modeling with Vector-Quantized Visual Tokenizers., , , , and . CoRR, (2022)
MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers., , , , , and . NeurIPS, (2020)
MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing Pretrained Transformers., , , , and . ACL/IJCNLP (Findings), volume ACL/IJCNLP 2021 of Findings of ACL, pp. 2140-2151. Association for Computational Linguistics, (2021)
Attention Temperature Matters in Abstractive Summarization Distillation., , , and . ACL (1), pp. 127-141. Association for Computational Linguistics, (2022)
Neural Melody Composition from Lyrics., , , , , , , and . NLPCC (1), volume 11838 of Lecture Notes in Computer Science, pp. 499-511. Springer, (2019)
BEiT: BERT Pre-Training of Image Transformers., , and . CoRR, (2021)
A Unified View of Masked Image Modeling., , , , and . Trans. Mach. Learn. Res., (2023)