Please choose the person this publication should be related to.

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.

 

Other publications by persons with the same name

Improved knowledge distillation by utilizing backward pass knowledge in neural networks. CoRR, (2023)

Bilingual-GAN: A Step Towards Parallel Text Generation. CoRR, (2019)

Learning functions on multiple sets using multi-set transformers. UAI, volume 180 of Proceedings of Machine Learning Research, pp. 1760-1770. PMLR, (2022)

SALSA-TEXT: Self Attentive Latent Space Based Adversarial Text Generation. Canadian AI, volume 11489 of Lecture Notes in Computer Science, pp. 119-131. Springer, (2019)

TextKD-GAN: Text Generation Using Knowledge Distillation and Generative Adversarial Networks. Canadian AI, volume 11489 of Lecture Notes in Computer Science, pp. 107-118. Springer, (2019)

Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher. COLING, pp. 4714-4727. International Committee on Computational Linguistics, (2022)

From Unsupervised Machine Translation to Adversarial Text Generation. ICASSP, pp. 8194-8198. IEEE, (2020)

End-to-End Self-Debiasing Framework for Robust NLU Training. ACL/IJCNLP (Findings), volume ACL/IJCNLP 2021 of Findings of ACL, pp. 1923-1929. Association for Computational Linguistics, (2021)

On the utility of enhancing BERT syntactic bias with Token Reordering Pretraining. CoNLL, pp. 165-182. Association for Computational Linguistics, (2023)

S2D: Sorted Speculative Decoding For More Efficient Deployment of Nested Large Language Models. CoRR, (2024)