Autoregressive Pre-training Model-Assisted Low-Resource Neural Machine Translation.

, , , and . PRICAI (2), volume 13032 of Lecture Notes in Computer Science, pp. 46-57. Springer, (2021)


Other publications by persons with the same name

Low-Resource Machine Translation Based on Asynchronous Dynamic Programming., , , , and . CCL, volume 12869 of Lecture Notes in Computer Science, pp. 16-28. Springer, (2021)

Improving the Robustness of Low-Resource Neural Machine Translation with Adversarial Examples., , , , , , and . CCMT, volume 1671 of Communications in Computer and Information Science, pp. 60-71. Springer, (2022)

Autoregressive Pre-training Model-Assisted Low-Resource Neural Machine Translation., , , and . PRICAI (2), volume 13032 of Lecture Notes in Computer Science, pp. 46-57. Springer, (2021)

Exploring the Advantages of Corpus in Neural Machine Translation of Agglutinative Language., , , and . ICANN (4), volume 11730 of Lecture Notes in Computer Science, pp. 326-336. Springer, (2019)

Neural Machine Translation Based on Prioritized Experience Replay., , , and . ICANN (2), volume 12397 of Lecture Notes in Computer Science, pp. 358-368. Springer, (2020)

Hot-Start Transfer Learning Combined with Approximate Distillation for Mongolian-Chinese Neural Machine Translation., , , , , , and . CCMT, volume 1671 of Communications in Computer and Information Science, pp. 12-23. Springer, (2022)

SITD-NMT: Synchronous Inference NMT with Turing Re-Translation Detection., , , , , , and . IJCNN, pp. 1-9. IEEE, (2024)

A Review of Mongolian Neural Machine Translation from the Perspective of Training., , , , , and . IJCNN, pp. 1-10. IEEE, (2024)

Dynamic Mask Curriculum Learning for Non-Autoregressive Neural Machine Translation., , , , , , and . CCMT, volume 1671 of Communications in Computer and Information Science, pp. 72-81. Springer, (2022)

Low-Resource Neural Machine Translation Using XLNet Pre-training Model., , , and . ICANN (5), volume 12895 of Lecture Notes in Computer Science, pp. 503-514. Springer, (2021)