Other publications by people with the same name

Secure Distributed Training at Scale., , , and . CoRR, (2021)

Training Transformers Together., , , , , , , and . NeurIPS (Competition and Demos), volume 176 of Proceedings of Machine Learning Research, pp. 335-342. PMLR, (2021)

SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient., , , and . ICML, volume 202 of Proceedings of Machine Learning Research, pp. 29416-29440. PMLR, (2023)

Petals: Collaborative Inference and Fine-tuning of Large Models., , , , , , , and . ACL (demo), pp. 558-568. Association for Computational Linguistics, (2023)

Training Transformers Together., , , , , , , and . CoRR, (2022)

Distributed Deep Learning In Open Collaborations., , , , , , , , , and 6 other author(s). NeurIPS, pp. 7879-7897. (2021)

Secure Distributed Training at Scale., , , and . ICML, volume 162 of Proceedings of Machine Learning Research, pp. 7679-7739. PMLR, (2022)

Petals: Collaborative Inference and Fine-tuning of Large Models., , , , , , , and . CoRR, (2022)