

Other publications by persons with the same name

Secure Distributed Training at Scale. CoRR, (2021)

Training Transformers Together. NeurIPS (Competition and Demos), volume 176 of Proceedings of Machine Learning Research, pp. 335-342. PMLR, (2021)

Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices. NeurIPS, pp. 18195-18211. (2021)

Hypernymy Understanding Evaluation of Text-to-Image Models via WordNet Hierarchy. CoRR, (2023)

SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient. ICML, volume 202 of Proceedings of Machine Learning Research, pp. 29416-29440. PMLR, (2023)

Training Transformers Together. CoRR, (2022)

Towards Crowdsourced Training of Large Neural Networks using Decentralized Mixture-of-Experts. NeurIPS, (2020)

The Hallucinations Leaderboard - An Open Effort to Measure Hallucinations in Large Language Models. CoRR, (2024)

Embedding Words in Non-Vector Space with Unsupervised Graph Learning. EMNLP (1), pp. 7317-7331. Association for Computational Linguistics, (2020)

Petals: Collaborative Inference and Fine-tuning of Large Models. CoRR, (2022)