Training Transformers Together.

Alexander Borzunov, Max Ryabinin, Tim Dettmers, Quentin Lhoest, Lucile Saulnier, Michael Diskin, Yacine Jernite, and Thomas Wolf. NeurIPS (Competition and Demos), volume 176 of Proceedings of Machine Learning Research, pages 335-342. PMLR, 2021.

Other publications of authors with the same name

Training Transformers Together. Alexander Borzunov, Max Ryabinin, Tim Dettmers, Quentin Lhoest, Lucile Saulnier, Michael Diskin, Yacine Jernite, and Thomas Wolf. CoRR, 2022.
Secure Distributed Training at Scale. Eduard Gorbunov, Alexander Borzunov, Michael Diskin, and Max Ryabinin. CoRR, 2021.
Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees. Aleksandr Beznosikov, Peter Richtárik, Michael Diskin, Max Ryabinin, and Alexander Gasnikov. CoRR, 2021.
Training Transformers Together. Alexander Borzunov, Max Ryabinin, Tim Dettmers, Quentin Lhoest, Lucile Saulnier, Michael Diskin, Yacine Jernite, and Thomas Wolf. NeurIPS (Competition and Demos), volume 176 of Proceedings of Machine Learning Research, pages 335-342. PMLR, 2021.
A critical look at the evaluation of GNNs under heterophily: Are we really making progress? Oleg Platonov, Denis Kuznedelev, Michael Diskin, Artem Babenko, and Liudmila Prokhorenkova. ICLR, OpenReview.net, 2023.
SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient. Max Ryabinin, Tim Dettmers, Michael Diskin, and Alexander Borzunov. ICML, volume 202 of Proceedings of Machine Learning Research, pages 29416-29440. PMLR, 2023.
Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees. Aleksandr Beznosikov, Peter Richtárik, Michael Diskin, Max Ryabinin, and Alexander Gasnikov. NeurIPS, 2022.
Distributed Deep Learning In Open Collaborations. Michael Diskin, Alexey Bukhtiyarov, Max Ryabinin, Lucile Saulnier, Quentin Lhoest, Anton Sinitsin, Dmitry Popov, Dmitry Pyrkin, Maxim Kashirin, Alexander Borzunov, and 6 other authors. NeurIPS, pages 7879-7897, 2021.
Secure Distributed Training at Scale. Eduard Gorbunov, Alexander Borzunov, Michael Diskin, and Max Ryabinin. ICML, volume 162 of Proceedings of Machine Learning Research, pages 7679-7739. PMLR, 2022.