Great Power, Great Responsibility: Recommendations for Reducing Energy for Training Language Models.

, , , , , and . NAACL-HLT (Findings), pp. 1962-1970. Association for Computational Linguistics, (2022)

Other publications by persons with the same name

The MIT Supercloud Workload Classification Challenge., , , , , , , , , , and 19 other author(s). IPDPS Workshops, pp. 708-714. IEEE, (2022)

Benchmarking Resource Usage for Efficient Distributed Deep Learning., , , , , , , , and . HPEC, pp. 1-8. IEEE, (2022)

A Green(er) World for A.I., , , , , , , , and . IPDPS Workshops, pp. 742-750. IEEE, (2022)

Energy-aware neural architecture selection and hyperparameter optimization., , , , , , , and . IPDPS Workshops, pp. 732-741. IEEE, (2022)

SupSiam: Non-contrastive Auxiliary Loss for Learning from Molecular Conformers., , , , , and . CoRR, (2023)

A Green(er) World for A.I., , , , , , , , and . CoRR, (2023)

Great Power, Great Responsibility: Recommendations for Reducing Energy for Training Language Models., , , , , and . NAACL-HLT (Findings), pp. 1962-1970. Association for Computational Linguistics, (2022)

Loss Curve Approximations for Fast Neural Architecture Ranking & Training Elasticity Estimation., , , and . IPDPS Workshops, pp. 715-723. IEEE, (2022)

Protein Discovery with Discrete Walk-Jump Sampling., , , , , , , , , and 3 other author(s). ICLR, OpenReview.net, (2024)