Other publications by persons with the same name

What Language Model Architecture and Pretraining Objective Works Best for Zero-Shot Generalization? ICML, volume 162 of Proceedings of Machine Learning Research, pp. 22964-22984. PMLR, (2022)
The Falcon Series of Open Language Models, and 4 other authors. CoRR, (2023)
Is the Number of Trainable Parameters All That Actually Matters? ICBINB@NeurIPS, volume 163 of Proceedings of Machine Learning Research, pp. 27-32. PMLR, (2021)
What Language Model to Train if You Have One Million GPU Hours?, and 9 other authors. CoRR, (2022)
LightOn Optical Processing Unit: Scaling-up AI and HPC with a Non von Neumann co-processor, and 7 other authors. HCS, pp. 1-11. IEEE, (2021)
What Language Model to Train if You Have One Million GPU Hours?, and 8 other authors. EMNLP (Findings), pp. 765-782. Association for Computational Linguistics, (2022)
BLOOM: A 176B-Parameter Open-Access Multilingual Language Model, and 39 other authors. CoRR, (2022)
Photonic co-processors in HPC: Using LightOn OPUs for Randomized Numerical Linear Algebra. HCS, pp. 1-9. IEEE, (2021)