Author of the publication

Linear Transformers Are Secretly Fast Weight Programmers.

, , and . ICML, volume 139 of Proceedings of Machine Learning Research, pages 9355–9366. PMLR, (2021)

Other publications of authors with the same name

Improving Baselines in the Wild., , , and . CoRR, (2021)
The Languini Kitchen: Enabling Language Modelling Research at Different Scales of Compute., , , , , , , and . CoRR, (2023)
Learning Associative Inference Using Fast Weight Memory., , and . CoRR, (2020)
Enhancing the Transformer with Explicit Relational Encoding for Math Problem Solving., , , , , and . CoRR, (2019)
Ancient Roman Coin Recognition in the Wild Using Deep Learning Based Recognition of Artistically Depicted Face Profiles., and . ICCV Workshops, pages 2898–2906. IEEE Computer Society, (2017)
Solving Quantitative Reasoning Problems with Language Models., , , , , , , , , and 4 other author(s). NeurIPS, (2022)
Block-Recurrent Transformers., , , , and . NeurIPS, (2022)
Mindstorms in Natural Language-Based Societies of Mind., , , , , , , , , and 16 other author(s). CoRR, (2023)
Going Beyond Linear Transformers with Recurrent Fast Weight Programmers., , , and . NeurIPS, pages 7703–7717. (2021)
Learning to Reason with Third Order Tensor Products., and . NeurIPS, pages 10003–10014. (2018)