CTL++: Evaluating Generalization on Never-Seen Compositional Patterns of Known Functions, and Compatibility of Neural Representations.

, , and . EMNLP, page 9758-9767. Association for Computational Linguistics, (2022)

Other publications of authors with the same name

RWTH ASR Systems for LibriSpeech: Hybrid vs Attention., , , , , , , and . INTERSPEECH, page 231-235. ISCA, (2019)

Neural Differential Equations for Learning to Program Neural Nets Through Continuous Learning Rules., , and . NeurIPS, (2022)

The Dual Form of Neural Networks Revisited: Connecting Test Time Predictions to Training Patterns via Spotlights of Attention., , and . ICML, volume 162 of Proceedings of Machine Learning Research, page 9639-9659. PMLR, (2022)

A Modern Self-Referential Weight Matrix That Learns to Modify Itself., , , and . ICML, volume 162 of Proceedings of Machine Learning Research, page 9660-9677. PMLR, (2022)

Linear Transformers Are Secretly Fast Weight Programmers., , and . ICML, volume 139 of Proceedings of Machine Learning Research, page 9355-9366. PMLR, (2021)

How Much Self-Attention Do We Need? Trading Attention for Feed-Forward Layers., , , and . ICASSP, page 6154-6158. IEEE, (2020)

On the Choice of Modeling Unit for Sequence-to-Sequence Speech Recognition., , , , , and . INTERSPEECH, page 3800-3804. ISCA, (2019)

Practical Computational Power of Linear Transformers and Their Recurrent and Self-Referential Extensions., , and . EMNLP, page 9455-9465. Association for Computational Linguistics, (2023)

MoEUT: Mixture-of-Experts Universal Transformers., , , , and . CoRR, (2024)

The RWTH ASR System for TED-LIUM Release 2: Improving Hybrid HMM with SpecAugment., , , , , and . ICASSP, page 7839-7843. IEEE, (2020)