Author of the publication

How to Train Your HiPPO: State Space Models with Generalized Orthogonal Basis Projections.

, , , , and . ICLR, OpenReview.net, (2023)

Other publications of authors with the same name

How to Train Your HiPPO: State Space Models with Generalized Orthogonal Basis Projections., , , , and . ICLR, OpenReview.net, (2023)
Griffin: Mixing Gated Linear Recurrences with Local Attention for Efficient Language Models., , , , , , , , , and 7 other author(s). CoRR, (2024)
Pretraining Without Attention., , , and . EMNLP (Findings), page 58-69. Association for Computational Linguistics, (2023)
Caduceus: Bi-Directional Equivariant Long-Range DNA Sequence Modeling., , , , , and . CoRR, (2024)
S4ND: Modeling Images and Videos as Multidimensional Signals Using State Spaces., , , , , , , and . CoRR, (2022)
Towards a General Purpose CNN for Long Range Dependencies in ND., , , , , , and . CoRR, (2022)
Pretraining Without Attention., , , and . CoRR, (2022)
It's Raw! Audio Generation with State-Space Models., , , and . ICML, volume 162 of Proceedings of Machine Learning Research, page 7616-7633. PMLR, (2022)
Diagonal State Spaces are as Effective as Structured State Spaces., , and . NeurIPS, (2022)
Learning Compressed Transforms with Low Displacement Rank., , , , and . NeurIPS, page 9066-9078. (2018)