Learning a Fourier Transform for Linear Relative Positional Encodings in Transformers.

AISTATS, volume 238 of Proceedings of Machine Learning Research, pages 2278-2286. PMLR, (2024)

Other publications of authors with the same name

Functional Interpolation for Relative Positions Improves Long Context Transformers. CoRR, (2023)

Learning Physics-Informed Neural Networks without Stacked Back-propagation. AISTATS, volume 206 of Proceedings of Machine Learning Research, pages 3034-3047. PMLR, (2023)

Learning Physics-Informed Neural Networks without Stacked Back-propagation. CoRR, (2022)

Stable, Fast and Accurate: Kernelized Attention with Relative Positional Encoding. NeurIPS, pages 22795-22807. (2021)

Can Vision Transformers Perform Convolution? CoRR, (2021)

Learning a Fourier Transform for Linear Relative Positional Encodings in Transformers. CoRR, (2023)

Is $L^2$ Physics Informed Loss Always Suitable for Training Physics Informed Neural Network? NeurIPS, (2022)

Your Transformer May Not be as Powerful as You Expect. NeurIPS, (2022)