Inproceedings

Attention is not all you need: pure attention loses rank doubly exponentially with depth.

Yihe Dong, Jean-Baptiste Cordonnier, and Andreas Loukas.
ICML, volume 139 of Proceedings of Machine Learning Research, pages 2793-2803. PMLR, 2021.
