Inproceedings

Gradient-based Intra-attention Pruning on Pre-trained Language Models.

Ziqing Yang, Yiming Cui, Xin Yao, and Shijin Wang.
ACL (1), pages 2775-2790. Association for Computational Linguistics, 2023.
