Inproceedings

Extremely Small BERT Models from Mixed-Vocabulary Training.

, , , and .
EACL, pages 2753-2759. Association for Computational Linguistics, 2021.
