Inproceedings

Scalable Syntax-Aware Language Models Using Knowledge Distillation.

ACL (1), pages 3472–3484. Association for Computational Linguistics, 2019.
