Article

Sequence-Level Knowledge Distillation for Model Compression of Attention-based Sequence-to-Sequence Speech Recognition.

CoRR (2018)
