Inproceedings

Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher.

NeurIPS, 2020.
