Author of the publication

Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-Constrained Edge Computing Systems.

IEEE Access (2020)

Other publications of authors with the same name

torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation. CoRR (2020)
Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-Constrained Edge Computing Systems. IEEE Access (2020)
torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP. CoRR (2023)
Split Computing and Early Exiting for Deep Learning Applications: Survey and Research Challenges. ACM Comput. Surv., 55 (5): 90:1-90:30 (2023)
BottleFit: Learning Compressed Representations in Deep Neural Networks for Effective and Efficient Split Computing. WoWMoM, pages 337-346. IEEE (2022)
Ensemble Transformer for Efficient and Accurate Ranking Tasks: an Application to Question Answering Systems. EMNLP (Findings), pages 7259-7272. Association for Computational Linguistics (2022)
Distilled Split Deep Neural Networks for Edge-Assisted Real-Time Systems. HotEdgeVideo@MobiCom, pages 21-26. ACM (2019)
SC2 Benchmark: Supervised Compression for Split Computing. Trans. Mach. Learn. Res. (2023)
COVIDLies: Detecting COVID-19 Misinformation on Social Media. NLP4COVID@EMNLP. Association for Computational Linguistics (2020)
Supervised Compression for Resource-Constrained Edge Computing Systems. WACV, pages 923-933. IEEE (2022)