Author of the publication

Correlation Guided Multi-teacher Knowledge Distillation.

, , , and . ICONIP (4), volume 14450 of Lecture Notes in Computer Science, page 562-574. Springer, (2023)


Other publications of authors with the same name

The Detection of Attentive Mental State Using a Mixed Neural Network Model., , , , , and . ISCAS, page 1-5. IEEE, (2021)

Knowledge Distillation via Information Matching., , , and . ICONIP (4), volume 14450 of Lecture Notes in Computer Science, page 405-417. Springer, (2023)

Correlation Guided Multi-teacher Knowledge Distillation., , , and . ICONIP (4), volume 14450 of Lecture Notes in Computer Science, page 562-574. Springer, (2023)

Dy-KD: Dynamic Knowledge Distillation for Reduced Easy Examples., , , , and . ICONIP (12), volume 1966 of Communications in Computer and Information Science, page 223-234. Springer, (2023)

Joint Regularization Knowledge Distillation., , , , and . ICONIP (12), volume 1966 of Communications in Computer and Information Science, page 173-184. Springer, (2023)

Feature Reconstruction Distillation with Self-attention., , , and . ICONIP (12), volume 1966 of Communications in Computer and Information Science, page 246-257. Springer, (2023)

Optimizing Knowledge Distillation via Shallow Texture Knowledge Transfer., , , , , , and . ICONIP (4), volume 1791 of Communications in Computer and Information Science, page 647-658. Springer, (2022)

Direct Distillation between Different Domains., , , , , , and . CoRR, (2024)

ClearKD: Clear Knowledge Distillation for Medical Image Classification., , and . IJCNN, page 1-8. IEEE, (2024)

Triplet Knowledge Distillation Networks for Model Compression., , , and . IJCNN, page 1-8. IEEE, (2021)