Review of Recent Distillation Studies

Knowledge distillation has attracted considerable interest in recent years because it compresses a large deep neural network (the teacher DNN) into a smaller one (the student DNN) while largely preserving its accuracy. Recent work has improved on this basic scheme in several ways. One such improvement is the...
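To make the compression idea concrete, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015), which underlies most of the methods such a review would survey; it is offered as background, not as the specific improvement the article describes, and the `temperature` and `alpha` defaults are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Classic soft-target distillation loss (Hinton et al., 2015).

    Mixes cross-entropy on the ground-truth labels with a KL term
    that pulls the student's softened output distribution toward
    the teacher's. Hyperparameter values here are illustrative.
    """
    # Hard-label term: ordinary cross-entropy against true labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    # Soft-label term: KL divergence between temperature-softened
    # teacher and student distributions; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    return alpha * hard_loss + (1.0 - alpha) * soft_loss
```

In training, the teacher runs in inference mode to produce `teacher_logits`, and only the student's parameters are updated against this combined loss.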


Bibliographic Details
Main Author: Gao Minghong
Format: Article
Language: English
Published: EDP Sciences, 2023-01-01
Series: MATEC Web of Conferences
Online Access: https://www.matec-conferences.org/articles/matecconf/pdf/2023/09/matecconf_amme2023_01034.pdf