Knowledge distillation in deep learning and its applications
Deep learning-based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices. One possible solution is knowledge distillation, whereby a smaller model (student model) is trained by utilizing the information from a larger...
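As context for the record above, the following is a minimal sketch of the knowledge distillation idea described in the abstract (not the authors' specific method): a compact student network is trained against the softened outputs of a larger, frozen teacher network alongside the ground-truth labels. The model sizes, temperature `T`, and mixing weight `alpha` are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative teacher (large) and student (small) networks; sizes are assumptions.
teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10))
student = nn.Sequential(nn.Linear(784, 100), nn.ReLU(), nn.Linear(100, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft targets: KL divergence between temperature-softened teacher and student outputs.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

optimizer = torch.optim.SGD(student.parameters(), lr=0.1)
x = torch.randn(32, 784)                  # dummy input batch
y = torch.randint(0, 10, (32,))           # dummy labels
with torch.no_grad():
    t_logits = teacher(x)                 # teacher stays frozen during distillation
loss = distillation_loss(student(x), t_logits, y)
loss.backward()
optimizer.step()
```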
| Main Authors: | Abdolmaged Alkhulaifi, Fahad Alsahli, Irfan Ahmad |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | PeerJ Inc., 2021-04-01 |
| Series: | PeerJ Computer Science |
| Online Access: | https://peerj.com/articles/cs-474.pdf |
Similar Items
- Efficient and Controllable Model Compression through Sequential Knowledge Distillation and Pruning
  by: Leila Malihi, et al.
  Published: (2023-09-01)
- Analysis of Model Compression Using Knowledge Distillation
  by: Yu-Wei Hong, et al.
  Published: (2022-01-01)
- Compressing medical deep neural network models for edge devices using knowledge distillation
  by: F. MohiEldeen Alabbasy, et al.
  Published: (2023-07-01)
- Heterogeneous Knowledge Distillation Using Conceptual Learning
  by: Yerin Yu, et al.
  Published: (2024-01-01)
- Deep Generative Knowledge Distillation by Likelihood Finetuning
  by: Jingru Li, et al.
  Published: (2023-01-01)