Knowledge distillation in deep learning and its applications

Deep-learning-based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices. One possible solution is knowledge distillation, whereby a smaller model (the student model) is trained by utilizing the information from a larger model (the teacher model).
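To make the idea in the abstract concrete, the following is a minimal sketch of the classic soft-label distillation objective (Hinton et al., 2015), which matches the general teacher-student setup described above; it is not the survey authors' own method, and the names (student_logits, teacher_logits, T, alpha) are illustrative assumptions.

```python
# Minimal knowledge-distillation loss sketch (soft-label distillation).
# Assumes PyTorch; names and hyperparameters are illustrative, not from the article.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with a softened teacher-matching term."""
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # KL divergence between temperature-softened student and teacher
    # distributions; the T*T factor keeps gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1.0 - alpha) * soft
```

In this sketch the temperature T softens both output distributions so the student can learn from the teacher's relative class probabilities, while alpha balances matching the teacher against fitting the true labels.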


Bibliographic Details
Main Authors: Abdolmaged Alkhulaifi, Fahad Alsahli, Irfan Ahmad
Format: Article
Language: English
Published: PeerJ Inc., 2021-04-01
Series: PeerJ Computer Science
Online Access: https://peerj.com/articles/cs-474.pdf