The effect of softmax temperature on recent knowledge distillation algorithms

Knowledge distillation is a technique for transferring knowledge from a large, complex teacher model to a smaller, faster student model, and is an important category of model compression methods. In this study, I survey various knowledge distillation algorithms that have been proposed in re...
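The abstract describes the standard distillation setup, in which a softmax temperature controls how soft the teacher's output distribution is. As a minimal sketch of that idea (not the thesis's actual code; the temperature value and Hinton-style KL loss with the conventional T² scaling are assumptions), the soft targets and distillation loss can be written as:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer
    # (higher-entropy) distribution over classes.
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between teacher and student soft targets,
    # scaled by T^2 so gradients keep a comparable magnitude
    # across temperatures (as in Hinton et al., 2015).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(temperature ** 2 * np.sum(p * (np.log(p) - np.log(q))))
```

For example, raising the temperature spreads probability mass onto non-target classes, exposing the teacher's "dark knowledge" about inter-class similarity, which is the quantity the surveyed algorithms exploit in different ways.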


Bibliographic Details
Main Author: Poh, Dominique
Other Authors: Weichen Liu
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2023
Online Access: https://hdl.handle.net/10356/172431