The effect of softmax temperature on recent knowledge distillation algorithms

Knowledge distillation is a technique for transferring knowledge from a large, complex teacher model to a smaller, faster student model, and is an important category of model compression methods. In this study, I survey various knowledge distillation algorithms that have been proposed in re...
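As background to the topic the abstract describes, a minimal sketch of the classic temperature-scaled distillation loss (Hinton et al., 2015) is shown below; the function names and the choice of temperature are illustrative, not taken from the thesis itself.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Softmax with temperature T; higher T yields a softer distribution."""
    z = logits / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence from the teacher's softened distribution to the
    student's, scaled by T^2 so gradients keep a comparable magnitude
    across temperatures."""
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's soft predictions
    return T ** 2 * np.sum(p * (np.log(p) - np.log(q)))
```

Raising T flattens both distributions, exposing the teacher's relative confidence in non-target ("dark knowledge") classes; the study surveyed here examines how this temperature choice interacts with more recent distillation algorithms.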


Bibliographic details
Main author: Poh, Dominique
Other authors: Weichen Liu
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2023
Subjects:
Online access: https://hdl.handle.net/10356/172431