Effective Online Knowledge Distillation via Attention-Based Model Ensembling

Large-scale deep learning models have achieved impressive results on a variety of tasks; however, their deployment on edge or mobile devices remains a challenge due to limited memory and computational capability. Knowledge distillation is an effective model compression technique, which...
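
The record's truncated abstract names knowledge distillation as the compression technique used. The paper's specific attention-based online ensembling scheme is not described in this record, so purely as background, the sketch below shows the classic distillation loss (Hinton et al., 2015): a temperature-softened KL term matching teacher and student logits, blended with ordinary cross-entropy. All function and parameter names here are hypothetical, and the `temperature` and `alpha` values are illustrative defaults, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 4.0,
                      alpha: float = 0.9) -> torch.Tensor:
    """Blend a softened teacher-matching term with hard-label cross-entropy.

    Generic knowledge-distillation loss; not the paper's attention-based
    ensembling method, whose details are not given in this record.
    """
    # Soften both distributions with a temperature (Hinton et al., 2015).
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_probs = F.log_softmax(student_logits / temperature, dim=1)
    # The KL term is scaled by T^2 to keep gradient magnitudes comparable
    # as the temperature changes.
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2
    # Hard-label supervision on the student's raw logits.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Illustrative usage with random logits for a 10-class problem.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
```

In online distillation settings such as the one this paper's title suggests, the teacher logits would typically come from an ensemble of peer networks trained jointly with the student rather than from a fixed pretrained model.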


Bibliographic Details
Main Authors: Diana-Laura Borza, Adrian Sergiu Darabant, Tudor Alexandru Ileni, Alexandru-Ion Marinescu
Format: Article
Language: English
Published: MDPI AG, 2022-11-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/10/22/4285