LGFA-MTKD: Enhancing Multi-Teacher Knowledge Distillation with Local and Global Frequency Attention

Transferring the extensive and varied knowledge contained within multiple complex models into a more compact student model poses significant challenges in multi-teacher knowledge distillation. Traditional distillation approaches often fall short in this context, as they struggle to fully capture and...
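The abstract refers to multi-teacher knowledge distillation. As background, a minimal sketch of a generic multi-teacher distillation loss (not the paper's LGFA method; the averaging scheme and function names here are illustrative assumptions): the teachers' temperature-softened output distributions are averaged and the student is trained to match that average via a KL-divergence term.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a list of raw logits.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def multi_teacher_kd_loss(student_logits, teacher_logits_list, T=2.0):
    """Hypothetical multi-teacher distillation loss: average the
    teachers' softened distributions, then compute KL(teacher_avg ||
    student), scaled by T^2 as in standard knowledge distillation."""
    k = len(teacher_logits_list)
    n = len(student_logits)
    teacher_probs = [softmax(t, T) for t in teacher_logits_list]
    avg = [sum(p[i] for p in teacher_probs) / k for i in range(n)]
    student = softmax(student_logits, T)
    return (T * T) * sum(p * math.log(p / q)
                         for p, q in zip(avg, student) if p > 0)
```

When the student already matches a single teacher exactly, the loss is zero; it grows as the student's distribution diverges from the teachers' consensus. The paper's contribution, per the title, is attending to local and global frequency components on top of this kind of objective.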


Bibliographic Details
Main Authors: Xin Cheng, Jinjia Zhou
Format: Article
Language: English
Published: MDPI AG 2024-11-01
Series: Information
Online Access: https://www.mdpi.com/2078-2489/15/11/735