Decoupled knowledge distillation method based on meta-learning
With the advancement of deep learning techniques, the number of model parameters has been increasing, leading to significant memory consumption and limiting the deployment of such models in real-time applications. To reduce the number of model parameters and enhance the generalization capability of...
Main Authors: | , , , , ,
---|---
Format: | Article
Language: | English
Published: | Elsevier, 2024-03-01
Series: | High-Confidence Computing
Subjects: |
Online Access: | http://www.sciencedirect.com/science/article/pii/S2667295223000624
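Although the abstract above is truncated, the title refers to decoupled knowledge distillation (DKD), a formulation that splits the classical KD loss into a target-class term (TCKD) and a non-target-class term (NCKD) weighted independently. As a rough illustration of that decomposition only (not the meta-learning variant this article proposes), a minimal PyTorch sketch might look like the following; the hyperparameters `alpha`, `beta`, and `temperature` are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def dkd_loss(student_logits: torch.Tensor,
             teacher_logits: torch.Tensor,
             target: torch.Tensor,
             alpha: float = 1.0,
             beta: float = 8.0,
             temperature: float = 4.0) -> torch.Tensor:
    """Illustrative decoupled KD loss: alpha * TCKD + beta * NCKD."""
    # One-hot boolean mask marking each sample's ground-truth class.
    gt_mask = F.one_hot(target, num_classes=student_logits.size(1)).bool()

    p_s = F.softmax(student_logits / temperature, dim=1)
    p_t = F.softmax(teacher_logits / temperature, dim=1)

    # TCKD: binary KL over the (target, non-target) probability mass.
    pt_s = (p_s * gt_mask).sum(dim=1, keepdim=True)   # student mass on target class
    pt_t = (p_t * gt_mask).sum(dim=1, keepdim=True)   # teacher mass on target class
    b_s = torch.cat([pt_s, 1.0 - pt_s], dim=1).clamp_min(1e-8)
    b_t = torch.cat([pt_t, 1.0 - pt_t], dim=1).clamp_min(1e-8)
    tckd = F.kl_div(b_s.log(), b_t, reduction="batchmean") * temperature ** 2

    # NCKD: KL among non-target classes only. Masking the target logit
    # with a very negative value before softmax renormalizes the
    # distribution over the remaining classes.
    neg_inf = torch.finfo(student_logits.dtype).min
    log_q_s = F.log_softmax(
        student_logits.masked_fill(gt_mask, neg_inf) / temperature, dim=1)
    q_t = F.softmax(
        teacher_logits.masked_fill(gt_mask, neg_inf) / temperature, dim=1)
    nckd = F.kl_div(log_q_s, q_t, reduction="batchmean") * temperature ** 2

    return alpha * tckd + beta * nckd

# Toy usage: a batch of 4 samples over 10 classes.
s = torch.randn(4, 10)
t = torch.randn(4, 10)
y = torch.randint(0, 10, (4,))
print(dkd_loss(s, t, y))
```

Decoupling the two terms lets the non-target-class knowledge be weighted independently of the teacher's confidence on the target class; how the article couples this with meta-learning is detailed at the Online Access link above.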