Multiple-Stage Knowledge Distillation
Knowledge distillation (KD) is a method in which a teacher network guides the learning of a student network, thereby improving the student network's performance. Recent research in this area has concentrated on developing effective definitions of knowledge and efficient met...
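Since the abstract only sketches the teacher-student mechanism, below is a minimal PyTorch sketch of the classic logit-distillation loss (Hinton et al.) that this line of work builds on. The `distillation_loss` helper and the `temperature`/`alpha` values are illustrative assumptions, not the multiple-stage method proposed in the paper itself.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Classic KD loss: a weighted sum of cross-entropy on the
    ground-truth labels and the KL divergence between
    temperature-softened teacher and student outputs.
    Hyperparameters here are illustrative, not from the paper."""
    # Soften both distributions with the temperature T.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # The KL term is scaled by T^2 so its gradient magnitude stays
    # comparable to the cross-entropy term as T varies.
    kd_term = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Usage example: a batch of 8 samples over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```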
| Main Authors: | Chuanyun Xu, Nanlan Bai, Wenjian Gao, Tian Li, Mengwei Li, Gang Li, Yang Zhang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2022-09-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/12/19/9453 |
Similar Items
- Cervical Cell Image Classification-Based Knowledge Distillation
  by: Wenjian Gao, et al.
  Published: (2022-11-01)
- Anatomical Landmark Detection Using a Feature-Sharing Knowledge Distillation-Based Neural Network
  by: Di Huang, et al.
  Published: (2022-07-01)
- Autoencoder-Like Knowledge Distillation Network for Anomaly Detection
  by: Caie Xu, et al.
  Published: (2023-01-01)
- Efficient and Controllable Model Compression through Sequential Knowledge Distillation and Pruning
  by: Leila Malihi, et al.
  Published: (2023-09-01)
- MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement
  by: Tianchi Zhang, et al.
  Published: (2024-01-01)