Self-knowledge distillation for first trimester ultrasound saliency prediction
Self-knowledge distillation (SKD) is a recent and promising machine learning approach in which a shallow student network is trained to distill its own knowledge. By contrast, in traditional knowledge distillation a student model distills knowledge from a large teacher network, which involves...
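To illustrate the distinction drawn in the abstract, below is a minimal, hypothetical PyTorch sketch of the two distillation set-ups, not the paper's actual training code. The `SmallNet` student, the temperature `T`, the toy teacher, and the augmentation-based self-teaching variant are all illustrative assumptions; the paper's specific formulation for ultrasound saliency prediction may differ.

```python
# Hedged sketch only: contrasts traditional KD (separate teacher) with one
# common SKD variant (the student acts as its own teacher). Assumes PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target loss: KL divergence between temperature-softened
    teacher and student output distributions."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)


class SmallNet(nn.Module):
    """Toy stand-in for a shallow student network."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.fc = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.fc(x)


student = SmallNet()
x = torch.randn(8, 32)

# Traditional KD: a separate, larger teacher provides the soft targets.
teacher = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 10))
with torch.no_grad():
    teacher_logits = teacher(x)
loss_kd = kd_loss(student(x), teacher_logits)

# Self-knowledge distillation: the student is its own teacher, e.g. by
# distilling from its own predictions on an augmented view of the input
# (one common SKD variant, shown here purely for illustration).
x_aug = x + 0.1 * torch.randn_like(x)
with torch.no_grad():
    self_teacher_logits = student(x_aug)
loss_skd = kd_loss(student(x), self_teacher_logits)
```

The only structural difference in this sketch is where the soft targets come from: an external high-capacity network in traditional KD, versus the student network itself in SKD.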
| Main Authors: | , , , , |
| --- | --- |
| Format: | Conference item |
| Language: | English |
| Published: | Springer, 2022 |