Self-knowledge distillation for first trimester ultrasound saliency prediction
Self-knowledge distillation (SKD) is a recent and promising machine learning approach in which a shallow student network is trained to distill its own knowledge. By contrast, in traditional knowledge distillation a student model distills its knowledge from a large teacher network, which involves substantial computational complexity and storage. Consequently, SKD is a useful approach for modelling medical imaging problems with scarce data. We propose an original SKD framework to predict where a sonographer should look next, using a multi-modal ultrasound and gaze dataset. We design a novel Wide Feature Distillation module, which is applied to intermediate feature maps in the form of transformations. The module applies more refined feature-map filtering, which is important when predicting gaze for fetal anatomy that varies in size. Our architecture design includes a ReSL loss that enables the student network to learn useful information whilst discarding the rest. The proposed network is validated on a large multi-modal ultrasound dataset acquired during routine first trimester fetal ultrasound scanning. Experimental results show that the novel SKD approach outperforms alternative state-of-the-art architectures on all saliency metrics.
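The abstract describes feature-level self-distillation and a saliency loss but gives no implementation details, so the sketch below is only a generic illustration of self-knowledge distillation for saliency prediction in PyTorch. It is not the paper's Wide Feature Distillation module or ReSL loss, and all class, function, and parameter names (SelfDistillSaliencyNet, skd_loss, alpha, beta) are hypothetical.

```python
# Minimal sketch (not the paper's code): generic self-knowledge distillation for
# saliency prediction. A shallow auxiliary branch is trained to match both the
# ground-truth saliency map and the deeper branch's own prediction, and an adapted
# intermediate feature map is matched to the deeper feature map, so the network
# distills knowledge into itself. All names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def saliency_kl(pred_logits, target_map, eps=1e-8):
    """KL divergence between predicted saliency logits and a target saliency
    distribution, both flattened and normalised to sum to 1 per sample."""
    b = pred_logits.size(0)
    log_p = F.log_softmax(pred_logits.view(b, -1), dim=1)
    q = target_map.view(b, -1)
    q = q / (q.sum(dim=1, keepdim=True) + eps)
    return F.kl_div(log_p, q, reduction="batchmean")


class SelfDistillSaliencyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU())
        self.deep_head = nn.Conv2d(64, 1, 1)     # deeper branch acts as the "teacher"
        self.shallow_head = nn.Conv2d(32, 1, 1)  # shallow branch acts as the "student"
        # transformation applied to the intermediate feature map before matching
        self.feat_adapt = nn.Conv2d(32, 64, 1)

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(f1)
        return self.shallow_head(f1), self.deep_head(f2), self.feat_adapt(f1), f2


def skd_loss(model, image, gt_saliency, alpha=0.5, beta=0.1):
    s_logits, d_logits, f_adapt, f_deep = model(image)
    sup = saliency_kl(d_logits, gt_saliency)                 # supervised saliency term
    # self-distillation: shallow branch mimics the deeper branch's (detached) prediction
    soft_target = torch.softmax(d_logits.detach().flatten(1), dim=1).view_as(d_logits)
    distill = saliency_kl(s_logits, soft_target)
    feat = F.mse_loss(f_adapt, f_deep.detach())              # feature-map matching term
    return sup + alpha * distill + beta * feat


if __name__ == "__main__":
    model = SelfDistillSaliencyNet()
    img = torch.randn(2, 1, 64, 64)   # toy stand-in for ultrasound frames
    gt = torch.rand(2, 1, 64, 64)     # toy stand-in for gaze/saliency maps
    loss = skd_loss(model, img, gt)
    loss.backward()
    print(float(loss))
```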
Main authors: | Gridach, M; Savochkina, E; Drukker, L; Papageorghiou, AT; Noble, JA |
---|---|
Type: | Conference item |
Language: | English |
Published: | Springer, 2022 |
author | Gridach, M; Savochkina, E; Drukker, L; Papageorghiou, AT; Noble, JA |
collection | OXFORD |
description | Self-knowledge distillation (SKD) is a recent and promising machine learning approach in which a shallow student network is trained to distill its own knowledge. By contrast, in traditional knowledge distillation a student model distills its knowledge from a large teacher network, which involves substantial computational complexity and storage. Consequently, SKD is a useful approach for modelling medical imaging problems with scarce data. We propose an original SKD framework to predict where a sonographer should look next, using a multi-modal ultrasound and gaze dataset. We design a novel Wide Feature Distillation module, which is applied to intermediate feature maps in the form of transformations. The module applies more refined feature-map filtering, which is important when predicting gaze for fetal anatomy that varies in size. Our architecture design includes a ReSL loss that enables the student network to learn useful information whilst discarding the rest. The proposed network is validated on a large multi-modal ultrasound dataset acquired during routine first trimester fetal ultrasound scanning. Experimental results show that the novel SKD approach outperforms alternative state-of-the-art architectures on all saliency metrics. |
format | Conference item |
id | oxford-uuid:c88a24e0-886b-4c3c-9ad7-ff288a8f133f |
institution | University of Oxford |
language | English |
publishDate | 2022 |
publisher | Springer |
record_format | dspace |
title | Self-knowledge distillation for first trimester ultrasound saliency prediction |