Enhancing Few-Shot Learning in Lightweight Models via Dual-Faceted Knowledge Distillation
In recent computer vision research, the pursuit of improved classification performance often leads to the adoption of complex, large-scale models. However, the actual deployment of such extensive models poses significant challenges in environments constrained by limited computing power and storage capacity. Consequently, this study is dedicated to addressing these challenges by focusing on innovative methods that enhance the classification performance of lightweight models. We propose a novel method to compress the knowledge learned by a large model into a lightweight one so that the latter can also achieve good performance in few-shot classification tasks. Specifically, we propose a dual-faceted knowledge distillation strategy that combines output-based and intermediate feature-based methods. The output-based method concentrates on distilling knowledge related to base class labels, while the intermediate feature-based approach, augmented by feature error distribution calibration, tackles the potential non-Gaussian nature of feature deviations, thereby boosting the effectiveness of knowledge transfer. Experiments conducted on MiniImageNet, CIFAR-FS, and CUB datasets demonstrate the superior performance of our method over state-of-the-art lightweight models, particularly in five-way one-shot and five-way five-shot tasks.
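The dual-faceted objective described in the abstract combines an output-based term and an intermediate-feature term. As a rough illustration only, here is a minimal NumPy sketch that assumes a temperature-scaled KL divergence for the output facet and plain mean-squared feature matching in place of the paper's feature error distribution calibration; the weight `alpha` and temperature `T` are illustrative hyperparameters, not values from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def output_distillation_loss(teacher_logits, student_logits, T=4.0):
    # KL(teacher || student) on softened class distributions,
    # scaled by T^2 as in standard logit distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)) * T * T)

def feature_distillation_loss(teacher_feat, student_feat):
    # Simple MSE feature matching; the paper additionally calibrates
    # the feature-error distribution, which is omitted here.
    return float(np.mean((teacher_feat - student_feat) ** 2))

def dual_kd_loss(t_logits, s_logits, t_feat, s_feat, alpha=0.5, T=4.0):
    """Weighted sum of the output-based and feature-based facets."""
    return (alpha * output_distillation_loss(t_logits, s_logits, T)
            + (1 - alpha) * feature_distillation_loss(t_feat, s_feat))
```

In this kind of setup, `T` softens the class distributions so that non-target probabilities carry gradient signal, and `alpha` balances the two facets.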
Main Authors: | Bojun Zhou, Tianyu Cheng, Jiahao Zhao, Chunkai Yan, Ling Jiang, Xinsong Zhang, Juping Gu |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2024-03-01 |
Series: | Sensors |
Subjects: | few-shot classification; knowledge distillation; model compression; distribution calibration |
Online Access: | https://www.mdpi.com/1424-8220/24/6/1815 |
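The five-way one-shot and five-way five-shot settings mentioned in the abstract follow the standard episodic evaluation protocol: each test episode samples N classes, K labelled support images per class, and a set of held-out query images to classify. A minimal sketch of episode sampling follows; the function name, dataset layout, and query count are illustrative assumptions, not details from the paper.

```python
import random

def sample_episode(images_by_class, n_way=5, k_shot=1, n_query=15, seed=None):
    """Draw one N-way K-shot episode as (support, query) lists of (item, class) pairs."""
    rng = random.Random(seed)
    classes = rng.sample(sorted(images_by_class), n_way)  # pick N distinct classes
    support, query = [], []
    for cls in classes:
        picks = rng.sample(images_by_class[cls], k_shot + n_query)
        support += [(x, cls) for x in picks[:k_shot]]  # K labelled examples per class
        query += [(x, cls) for x in picks[k_shot:]]    # held-out examples to classify
    return support, query
```

Reported few-shot accuracies are typically averaged over many such randomly sampled episodes.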
_version_ | 1797239382340534272 |
author | Bojun Zhou; Tianyu Cheng; Jiahao Zhao; Chunkai Yan; Ling Jiang; Xinsong Zhang; Juping Gu |
author_sort | Bojun Zhou |
collection | DOAJ |
description | In recent computer vision research, the pursuit of improved classification performance often leads to the adoption of complex, large-scale models. However, the actual deployment of such extensive models poses significant challenges in environments constrained by limited computing power and storage capacity. Consequently, this study is dedicated to addressing these challenges by focusing on innovative methods that enhance the classification performance of lightweight models. We propose a novel method to compress the knowledge learned by a large model into a lightweight one so that the latter can also achieve good performance in few-shot classification tasks. Specifically, we propose a dual-faceted knowledge distillation strategy that combines output-based and intermediate feature-based methods. The output-based method concentrates on distilling knowledge related to base class labels, while the intermediate feature-based approach, augmented by feature error distribution calibration, tackles the potential non-Gaussian nature of feature deviations, thereby boosting the effectiveness of knowledge transfer. Experiments conducted on MiniImageNet, CIFAR-FS, and CUB datasets demonstrate the superior performance of our method over state-of-the-art lightweight models, particularly in five-way one-shot and five-way five-shot tasks. |
first_indexed | 2024-04-24T17:50:39Z |
format | Article |
id | doaj.art-8156535de34746cda9a79d1207b05158 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-04-24T17:50:39Z |
publishDate | 2024-03-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-8156535de34746cda9a79d1207b05158 (record updated 2024-03-27T14:03:51Z). Sensors, Vol. 24, Issue 6, Article 1815, MDPI AG, 2024-03-01, ISSN 1424-8220, DOI 10.3390/s24061815. Title: Enhancing Few-Shot Learning in Lightweight Models via Dual-Faceted Knowledge Distillation. Authors: Bojun Zhou, Jiahao Zhao, Chunkai Yan, Ling Jiang, and Juping Gu (School of Information Science and Technology, Nantong University, Nantong 226019, China); Tianyu Cheng and Xinsong Zhang (School of Electrical Engineering, Nantong University, Nantong 226019, China). Keywords: few-shot classification; knowledge distillation; model compression; distribution calibration. Online access: https://www.mdpi.com/1424-8220/24/6/1815 |
title | Enhancing Few-Shot Learning in Lightweight Models via Dual-Faceted Knowledge Distillation |
topic | few-shot classification knowledge distillation model compression distribution calibration |
url | https://www.mdpi.com/1424-8220/24/6/1815 |