Progressive Network Grafting With Local Features Embedding for Few-Shot Knowledge Distillation
Compared with traditional knowledge distillation, which relies on large amounts of data, few-shot knowledge distillation can produce student networks with good performance using only a small number of samples. Some recent studies treat the network as a combination of a series of network blocks, ado...
| Main Author | Weidong Du |
|---|---|
| Format | Article |
| Language | English |
| Published | IEEE, 2022-01-01 |
| Series | IEEE Access |
| Subjects | |
| Online Access | https://ieeexplore.ieee.org/document/9934906/ |
Similar Items
- Enhancing Few-Shot Learning in Lightweight Models via Dual-Faceted Knowledge Distillation
  by: Bojun Zhou, et al. Published: (2024-03-01)
- Global and Local Knowledge Distillation Method for Few-Shot Classification of Electrical Equipment
  by: Bojun Zhou, et al. Published: (2023-06-01)
- Few-Shot Image Classification via Mutual Distillation
  by: Tianshu Zhang, et al. Published: (2023-12-01)
- Research on a Cross-Domain Few-Shot Adaptive Classification Algorithm Based on Knowledge Distillation Technology
  by: Jiuyang Gao, et al. Published: (2024-03-01)
- PCNet: Leveraging Prototype Complementarity to Improve Prototype Affinity for Few-Shot Segmentation
  by: Jing-Yu Wang, et al. Published: (2023-12-01)