Mutual Learning Knowledge Distillation Based on Multi-stage Multi-generative Adversarial Network
To address the problems of low knowledge distillation efficiency, single-stage training, complex training procedures, and difficult convergence in traditional knowledge distillation methods for image classification tasks, this paper designs a mutual learning knowledge distillation based on...
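The abstract in this record is truncated, so the paper's multi-stage multi-generative-adversarial design cannot be reconstructed from it. As a generic illustration of the mutual-learning distillation component named in the title, the following PyTorch sketch shows the standard deep-mutual-learning loss, in which two peer networks each minimize cross-entropy plus a KL term that pulls their softened predictions toward each other. The temperature `T`, the loss weighting, and the function name are illustrative assumptions, not the paper's settings.

```python
# A minimal sketch of a mutual-learning distillation loss (in the spirit of
# deep mutual learning), NOT the paper's multi-stage multi-GAN method.
import torch
import torch.nn.functional as F

def mutual_learning_losses(logits_a, logits_b, targets, T=3.0):
    """Return the per-network losses for one mutual-learning step."""
    # Supervised cross-entropy for each peer network.
    ce_a = F.cross_entropy(logits_a, targets)
    ce_b = F.cross_entropy(logits_b, targets)
    # Softened log-distributions; the peer's logits are detached so each
    # KL term only updates its own network.
    log_p_a = F.log_softmax(logits_a / T, dim=1)
    log_p_b = F.log_softmax(logits_b / T, dim=1)
    kl_a = F.kl_div(log_p_a, F.softmax(logits_b.detach() / T, dim=1),
                    reduction="batchmean") * (T * T)
    kl_b = F.kl_div(log_p_b, F.softmax(logits_a.detach() / T, dim=1),
                    reduction="batchmean") * (T * T)
    return ce_a + kl_a, ce_b + kl_b
```

Each network's optimizer steps on its own combined loss, so the two peers distill knowledge into each other rather than from a fixed pretrained teacher.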
| Main Author: | HUANG Zhong-hao, YANG Xing-yao, YU Jiong, GUO Liang, LI Xiang |
|---|---|
| Format: | Article |
| Language: | Chinese (zho) |
| Published: | Editorial Office of Computer Science, 2022-10-01 |
| Series: | Jisuanji kexue |
| Online Access: | https://www.jsjkx.com/fileup/1002-137X/PDF/1002-137X-2022-49-10-169.pdf |
Similar Items

- Improving Adversarial Robustness via Distillation-Based Purification
  by: Inhwa Koo, et al.
  Published: (2023-10-01)
- Improving Deep Mutual Learning via Knowledge Distillation
  by: Achmad Lukman, et al.
  Published: (2022-08-01)
- A Virtual Knowledge Distillation via Conditional GAN
  by: Sihwan Kim
  Published: (2022-01-01)
- Adversarial Optimization-Based Knowledge Transfer of Layer-Wise Dense Flow for Image Classification
  by: Doyeob Yeo, et al.
  Published: (2021-04-01)
- GAN-Knowledge Distillation for One-Stage Object Detection
  by: Wanwei Wang, et al.
  Published: (2020-01-01)