EXPERIMENTAL STUDY OF SOME PROPERTIES OF KNOWLEDGE DISTILLATION
For more complex classification problems, it is inevitable to use increasingly complex and cumbersome classifying models. However, we often do not have the space or processing power to deploy them. Knowledge distillation is an effective way to improve the accuracy of an otherwise smal...
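The distillation objective the abstract alludes to — training a small student on the softened outputs of a large teacher — can be sketched as follows. This is a minimal NumPy illustration of the standard technique, not the paper's implementation; the temperature `T` and weighting `alpha` are assumed illustrative values.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces a softer distribution.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label, T=2.0, alpha=0.5):
    """Weighted sum of the soft-target cross-entropy (student vs. teacher,
    both softened at temperature T) and the hard-target cross-entropy
    (student vs. the true label)."""
    p_teacher = softmax(teacher_logits, T)
    p_student_soft = softmax(student_logits, T)
    p_student = softmax(student_logits, 1.0)
    # Soft loss is scaled by T^2 so its gradient magnitude stays comparable
    # to the hard loss as T varies (the usual convention).
    soft_loss = -np.sum(p_teacher * np.log(p_student_soft + 1e-12)) * T * T
    hard_loss = -np.log(p_student[true_label] + 1e-12)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

In practice the student is trained by minimizing this loss over mini-batches; a student whose logits agree with the teacher's incurs a lower soft-target term than one that diverges.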
Main Authors: Ádám SZIJÁRTÓ, Péter LEHOTAY-KÉRY, Attila KISS
Format: Article
Language: English
Published: Babes-Bolyai University, Cluj-Napoca, 2020-10-01
Series: Studia Universitatis Babes-Bolyai: Series Informatica
Online Access: http://193.231.18.162/index.php/subbinformatica/article/view/3884
Similar Items
- Quantization Robust Pruning With Knowledge Distillation
  by: Jangho Kim
  Published: (2023-01-01)
- Review of Recent Distillation Studies
  by: Gao Minghong
  Published: (2023-01-01)
- A Fine-Grained Bird Classification Method Based on Attention and Decoupled Knowledge Distillation
  by: Kang Wang, et al.
  Published: (2023-01-01)
- BUILDING, VISUALIZING AND EXECUTING DEEP LEARNING MODELS AS DATAFLOW GRAPHS
  by: Gábor KRUPPAI, et al.
  Published: (2020-08-01)
- A Novel Methodology for Measuring the Abstraction Capabilities of Image Recognition Algorithms
  by: Márton Gyula Hudáky, et al.
  Published: (2021-08-01)