Compressing medical deep neural network models for edge devices using knowledge distillation
Recently, deep neural networks (DNNs) have been used successfully in many fields, particularly in medical diagnosis. However, deep learning (DL) models are expensive in terms of memory and computing resources, which hinders their deployment on resource-limited devices or in delay-sensitive sy...
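The abstract names knowledge distillation as the compression technique. As a minimal illustration of the standard approach (Hinton et al., 2015), not the specific method of this paper, the sketch below computes the distillation loss: the KL divergence between the teacher's and student's temperature-softened output distributions. All function names and the example logits are illustrative assumptions.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence KL(teacher || student) over softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# Hypothetical logits: a confident teacher and a smaller, less sharp student.
teacher = [6.0, 1.0, 0.5]
student = [4.0, 2.0, 1.0]
loss = distillation_loss(student, teacher)
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a hyperparameter, and the student is trained to minimize the sum.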
| Main authors | F. MohiEldeen Alabbasy, A.S. Abohamama, Mohammed F. Alrahmawy |
|---|---|
| Format | Article |
| Language | English |
| Published | Elsevier, 2023-07-01 |
| Collection | Journal of King Saud University: Computer and Information Sciences |
| Online access | http://www.sciencedirect.com/science/article/pii/S1319157823001702 |
Related records
- A survey on knowledge distillation: Recent advancements
  by: Amir Moslemi, et al. Published: 2024-12-01
- Automated Arrhythmia Classification System: Proof-of-Concept With Lightweight Model on an Ultra-Edge Device
  by: Namho Kim, et al. Published: 2024-01-01
- Compressing recognition network of cotton disease with spot-adaptive knowledge distillation
  by: Xinwen Zhang, et al. Published: 2024-09-01
- Heterogeneous Knowledge Distillation Using Conceptual Learning
  by: Yerin Yu, et al. Published: 2024-01-01
- Instance-Level Scaling and Dynamic Margin-Alignment Knowledge Distillation for Remote Sensing Image Scene Classification
  by: Chuan Li, et al. Published: 2024-10-01