Feature fusion-based collaborative learning for knowledge distillation
Deep neural networks have achieved great success in a variety of applications, such as self-driving cars and intelligent robotics. Meanwhile, knowledge distillation has received increasing attention as an effective model compression technique for training very efficient deep models. The performanc...
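As background for the abstract's reference to knowledge distillation as a model compression technique, a minimal sketch of the standard (Hinton-style) distillation objective is given below: a temperature-softened KL term between teacher and student logits plus a cross-entropy term on the hard labels. The temperature `T`, weight `alpha`, and function names are illustrative assumptions, not the feature fusion method proposed in this article.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style distillation loss (sketch):
    alpha * T^2 * KL(teacher || student) + (1 - alpha) * CE(student, labels)."""
    p_t = softmax(teacher_logits, T)  # softened teacher probabilities
    p_s = softmax(student_logits, T)  # softened student probabilities
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * hard))

# When the student matches the teacher exactly, the KL term vanishes
# and only the hard-label cross-entropy remains.
logits = np.array([[2.0, 0.5, -1.0]])
assert kd_loss(logits, logits, np.array([0])) >= 0.0
```

The `T ** 2` factor is the usual gradient-scale correction so the soft and hard terms stay comparable as the temperature changes.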
| Main Authors | Yiting Li, Liyuan Sun, Jianping Gou, Lan Du, Weihua Ou |
|---|---|
| Format | Article |
| Language | English |
| Published | Wiley, 2021-11-01 |
| Series | International Journal of Distributed Sensor Networks |
| Online Access | https://doi.org/10.1177/15501477211057037 |
Similar Items
- Multistage feature fusion knowledge distillation
  by: Gang Li, et al. Published: (2024-06-01)
- Knowledge distillation based on multi-layer fusion features
  by: Shengyuan Tan, et al. Published: (2023-01-01)
- Preface to the Special Issue “Advancement of Mathematical Methods in Feature Representation Learning for Artificial Intelligence, Data Mining and Robotics”—Special Issue Book
  by: Weihua Ou, et al. Published: (2023-02-01)
- From sensor fusion to knowledge distillation in collaborative LIBS and hyperspectral imaging for mineral identification
  by: Tomás Lopes, et al. Published: (2024-04-01)
- Beyond Knowledge Distillation: Collaborative Learning for Bidirectional Model Assistance
  by: Jinzhuo Wang, et al. Published: (2018-01-01)