Teaching Yourself: A Self-Knowledge Distillation Approach to Action Recognition
Knowledge distillation, the process of transferring knowledge learned by a heavy network (the teacher) to a lightweight network (the student), has emerged as an effective technique for compressing neural networks. To reduce the necessity of training a large teacher network, this...
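For context, the sketch below illustrates the conventional teacher-student distillation objective that self-distillation approaches build on: the student minimizes a weighted sum of cross-entropy against the ground-truth labels and a temperature-softened KL-divergence term against the teacher's logits. This is a minimal PyTorch illustration, not the paper's self-distillation method; the function name and the hyperparameters `T` and `alpha` are illustrative choices.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften both distributions with temperature T and match them via KL divergence.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    # T*T rescales gradients so the soft term stays comparable across temperatures.
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the hard ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

In a self-distillation setting such as the one this article proposes, the separate pretrained teacher would be removed and the soft targets would come from the network itself, which is what eliminates the cost of training a large teacher.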
Main Authors: Duc-Quang Vu, Ngan Le, Jia-Ching Wang
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9495804/
Similar Items
- Focal Channel Knowledge Distillation for Multi-Modality Action Recognition, by Lipeng Gan, et al. Published: (2023-01-01)
- 3D network with channel excitation and knowledge distillation for action recognition, by Zhengping Hu, et al. Published: (2023-03-01)
- Knowledge Distillation in Video-Based Human Action Recognition: An Intuitive Approach to Efficient and Flexible Model Training, by Fernando Camarena, et al. Published: (2024-03-01)
- A survey on knowledge distillation: Recent advancements, by Amir Moslemi, et al. Published: (2024-12-01)
- Privacy-Safe Action Recognition via Cross-Modality Distillation, by Yuhyun Kim, et al. Published: (2024-01-01)