Teaching Yourself: A Self-Knowledge Distillation Approach to Action Recognition

Knowledge distillation, the process of transferring complex knowledge learned by a heavy network (the teacher) to a lightweight network (the student), has emerged as an effective technique for compressing neural networks. To reduce the necessity of training a large teacher network, thi...
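The abstract describes the core mechanism of knowledge distillation: matching a student's softened predictions to a teacher's soft targets. As a minimal sketch (not the paper's specific method; the function names and temperature value are illustrative), the standard distillation loss can be written as a temperature-scaled KL divergence; in a self-distillation setting, the "teacher" logits would come from the same network, e.g., its own earlier predictions:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 (the standard Hinton-style distillation loss).
    p = softmax(teacher_logits, T)  # soft targets
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()

# In self-knowledge distillation there is no separate heavy teacher:
# teacher_logits would be produced by the same network, for instance
# from an earlier training epoch or a differently augmented view.
```

The loss is zero when student and teacher distributions agree and positive otherwise, which is what drives the student toward the soft targets.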


Bibliographic Details
Main Authors: Duc-Quang Vu, Ngan Le, Jia-Ching Wang
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9495804/