Knowledge Distillation With Feature Self Attention

With the rapid development of deep learning, network size and performance continue to grow, making network compression essential for commercial applications. In this paper, we propose a Feature Self Attention (FSA) module that extracts correlation information between the hidde...
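The record only summarizes the idea, so as a rough illustration here is a generic self-attention over per-layer feature descriptors; the function name, shapes, and scaled-dot-product formulation are assumptions, not the paper's actual FSA design.

```python
import numpy as np

def feature_self_attention(features):
    """Illustrative sketch (not the paper's exact FSA): compute
    attention weights from pairwise correlations between hidden-layer
    feature descriptors and aggregate them.

    features: (L, D) array, one D-dimensional descriptor per layer.
    Returns an (L, D) array of correlation-weighted features.
    """
    L, D = features.shape
    # Pairwise similarity (scaled dot product) between layer features
    scores = features @ features.T / np.sqrt(D)           # (L, L)
    # Row-wise softmax to turn similarities into attention weights
    scores = scores - scores.max(axis=1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    # Aggregate correlated information across layers
    return weights @ features                             # (L, D)

rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8))   # 4 hypothetical layers, 8-dim each
out = feature_self_attention(feats)
print(out.shape)                      # (4, 8)
```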


Bibliographic Details
Main Authors: Sin-Gu Park, Dong-Joong Kang
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10093872/