A non-negative feedback self-distillation method for salient object detection
Self-distillation methods use a Kullback–Leibler (KL) divergence loss to transfer knowledge from the network to itself, which can improve model performance without adding computational resources or complexity. However, when applied to salient object detection (SOD), it is difficult to ef...
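A minimal PyTorch sketch of the kind of KL-based self-distillation loss the abstract refers to: two predictions from the same network (for example, a deep branch acting as teacher and a shallow branch as student) are softened with a temperature and matched under KL divergence. The function name, temperature value, and the choice of which branch plays teacher are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def self_distillation_kl_loss(student_logits: torch.Tensor,
                              teacher_logits: torch.Tensor,
                              temperature: float = 4.0) -> torch.Tensor:
    """Minimal KL-divergence distillation loss (illustrative sketch).

    Both logit tensors come from the same network, so no separate
    teacher model is needed; the teacher branch is detached so that
    only the student branch receives distillation gradients.
    """
    # Soften both distributions with the temperature.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1).detach()
    # 'batchmean' matches the mathematical definition of KL divergence;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher,
                    reduction="batchmean") * temperature ** 2
```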
Main Authors: Lei Chen, Tieyong Cao, Yunfei Zheng, Jibin Yang, Yang Wang, Yekui Wang, Bo Zhang
Format: Article
Language: English
Published: PeerJ Inc., 2023-06-01
Series: PeerJ Computer Science
Online Access: https://peerj.com/articles/cs-1435.pdf
Similar Items
- Kullback–Leibler Divergence of Sleep-Wake Patterns Related with Depressive Severity in Patients with Epilepsy
  by: Mingsu Liu, et al.
  Published: (2023-05-01)
- Dynamic Knowledge Distillation with Noise Elimination for RGB-D Salient Object Detection
  by: Guangyu Ren, et al.
  Published: (2022-08-01)
- Rotating Object Detection for Cranes in Transmission Line Scenarios
  by: Lingzhi Xia, et al.
  Published: (2023-12-01)
- Article Omission in Dutch Children with SLI: A Processing Approach
  by: Lizet van Ewijk, et al.
  Published: (2010-04-01)
- Entropy and the Kullback–Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation
  by: Marco Scutari
  Published: (2024-01-01)