A non-negative feedback self-distillation method for salient object detection

Self-distillation methods use a Kullback-Leibler (KL) divergence loss to transfer knowledge from the network to itself, which can improve model performance without increasing computational resources or complexity. However, when applied to salient object detection (SOD), it is difficult to ef...
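The KL-divergence loss mentioned in the abstract is the standard distillation objective: soften both sets of logits with a temperature, then penalize the divergence between the two distributions. A minimal sketch follows; the function names and the temperature value are illustrative assumptions, not taken from the paper itself.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: larger T yields a softer distribution.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as is conventional in knowledge distillation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Identical logits give zero loss; the loss grows as predictions diverge.
```

In self-distillation the "teacher" logits come from the same network (e.g., a deeper branch or an earlier snapshot) rather than a separate model, so no extra inference-time cost is incurred.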


Bibliographic Details
Main Authors: Lei Chen, Tieyong Cao, Yunfei Zheng, Jibin Yang, Yang Wang, Yekui Wang, Bo Zhang
Format: Article
Language: English
Published: PeerJ Inc. 2023-06-01
Series: PeerJ Computer Science
Online Access: https://peerj.com/articles/cs-1435.pdf
