Catastrophic Forgetting Problem in Semi-Supervised Semantic Segmentation

Restricted by the cost of generating labels for training, semi-supervised methods have been applied to semantic segmentation tasks and have achieved varying degrees of success. Recent semi-supervised learning methods take pseudo supervision as their core idea, especially self-training methods that generate pseudo labels. However, pseudo labels are noisy. In semi-supervised learning, as training progresses, the model must attend to more semantic classes and becomes biased toward the newly learned classes. Moreover, because the amount of labeled data is limited, it is difficult for the model to "stabilize" the knowledge it has learned. This raises the issue of the model forgetting previously learned knowledge. Based on this view, we point out that alleviating "catastrophic forgetting" in the model is beneficial for enhancing the quality of pseudo labels, and we propose a pseudo label enhancement strategy. In this strategy, the pseudo labels generated by the previous model are used to rehearse previously learned knowledge. Additionally, conflict reduction is proposed to resolve conflicts between the pseudo labels generated by the previous and current models. We evaluate our scheme on two standard semi-supervised semantic segmentation benchmarks and achieve state-of-the-art performance on both. Our code is released at https://github.com/wing212/DMT-PLE.
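
As a rough illustration of the rehearsal and conflict-reduction idea described above, the PyTorch sketch below merges the pseudo labels produced by a previous-round model with those of the current model. The function name, the confidence threshold, and the rule of keeping the more confident prediction on conflicting pixels are assumptions made for illustration only; they are not taken from the released DMT-PLE code.

import torch

IGNORE_INDEX = 255  # value commonly used to mark pixels excluded from the loss

def merge_pseudo_labels(prev_logits, curr_logits, conf_threshold=0.9):
    """Combine pseudo labels from the previous and current models.

    prev_logits, curr_logits: (N, C, H, W) class scores from the two models.
    Returns an (N, H, W) pseudo-label map; pixels where neither model is
    confident are set to IGNORE_INDEX.
    """
    prev_prob, prev_label = torch.softmax(prev_logits, dim=1).max(dim=1)
    curr_prob, curr_label = torch.softmax(curr_logits, dim=1).max(dim=1)

    # Start from the current model's pseudo labels.
    merged = curr_label.clone()

    # Rehearsal with conflict reduction: where the two models disagree,
    # keep whichever prediction is more confident, so knowledge carried by
    # the previous model's labels is not silently overwritten.
    conflict = prev_label != curr_label
    prefer_prev = conflict & (prev_prob > curr_prob)
    merged[prefer_prev] = prev_label[prefer_prev]

    # Drop pixels where both models are unsure.
    unsure = (prev_prob < conf_threshold) & (curr_prob < conf_threshold)
    merged[unsure] = IGNORE_INDEX
    return merged

In a self-training loop, the merged map would then supervise the unlabeled images in the next round, with IGNORE_INDEX pixels excluded from the cross-entropy loss.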

Bibliographic Details
Main Authors: Yan Zhou (ORCID 0000-0002-2372-4947), Ruyi Jiao, Dongli Wang, Jinzhen Mu, Jianxun Li
Affiliations: Yan Zhou, Ruyi Jiao, and Dongli Wang: School of Automation and Electronic Information, Xiangtan University, Xiangtan, China; Jinzhen Mu: Shanghai Aerospace Control Technology Institute, Shanghai, China; Jianxun Li: School of Electronics and Information Technology, Shanghai Jiao Tong University, Shanghai, China
Format: Article
Language: English
Published: IEEE, 2022-01-01
Series: IEEE Access, vol. 10, pp. 48855-48864
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3172664
Subjects: Catastrophic forgetting problem; Noisy pseudo label; Pseudo label enhancement strategy; Semi-supervised semantic segmentation
Online Access: https://ieeexplore.ieee.org/document/9768798/