Adaptive Threshold Hierarchical Incremental Learning Method

Bibliographic Details
Main Authors: Xingyu Li, Shengbo Dong, Qiya Su, Muyao Yu, Xinzhi Li
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Subjects: Incremental learning, convolutional neural network, deep learning
Online Access: https://ieeexplore.ieee.org/document/10038576/
author Xingyu Li
Shengbo Dong
Qiya Su
Muyao Yu
Xinzhi Li
collection DOAJ
description Traditional deep convolutional neural networks have achieved excellent performance on a wide range of machine learning tasks. However, they perform poorly in continuous data stream environments: a model trained on new datasets often suffers a significant drop in performance on old datasets, a phenomenon known as “catastrophic forgetting.” Incremental learning mitigates catastrophic forgetting by learning new knowledge while retaining what has already been learned. In practice, incremental learning algorithms usually need to be deployed on edge devices with limited memory and restricted access to training data, where they face high model complexity and an imbalance between old and new categories of data. We propose an Adaptive Threshold Hierarchical Incremental Learning (ATHIL) method to address these problems. ATHIL requires no additional data or model storage during training; it combines local weight dispersion coefficient thresholding with the mean nearest neighbor principle, uses a sparse hierarchical masking network, and flexibly adjusts the network structure for each task, so that multiple image classification tasks can be learned within a single network. Experimental results show that the proposed method significantly outperforms existing methods on fine-grained classification datasets under three evaluation metrics. (An illustrative sketch of the masking idea appears at the end of this record.)
format Article
id doaj.art-8416e5e5a7004c58b1ac3387d07e2512
institution Directory Open Access Journal
issn 2169-3536
language English
publishDate 2023-01-01
publisher IEEE
series IEEE Access
doi 10.1109/ACCESS.2023.3242688
volume 11
pages 12285-12293
orcid Xingyu Li: https://orcid.org/0000-0002-8464-0106
affiliation Beijing Institute of Remote Sensing Equipment, Beijing, China (all five authors)
title Adaptive Threshold Hierarchical Incremental Learning Method
topic Incremental learning
convolutional neural network
deep learning
url https://ieeexplore.ieee.org/document/10038576/
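The abstract describes ATHIL's adaptive, per-layer weight masking only at a high level. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes the adaptive threshold for each convolutional filter is derived from the dispersion coefficient (std/mean) of that filter's absolute weights, and the names adaptive_threshold_mask and k are hypothetical.

    import numpy as np

    def adaptive_threshold_mask(weights, k=1.0):
        """Illustrative sketch: build a per-filter binary mask from an
        adaptive threshold based on the dispersion coefficient
        (std/mean) of each filter's absolute weights. Hypothetical
        names and formula; not the published ATHIL implementation."""
        flat = np.abs(weights).reshape(weights.shape[0], -1)
        mean = flat.mean(axis=1, keepdims=True)
        std = flat.std(axis=1, keepdims=True)
        cv = std / (mean + 1e-12)       # local dispersion coefficient
        thresh = mean * (1.0 + k * cv)  # assumed adaptive threshold
        return (flat > thresh).reshape(weights.shape)

    # Toy usage: select weights for a new task in one conv layer.
    rng = np.random.default_rng(0)
    conv_w = rng.normal(size=(16, 3, 3, 3))
    mask = adaptive_threshold_mask(conv_w, k=0.5)
    print(mask.mean())  # fraction of weights assigned to the new task

Under this reading, the weights selected by the mask would be trained for the new task while the complementary set stays frozen for earlier tasks; sparse per-task masks of this kind let a single network learn several tasks without storing old data or extra model copies, consistent with the storage claims in the abstract.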