Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks
Catastrophic forgetting, the rapid loss of previously learned representations while learning new data, is one of the main problems of deep neural networks. In this paper, we propose a novel incremental learning framework that addresses the forgetting problem by learning new incoming data in an online manner, so that the network can learn additional data or new classes with less catastrophic forgetting. We adapt the hippocampal memory process to deep neural networks by defining the effective maximum of neural activation and its boundary to represent a feature distribution. In addition, we incorporate incremental QR factorization into the deep neural networks to learn new data, under both existing and new labels, with less forgetting. QR factorization provides an accurate subspace prior, and its incremental form captures how new data relate to both existing classes and new classes. In our framework, a set of appropriate features (i.e., nodes) provides an improved representation for each class. We apply our method to a convolutional neural network (CNN) trained on the CIFAR-100 and CIFAR-10 datasets. The experimental results show that the proposed method efficiently alleviates the stability and plasticity dilemma in deep neural networks, preserving the performance of a trained network while effectively learning unseen data and additional new classes.
Main Authors: Jonghong Kim, WonHee Lee, Sungdae Baek, Jeong-Ho Hong, Minho Lee
Format: Article
Language: English
Published: MDPI AG, 2023-09-01
Series: Sensors
Subjects: image processing; incremental learning; convolutional neural network; deep learning; artificial intelligence; compressed sensing
Online Access: https://www.mdpi.com/1424-8220/23/19/8117
_version_ | 1827722172638953472 |
author | Jonghong Kim; WonHee Lee; Sungdae Baek; Jeong-Ho Hong; Minho Lee
author_facet | Jonghong Kim; WonHee Lee; Sungdae Baek; Jeong-Ho Hong; Minho Lee
author_sort | Jonghong Kim |
collection | DOAJ |
description | Catastrophic forgetting, the rapid loss of previously learned representations while learning new data, is one of the main problems of deep neural networks. In this paper, we propose a novel incremental learning framework that addresses the forgetting problem by learning new incoming data in an online manner, so that the network can learn additional data or new classes with less catastrophic forgetting. We adapt the hippocampal memory process to deep neural networks by defining the effective maximum of neural activation and its boundary to represent a feature distribution. In addition, we incorporate incremental QR factorization into the deep neural networks to learn new data, under both existing and new labels, with less forgetting. QR factorization provides an accurate subspace prior, and its incremental form captures how new data relate to both existing classes and new classes. In our framework, a set of appropriate features (i.e., nodes) provides an improved representation for each class. We apply our method to a convolutional neural network (CNN) trained on the CIFAR-100 and CIFAR-10 datasets. The experimental results show that the proposed method efficiently alleviates the stability and plasticity dilemma in deep neural networks, preserving the performance of a trained network while effectively learning unseen data and additional new classes. |
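The description refers to incremental QR factorization as the mechanism for absorbing new data without discarding the existing subspace, but does not spell out the update itself. As a rough illustration only, the following NumPy sketch shows the standard column-append update that incremental QR methods build on; the function name, shapes, and test data are assumptions for the sketch, not the paper's implementation.

```python
import numpy as np

def qr_append_column(Q, R, a):
    """Extend a thin QR factorization A = Q @ R when a new column a arrives.

    One classical Gram-Schmidt step: express a in the current orthonormal
    basis, orthogonalize the leftover residual, and grow Q by one column
    and R by one column and one row. Cost is O(mn) instead of refactoring.
    """
    r = Q.T @ a                      # coefficients of a in span(Q)
    q = a - Q @ r                    # residual orthogonal to span(Q)
    rho = np.linalg.norm(q)          # assumes a is not already in span(Q)
    Q_new = np.column_stack([Q, q / rho])
    bottom = np.zeros((1, R.shape[1] + 1))
    bottom[0, -1] = rho
    R_new = np.vstack([np.hstack([R, r[:, None]]), bottom])
    return Q_new, R_new

# Usage: factor an initial (hypothetical) feature matrix, then add a column.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
Q, R = np.linalg.qr(A)               # thin QR: Q is 6x3, R is 3x3
a = rng.standard_normal(6)           # new incoming feature vector
Q2, R2 = qr_append_column(Q, R, a)
A2 = np.column_stack([A, a])
assert np.allclose(Q2 @ R2, A2)            # factorization still exact
assert np.allclose(Q2.T @ Q2, np.eye(4))   # columns stay orthonormal
```

Because only the new column is orthogonalized, the basis spanning the old data is untouched, which is the property that lets the subspace prior for existing classes survive while new classes are added.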
first_indexed | 2024-03-10T21:35:14Z |
format | Article |
id | doaj.art-03c4de3ce1a44e2bb9d526e797dcab56 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-03-10T21:35:14Z |
publishDate | 2023-09-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-03c4de3ce1a44e2bb9d526e797dcab56 | 2023-11-19T15:02:56Z | eng | MDPI AG | Sensors | 1424-8220 | 2023-09-01 | vol. 23, no. 19, art. 8117 | 10.3390/s23198117 | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks
Jonghong Kim: Department of Neurology, Keimyung University Dongsan Hospital, Keimyung University School of Medicine, Daegu 42601, Republic of Korea
WonHee Lee: Department of Neurology, Keimyung University Dongsan Hospital, Keimyung University School of Medicine, Daegu 42601, Republic of Korea
Sungdae Baek: Graduate School of Artificial Intelligence, Kyungpook National University, Daegu 41566, Republic of Korea
Jeong-Ho Hong: Department of Neurology, Keimyung University Dongsan Hospital, Keimyung University School of Medicine, Daegu 42601, Republic of Korea
Minho Lee: Graduate School of Artificial Intelligence, Kyungpook National University, Daegu 41566, Republic of Korea
Catastrophic forgetting, the rapid loss of previously learned representations while learning new data, is one of the main problems of deep neural networks. In this paper, we propose a novel incremental learning framework that addresses the forgetting problem by learning new incoming data in an online manner, so that the network can learn additional data or new classes with less catastrophic forgetting. We adapt the hippocampal memory process to deep neural networks by defining the effective maximum of neural activation and its boundary to represent a feature distribution. In addition, we incorporate incremental QR factorization into the deep neural networks to learn new data, under both existing and new labels, with less forgetting. QR factorization provides an accurate subspace prior, and its incremental form captures how new data relate to both existing classes and new classes. In our framework, a set of appropriate features (i.e., nodes) provides an improved representation for each class. We apply our method to a convolutional neural network (CNN) trained on the CIFAR-100 and CIFAR-10 datasets. The experimental results show that the proposed method efficiently alleviates the stability and plasticity dilemma in deep neural networks, preserving the performance of a trained network while effectively learning unseen data and additional new classes.
Online Access: https://www.mdpi.com/1424-8220/23/19/8117
Topics: image processing; incremental learning; convolutional neural network; deep learning; artificial intelligence; compressed sensing |
spellingShingle | Jonghong Kim; WonHee Lee; Sungdae Baek; Jeong-Ho Hong; Minho Lee; Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks; Sensors; image processing; incremental learning; convolutional neural network; deep learning; artificial intelligence; compressed sensing
title | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_full | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_fullStr | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_full_unstemmed | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_short | Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks |
title_sort | incremental learning for online data using qr factorization on convolutional neural networks |
topic | image processing; incremental learning; convolutional neural network; deep learning; artificial intelligence; compressed sensing
url | https://www.mdpi.com/1424-8220/23/19/8117 |
work_keys_str_mv | AT jonghongkim incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks AT wonheelee incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks AT sungdaebaek incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks AT jeonghohong incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks AT minholee incrementallearningforonlinedatausingqrfactorizationonconvolutionalneuralnetworks |