An Efficient Strategy for Catastrophic Forgetting Reduction in Incremental Learning
Deep neural networks (DNNs) have achieved outstanding results in a wide variety of domains. Deep learning tasks require sufficiently large datasets to train effective DNN models. However, big datasets are not always available, and they are costly to build. Therefore, balanced solutions for...
Main Authors: Huong-Giang Doan, Hong-Quan Luong, Thi-Oanh Ha, Thi Thanh Thuy Pham
Format: Article
Language: English
Published: MDPI AG, 2023-05-01
Series: Electronics
Online Access: https://www.mdpi.com/2079-9292/12/10/2265
Similar Items
- Can sleep protect memories from catastrophic forgetting?
  by: Oscar C González, et al.
  Published: (2020-08-01)
- Remembering for the right reasons: Explanations reduce catastrophic forgetting
  by: Sayna Ebrahimi, et al.
  Published: (2021-12-01)
- An exhaustive survey of regular peptide conformations using a new metric for backbone handedness (h)
  by: Ranjan V. Mannige
  Published: (2017-05-01)
- Catastrophic Forgetting in Deep Graph Networks: A Graph Classification Benchmark
  by: Antonio Carta, et al.
  Published: (2022-02-01)
- IG-YOLOv5-based underwater biological recognition and detection for marine protection
  by: Huo Jialu, et al.
  Published: (2023-12-01)