Continual Learning With Speculative Backpropagation and Activation History
Continual learning is gaining traction with the explosive emergence of deep learning applications. It suffers from a severe problem called catastrophic forgetting: a trained model loses previously learned information when it is trained on new data. This pa...
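To make catastrophic forgetting concrete, the sketch below illustrates the problem the abstract describes. It is not the paper's speculative-backpropagation method; the tiny network and the two synthetic tasks are hypothetical choices for illustration. A model trained on task A and then on task B ends up near chance accuracy on task A.

```python
# Minimal sketch of catastrophic forgetting (illustration only; this is
# NOT the speculative-backpropagation method from the paper above).
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(offset):
    # Hypothetical task: Gaussian points around `offset`, labeled by
    # whether the first coordinate exceeds the offset's first coordinate.
    x = torch.randn(200, 2) + offset
    y = (x[:, 0] > offset[0]).long()
    return x, y

def accuracy(model, x, y):
    return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

task_a = make_task(torch.tensor([0.0, 0.0]))   # decision boundary near x0 = 0
task_b = make_task(torch.tensor([5.0, -5.0]))  # decision boundary near x0 = 5

# Plain sequential training: task A first, then task B, with no
# mitigation. Learning B overwrites the weights that solved A.
for x, y in (task_a, task_b):
    for _ in range(200):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

print("task A accuracy after training on B:", accuracy(model, *task_a))  # ~0.5 (forgotten)
print("task B accuracy:", accuracy(model, *task_b))                      # ~1.0
```

Running this prints roughly chance-level accuracy on task A after training on task B, which is the forgetting behavior that continual-learning methods such as the one in this article aim to prevent.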
| Main Authors: | Sangwoo Park, Taeweon Suh |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2022-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/9754561/ |
Similar Items
- Speculative Backpropagation for CNN Parallel Training
  by: Sangwoo Park, et al.
  Published: (2020-01-01)
- An Appraisal of Incremental Learning Methods
  by: Yong Luo, et al.
  Published: (2020-10-01)
- Is Class-Incremental Enough for Continual Learning?
  by: Andrea Cossu, et al.
  Published: (2022-03-01)
- Remembering for the right reasons: Explanations reduce catastrophic forgetting
  by: Sayna Ebrahimi, et al.
  Published: (2021-12-01)
- Controlled Forgetting: Targeted Stimulation and Dopaminergic Plasticity Modulation for Unsupervised Lifelong Learning in Spiking Neural Networks
  by: Jason M. Allred, et al.
  Published: (2020-01-01)