Unsupervised Learning to Overcome Catastrophic Forgetting in Neural Networks
Continual learning is the ability to acquire new tasks or knowledge without losing previously acquired information. Achieving continual learning in artificial intelligence (AI) is currently prevented by catastrophic forgetting, where training on a new task erases previously learned tasks…
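To make the phenomenon in the abstract concrete, here is a minimal, illustrative sketch (not taken from the catalogued article; it assumes PyTorch and synthetic data): a small network is trained sequentially on two tasks with no replay or regularization, and its accuracy on the first task typically collapses after training on the second.

```python
# Minimal catastrophic-forgetting demo on two synthetic binary tasks.
# Illustrative only; assumes PyTorch, not the article's own method.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Gaussian blob centered at `shift`; label is a threshold on x0,
    # so each task places its decision boundary in a different region.
    x = torch.randn(400, 2) + shift
    y = (x[:, 0] > shift[0]).long()
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

task_a = make_task(torch.tensor([0.0, 0.0]))
task_b = make_task(torch.tensor([4.0, -4.0]))

for name, (x, y) in [("A", task_a), ("B", task_b)]:
    for _ in range(200):  # sequential training: task A first, then task B
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    print(f"after task {name}: acc A={accuracy(model, *task_a):.2f}, "
          f"acc B={accuracy(model, *task_b):.2f}")
# Accuracy on task A typically drops toward chance after training on task B:
# gradients from the new task overwrite the weights that encoded the old one.
```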
| Main Authors: | Irene Munoz-Martin, Stefano Bianchi, Giacomo Pedretti, Octavian Melnic, Stefano Ambrogio, Daniele Ielmini |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2019-01-01 |
| Series: | IEEE Journal on Exploratory Solid-State Computational Devices and Circuits |
| Online Access: | https://ieeexplore.ieee.org/document/8693665/ |
Similar Items
- Controlled Forgetting: Targeted Stimulation and Dopaminergic Plasticity Modulation for Unsupervised Lifelong Learning in Spiking Neural Networks
  by: Jason M. Allred, et al.
  Published: (2020-01-01)
- Natural Way to Overcome Catastrophic Forgetting in Neural Networks
  by: Alexey Kutalev
  Published: (2020-09-01)
- Non-linear Memristive Synaptic Dynamics for Efficient Unsupervised Learning in Spiking Neural Networks
  by: Stefano Brivio, et al.
  Published: (2021-02-01)
- Remembering for the right reasons: Explanations reduce catastrophic forgetting
  by: Sayna Ebrahimi, et al.
  Published: (2021-12-01)
- Mitigating Catastrophic Forgetting with Complementary Layered Learning
  by: Sean Mondesire, et al.
  Published: (2023-01-01)