Preventing catastrophic forgetting in continual learning

Bibliographic Details
Main Author: Ong, Yi Shen
Other Authors: Lin Guosheng
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2022
Subjects:
Online Access:https://hdl.handle.net/10356/162924
Description
Summary: Continual learning in neural networks has been receiving increased interest due to the growing prevalence of machine learning across industries. Catastrophic forgetting, in which a model forgets previously learned tasks upon learning new ones, remains a major roadblock to making neural networks truly life-long learners. A series of tests was conducted on the effectiveness of replay buffers filled with old training data, rehearsed alongside new data, as a way of mitigating forgetting. The results show that increasing the buffer size does help mitigate forgetting, at the cost of the additional storage used.
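
The buffer-based rehearsal described in the summary can be illustrated with a minimal sketch. The snippet below assumes a PyTorch classifier, a reservoir-sampled buffer, and a combined cross-entropy loss over new and replayed examples; these choices are illustrative assumptions, not the exact setup used in the project.

```python
# Minimal rehearsal sketch (assumed setup, not the project's exact implementation):
# a fixed-capacity buffer of past (input, label) pairs is replayed alongside
# each new batch so gradients also preserve performance on earlier tasks.
import random
import torch
import torch.nn.functional as F


class ReplayBuffer:
    """Fixed-capacity store of past (input, label) pairs via reservoir sampling."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []   # list of (x, y) tensor pairs
        self.seen = 0    # total examples offered to the buffer so far

    def add(self, x, y):
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.data) < self.capacity:
                self.data.append((xi.clone(), yi.clone()))
            else:
                # Reservoir sampling keeps a uniform sample of all examples seen.
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.data[j] = (xi.clone(), yi.clone())

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)


def train_task(model, optimizer, loader, buffer, replay_batch=32):
    """Train on one task's DataLoader while rehearsing stored examples."""
    model.train()
    for x, y in loader:
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x), y)
        if buffer.data:
            # Mix in old examples so the update does not overwrite past tasks.
            bx, by = buffer.sample(replay_batch)
            loss = loss + F.cross_entropy(model(bx), by)
        loss.backward()
        optimizer.step()
        buffer.add(x, y)
```

Under this scheme, a larger `capacity` retains a broader sample of earlier tasks and so reduces forgetting, at the direct cost of the extra memory needed to store those examples, which matches the trade-off reported in the summary.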