Preventing catastrophic forgetting in continual learning

Continual learning in neural networks has received increasing interest as machine learning spreads across more industries. Catastrophic forgetting, in which a model forgets old tasks upon learning new ones, remains a major roadblock to making neural networks truly life-long learners. A series of tests was conducted on the effectiveness of buffers filled with old training data as a way of mitigating forgetting, by training on them alongside new data. The results show that increasing the size of the buffer does help mitigate forgetting, at the cost of the additional storage used.
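
The abstract describes a rehearsal-style approach: a bounded buffer of past training examples is replayed alongside each batch of new-task data. The record does not spell out an implementation, so the sketch below is only a minimal illustration in PyTorch, assuming a classification model, with reservoir sampling as one common way to keep a fixed-size buffer; names such as ReplayBuffer and train_step are illustrative and not taken from the thesis.

```python
import random
import torch
import torch.nn.functional as F

class ReplayBuffer:
    """Fixed-capacity store of past (input, label) pairs."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # Reservoir sampling: every example seen so far is kept
            # with equal probability once the buffer is full.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def train_step(model, optimizer, x_new, y_new, buffer, replay_batch=32):
    """One update on new-task data mixed with replayed examples from old tasks."""
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_new), y_new)
    if buffer.data:
        x_old, y_old = buffer.sample(replay_batch)
        # Rehearse old examples alongside the new batch.
        loss = loss + F.cross_entropy(model(x_old), y_old)
    loss.backward()
    optimizer.step()
    # Store the new examples so later tasks can rehearse them.
    for x, y in zip(x_new, y_new):
        buffer.add(x.detach(), y.detach())
    return loss.item()
```

In this setup the buffer capacity is the knob the abstract refers to: a larger buffer retains more of the old tasks' data and so forgets less, at the price of the extra memory it occupies.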

Bibliographic Details
Main Author: Ong, Yi Shen
Other Authors: Lin Guosheng (gslin@ntu.edu.sg)
School: School of Computer Science and Engineering
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2022
Degree: Bachelor of Engineering (Computer Science)
Project code: SCSE21-0626
Subjects: Engineering::Computer science and engineering
Online Access: https://hdl.handle.net/10356/162924
Citation: Ong, Y. S. (2022). Preventing catastrophic forgetting in continual learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/162924