Learning and memorization via predictive coding

Neural networks trained with backpropagation have achieved impressive results over the last decade. However, training such models requires sequential backward updates and non-local computations, which makes them hard to parallelize at scale and to implement on novel hardware, and is unlike how learning works in the brain. Neuroscience-inspired learning algorithms, such as predictive coding, have the potential to overcome these limitations and advance beyond current deep learning technologies. This potential, however, has only recently gained the attention of the community, and as a consequence the properties of these algorithms are still underexplored. In this thesis, I aim to fill this gap by exploring three properties of predictive coding: first, there exists a variant of predictive coding that is equivalent to backpropagation in supervised learning; second, predictive coding can implement powerful associative memories; and third, it can train neural networks on graphs of any topology. The first result implies that predictive coding networks can be as accurate as standard ones on supervised learning tasks; the last two, that they can perform tasks with a robustness and flexibility that standard deep learning models lack. I conclude by discussing future directions of research, such as neural architecture search, novel hardware implementations, and implications for neuroscience. All in all, the results presented in this thesis are consistent with recent trends in the literature, which show that neuroscience-inspired learning methods may have interesting machine learning properties and should be considered a valid alternative to backpropagation.
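To make the locality claim concrete, below is a minimal sketch of a standard supervised predictive coding update in the style of the Rao-Ballard / Whittington-Bogacz formulation; it is not the specific algorithms developed in the thesis, and all layer sizes, learning rates, and names are illustrative. Each layer predicts the one above it, the hidden activities relax by descending the prediction-error energy, and the weights then receive purely local, Hebbian-like updates.

    import numpy as np

    # Minimal predictive coding network for supervised learning.
    # Hypothetical sizes and hyperparameters, for illustration only.
    rng = np.random.default_rng(0)
    sizes = [4, 8, 2]                                  # input, hidden, output
    W = [rng.normal(0.0, 0.1, (sizes[l + 1], sizes[l])) for l in range(2)]

    def f(z):                                          # activation
        return np.tanh(z)

    def df(z):                                         # its derivative
        return 1.0 - np.tanh(z) ** 2

    def train_step(x_in, y, lr_x=0.1, lr_w=0.01, T=20):
        # Value nodes: input and output are clamped, the hidden layer is free.
        x = [x_in, np.zeros(sizes[1]), y]
        for _ in range(T):
            # Prediction error at each layer vs. the prediction from below.
            eps = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]
            # Relax the hidden layer by gradient descent on the energy
            # E = 0.5 * sum_l ||eps_l||^2; the update uses only local signals.
            x[1] -= lr_x * (eps[0] - df(x[1]) * (W[1].T @ eps[1]))
        # After relaxation, weights get a local, Hebbian-like update.
        eps = [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]
        for l in range(2):
            W[l] += lr_w * np.outer(eps[l], f(x[l]))

    # Example: one update on a random input/target pair.
    train_step(rng.normal(size=4), np.array([1.0, 0.0]))

Because every update depends only on a node's neighbors, all layers can in principle be updated in parallel, which is the contrast with backpropagation's sequential backward pass drawn in the abstract above.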

Bibliographic Details
Main Author: Salvatori, T
Other Authors: Lukasiewicz, T; Ceylan, I
Format: Thesis
Language: English
Published: 2022
Institution: University of Oxford
Subjects: Deep learning (Machine learning); cognitive science