Recurrent predictive coding models for associative memory employing covariance learning

The computational principles adopted by the hippocampus in associative memory (AM) tasks have been one of the most studied topics in computational and theoretical neuroscience. Recent theories suggest that AM and the predictive activities of the hippocampus can be described within a unitary account, and that predictive coding (PC) underlies the computations supporting AM in the hippocampus. Following this theory, a computational model based on classical hierarchical predictive networks was proposed and was shown to perform well in various AM tasks. However, this fully hierarchical model did not incorporate recurrent connections, an architectural component of the CA3 region of the hippocampus that is crucial for AM. This omission makes the structure of the model inconsistent both with the known connectivity of CA3 and with classical recurrent models such as Hopfield networks, which learn the covariance of inputs through their recurrent connections to perform AM. Earlier PC models that learn the covariance information of inputs explicitly via recurrent connections appear to address these issues. Here, we show that although these models can perform AM, they do so in an implausible and numerically unstable way. Instead, we propose alternatives to these earlier covariance-learning predictive coding networks, which learn the covariance information implicitly and plausibly, and can use dendritic structures to encode prediction errors. We show analytically that our proposed models are exactly equivalent to the earlier predictive coding models that learn covariance explicitly, and encounter no numerical issues when performing AM tasks in practice. We further show that our models can be combined with hierarchical predictive coding networks to model hippocampo-neocortical interactions. Our models provide a biologically plausible approach to modelling the hippocampal network, pointing to a potential computational mechanism during hippocampal memory formation and recall that employs both predictive coding and covariance learning based on the recurrent network structure of the hippocampus.
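
The covariance-learning mechanism the abstract refers to can be illustrated with a classical Hopfield-style recurrent network, in which the recurrent weights store the covariance of the stored patterns and recall proceeds by relaxing the recurrent dynamics from a corrupted cue. The sketch below is purely illustrative and is not code from the paper; the paper's proposed models learn this covariance implicitly through predictive coding dynamics, and the pattern sizes, noise level, and variable names here are arbitrary assumptions.

```python
# Illustrative sketch (not from the paper): covariance-style Hebbian learning
# in a recurrent Hopfield-type network performing associative memory.
import numpy as np

rng = np.random.default_rng(0)

N, P = 64, 3                                     # neurons, stored patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))  # random +/-1 memories

# Recurrent weights store the covariance of the (approximately zero-mean)
# inputs via the Hebbian outer-product rule, with self-connections removed.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Recall: corrupt one stored pattern, then relax the recurrent dynamics.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 8, replace=False)
cue[flip] *= -1.0                                # flip ~12% of the entries

x = cue
for _ in range(20):                              # synchronous updates
    x = np.where(W @ x >= 0.0, 1.0, -1.0)

print("overlap with the stored pattern:", float(np.mean(x == patterns[0])))
```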

Bibliographic Details
Main Authors: Tang, M; Salvatori, T; Millidge, B; Song, Y; Lukasiewicz, T; Bogacz, R
Format: Journal article
Language: English
Published: Public Library of Science, 2023
Collection: OXFORD
Record ID: oxford-uuid:9bc6c745-4bb7-4e80-9f1a-985180110247
Institution: University of Oxford