An approximation of the error back-propagation algorithm in a predictive coding network with local Hebbian synaptic plasticity


Bibliographic Details
Main Authors: Whittington, J, Bogacz, R
Format: Journal article
Published: Massachusetts Institute of Technology Press 2017
Collection: OXFORD
Description: To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error back-propagation algorithm. However, in the back-propagation algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected with the synapse being modified, whereas the changes in biological synapses are determined only by the activity of pre-synaptic and post-synaptic neurons. Several models have been proposed that approximate the back-propagation algorithm with local synaptic plasticity, but these models require complex external control over the network or relatively complex plasticity rules. Here we show that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity. Furthermore, for certain parameters, the weight change in the predictive coding model converges to that of the back-propagation algorithm. This suggests that it is possible for cortical networks with simple Hebbian synaptic plasticity to implement efficient learning algorithms in which synapses in areas on multiple levels of hierarchy are modified to minimize the error on the output.
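The idea summarized in the abstract can be sketched in a few lines of NumPy. The sketch below is an illustration of the general mechanism, not the authors' implementation: layer sizes, linear activations, and all parameter values are assumptions chosen for clarity. The output layer is clamped to the target, hidden activity relaxes against local prediction errors, and each weight update is then purely Hebbian, an outer product of a post-synaptic error node and a pre-synaptic activity. When the output error and the weights are small, these local updates closely track the exact back-propagation gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear network: input x0 -> hidden x1 -> output x2.
# Sizes and the linear activation are illustrative assumptions.
W1 = rng.normal(scale=0.1, size=(4, 3))   # hidden <- input weights
W2 = rng.normal(scale=0.1, size=(2, 4))   # output <- hidden weights

def pc_weight_updates(x0, target, n_iters=200, lr_x=0.1):
    """Relax the network, then return purely local Hebbian weight updates."""
    x1 = W1 @ x0              # start at the feed-forward prediction
    x2 = target               # output layer clamped to the supervision signal
    for _ in range(n_iters):
        e1 = x1 - W1 @ x0     # prediction error at the hidden layer
        e2 = x2 - W2 @ x1     # prediction error at the output layer
        # Hidden activity descends the summed squared prediction error,
        # using only signals available at that layer and its neighbours.
        x1 = x1 + lr_x * (W2.T @ e2 - e1)
    e1 = x1 - W1 @ x0
    e2 = x2 - W2 @ x1
    # Each update: pre-synaptic activity times the post-synaptic error node.
    return np.outer(e1, x0), np.outer(e2, x1)

def backprop_updates(x0, target):
    """Exact (negative) gradients of the squared output error, for comparison."""
    h = W1 @ x0
    e_out = target - W2 @ h
    return np.outer(W2.T @ e_out, x0), np.outer(e_out, h)

x0 = rng.normal(size=3)
# Target near the current prediction, so the output error is small --
# the regime in which the two prescriptions nearly coincide.
t = W2 @ (W1 @ x0) + 0.05 * rng.normal(size=2)

dW1_pc, dW2_pc = pc_weight_updates(x0, t)
dW1_bp, dW2_bp = backprop_updates(x0, t)
```

Note the locality: `pc_weight_updates` never propagates a derivative through the whole network; the feedback enters only through the relaxation of `x1`, after which each synapse sees just its own pre- and post-synaptic quantities.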
Record ID: oxford-uuid:e555da94-22c2-49e5-abdd-fb60f871c03c
Institution: University of Oxford