Memory capacity of networks with stochastic binary synapses.

In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix so that they become fixed point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns with coding level f, in the large N and sparse coding limits (N → ∞, f → 0). We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), and (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
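
As a concrete illustration of the setup described in the abstract, the sketch below stores sparse binary patterns in a binary synaptic matrix with the classic Willshaw rule and retrieves one of them by thresholding. It is a minimal toy example, not code from the paper: the values of N, P, f, the retrieval threshold, and the rough bits-per-synapse estimate at the end are illustrative assumptions.

```python
# Minimal sketch of Willshaw-style storage in a network with binary synapses.
# N, P, f and the retrieval threshold are illustrative assumptions, not the
# paper's optimized parameters.
import numpy as np

rng = np.random.default_rng(0)
N, P, f = 1000, 200, 0.02            # neurons, stored patterns, coding level

# Sparse binary patterns: each neuron is active with probability f.
patterns = (rng.random((P, N)) < f).astype(np.uint8)

# Willshaw learning rule: a binary synapse J_ij is set to 1 if neurons i and j
# are co-active in at least one stored pattern; otherwise it stays at 0.
J = np.zeros((N, N), dtype=np.uint8)
for xi in patterns:
    J |= np.outer(xi, xi)
np.fill_diagonal(J, 0)               # no self-connections

# Retrieval from a full cue: a unit fires if it receives input from all other
# active cue units (m - 1, with m active units in the cue), so every unit of
# the stored pattern is recovered and errors can only be spurious activations.
cue = patterns[0]
m = int(cue.sum())
h = J.astype(np.int32) @ cue.astype(np.int32)
recalled = (h >= m - 1).astype(np.uint8)

print("cue units recovered:", bool((recalled[cue == 1] == 1).all()))
print("spurious activations:", int(((recalled == 1) & (cue == 0)).sum()))

# Rough information estimate (a standard measure, used here only to set the
# scale, not the paper's exact calculation): P patterns of N units at coding
# level f carry about P * N * H2(f) bits over N * (N - 1) binary synapses.
H2 = -(f * np.log2(f) + (1 - f) * np.log2(1 - f))
print("naive bits per synapse:", P * N * H2 / (N * (N - 1)))
```

At this low loading, retrieval is essentially error-free; increasing P until spurious activations appear gives a hands-on feel for the capacity limits the paper quantifies.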

Bibliographic Details
Main Authors: Alexis M Dubreuil, Yali Amit, Nicolas Brunel
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2014-08-01
Series: PLoS Computational Biology
Online Access: http://europepmc.org/articles/PMC4125071?pdf=render
ISSN: 1553-734X, 1553-7358
Volume/Issue: 10 (8), article e1003727
DOI: 10.1371/journal.pcbi.1003727