Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks

Bibliographic Details
Main Authors: Rodrigo F. O. Pena, Sebastian Vellmer, Davide Bernardi, Antonio C. Roque, Benjamin Lindner
Format: Article
Language: English
Published: Frontiers Media S.A. 2018-03-01
Series: Frontiers in Computational Neuroscience
Subjects: complex networks; stochastic models; neural noise; recurrent neural networks; neural dynamics; spike-train statistics
Online Access: http://journal.frontiersin.org/article/10.3389/fncom.2018.00009/full
_version_ 1819264809378512896
author Rodrigo F. O. Pena
Sebastian Vellmer
Davide Bernardi
Antonio C. Roque
Benjamin Lindner
author_facet Rodrigo F. O. Pena
Sebastian Vellmer
Davide Bernardi
Antonio C. Roque
Benjamin Lindner
author_sort Rodrigo F. O. Pena
collection DOAJ
description Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics that resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations), and it is generally not well understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; and (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus differ among neurons. In all heterogeneous cases, neurons are lumped into classes, each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. These approximations seem to be justified over a broad range of parameters, as indicated by comparisons with simulations of large recurrent networks. Our method can help to elucidate how network heterogeneity shapes the asynchronous state in recurrent neural networks.
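To make the self-consistency condition described above concrete, the following is a minimal Python sketch (not the authors' code) of the basic iterative loop for a single leaky integrate-and-fire neuron: surrogate Gaussian noise is generated whose power spectrum equals the current estimate of the spike-train spectrum scaled by an effective coupling, the neuron's output spike-train spectrum is measured, and input and output estimates are mixed until they agree. All parameter values (tau_m, mu, K_eff, trial length) are illustrative assumptions, and the 50/50 mixing step is a simplified stand-in for the averaging procedure mentioned in the abstract.

import numpy as np

rng = np.random.default_rng(0)

dt, T = 1e-3, 50.0                      # time step and trial length (seconds)
n = int(T / dt)
freqs = np.fft.rfftfreq(n, dt)

tau_m, v_th, v_reset = 0.02, 1.0, 0.0   # LIF time constant, threshold, reset (illustrative)
mu = 1.2                                # suprathreshold mean drive (assumption)
K_eff = 1e-4                            # effective coupling C*J^2 lumped into one number (assumption)

def colored_gaussian_noise(S):
    """Zero-mean Gaussian time series whose power spectrum approximates S(f)."""
    coeff = np.sqrt(S * n / (2.0 * dt)) * (rng.standard_normal(len(S))
                                           + 1j * rng.standard_normal(len(S)))
    coeff[0] = 0.0                      # drop the DC bin: the mean input is handled by mu
    return np.fft.irfft(coeff, n)

def output_spectrum(S_in, trials=4):
    """Drive one LIF neuron with noise of spectrum K_eff*S_in; return its spike-train spectrum."""
    S_out = np.zeros_like(S_in)
    for _ in range(trials):
        xi = colored_gaussian_noise(K_eff * S_in)
        v, spikes = 0.0, np.zeros(n)
        for i in range(n):              # Euler integration of the LIF dynamics
            v += dt * (-v + mu + xi[i]) / tau_m
            if v >= v_th:
                v, spikes[i] = v_reset, 1.0
        S_out += np.abs(np.fft.rfft(spikes))**2 / T   # periodogram of the spike train
    return S_out / trials

S = 30.0 * np.ones_like(freqs)          # flat initial guess (Poisson-like at a guessed rate)
for it in range(8):
    S = 0.5 * (S + output_spectrum(S))  # simple mixing instead of the paper's averaging procedure
    # the high-frequency limit of a spike-train spectrum approximates the firing rate (Hz)
    print(f"iteration {it}: spectrum at high f ~ {S[freqs.size // 2:].mean():.1f}")

In the heterogeneous cases described in the abstract, one representative neuron per class would be iterated in the same way, with each class's input spectrum built from a weighted combination of the output spectra of all classes, under the same Gaussian approximation of the input current.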
first_indexed 2024-12-23T20:35:23Z
format Article
id doaj.art-785a420ff35c476a868c283dbc46f8d5
institution Directory Open Access Journal
issn 1662-5188
language English
last_indexed 2024-12-23T20:35:23Z
publishDate 2018-03-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Computational Neuroscience
spelling doaj.art-785a420ff35c476a868c283dbc46f8d5 2022-12-21T17:32:06Z eng Frontiers Media S.A. Frontiers in Computational Neuroscience 1662-5188 2018-03-01 12 10.3389/fncom.2018.00009 315556
Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks
Rodrigo F. O. Pena: Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
Sebastian Vellmer: Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
Davide Bernardi: Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
Antonio C. Roque: Laboratório de Sistemas Neurais, Department of Physics, School of Philosophy, Sciences and Letters of Ribeirão Preto, University of São Paulo, São Paulo, Brazil
Benjamin Lindner: Theory of Complex Systems and Neurophysics, Bernstein Center for Computational Neuroscience, Berlin, Germany; Department of Physics, Humboldt Universität zu Berlin, Berlin, Germany
http://journal.frontiersin.org/article/10.3389/fncom.2018.00009/full
complex networks; stochastic models; neural noise; recurrent neural networks; neural dynamics; spike-train statistics
spellingShingle Rodrigo F. O. Pena
Sebastian Vellmer
Davide Bernardi
Antonio C. Roque
Benjamin Lindner
Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks
Frontiers in Computational Neuroscience
complex networks
stochastic models
neural noise
recurrent neural networks
neural dynamics
spike-train statistics
title Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks
title_full Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks
title_fullStr Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks
title_full_unstemmed Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks
title_short Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks
title_sort self consistent scheme for spike train power spectra in heterogeneous sparse networks
topic complex networks
stochastic models
neural noise
recurrent neural networks
neural dynamics
spike-train statistics
url http://journal.frontiersin.org/article/10.3389/fncom.2018.00009/full
work_keys_str_mv AT rodrigofopena selfconsistentschemeforspiketrainpowerspectrainheterogeneoussparsenetworks
AT sebastianvellmer selfconsistentschemeforspiketrainpowerspectrainheterogeneoussparsenetworks
AT davidebernardi selfconsistentschemeforspiketrainpowerspectrainheterogeneoussparsenetworks
AT antoniocroque selfconsistentschemeforspiketrainpowerspectrainheterogeneoussparsenetworks
AT benjaminlindner selfconsistentschemeforspiketrainpowerspectrainheterogeneoussparsenetworks