Synaptic and Nonsynaptic Plasticity Approximating Probabilistic Inference


Bibliographic Details
Main Authors: Philip Joseph Tully, Matthias H Hennig, Anders Lansner
Format: Article
Language: English
Published: Frontiers Media S.A. 2014-04-01
Series: Frontiers in Synaptic Neuroscience
Online Access: http://journal.frontiersin.org/Journal/10.3389/fnsyn.2014.00008/full
Description
Summary: The brain stores and retrieves information by initiating cascades of molecular changes that lead to a diverse repertoire of dynamical phenomena at higher levels of processing. Hebbian plasticity, neuromodulation, and homeostatic synaptic and intrinsic excitability all conspire to form and maintain memories. But it is still unclear how these seemingly redundant mechanisms could jointly orchestrate learning in a more unified system. To address this, we propose a Hebbian learning rule for spiking neurons inspired by Bayesian statistics. Synaptic weights and intrinsic currents are adapted on-line upon arrival of single spikes, which initiate a cascade of temporally interacting memory traces that locally estimate probabilities associated with relative neuronal activation levels. We show that the dynamics of these traces readily demonstrate a spike-timing dependence that stably returns to a set-point over long time scales, and that synaptic learning remains competitive despite this stability. Beyond unsupervised learning, we show how linking the traces with an externally driven signal could enable spike-based reinforcement learning. Neuronally, the traces are represented by an activity-dependent ion channel that is shown to regulate the input received by a postsynaptic cell and generate intrinsic graded persistent firing levels. We perform spike-based Bayesian learning in a simulated inference task using integrate and fire neurons that are Poisson-firing and fluctuation-driven, similar to the preferred regime of cortical neurons. Our results support the view that neurons can represent information in the form of probability distributions and that probabilistic inference can be a functional by-product of coupled synaptic and nonsynaptic mechanisms operating over several timescales. The model provides a biophysical realization of Bayesian computation by reconciling several observed neural phenomena whose functional effects are only partially understood in concert.
ISSN: 1663-3563
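The abstract describes memory traces that low-pass filter single spikes to estimate activation probabilities, from which weights and intrinsic biases are derived. The paper's exact equations are not reproduced in this record, so the following is only a minimal illustrative sketch under assumed names and time constants (`tau_z`, `tau_p`, `run_traces` are all hypothetical): fast traces track recent spiking, slow traces estimate marginal and joint probabilities, and the weight is the log-odds of co-activation while the bias reflects the unit's marginal activity.

```python
import numpy as np

def run_traces(pre_spikes, post_spikes, dt=1.0, tau_z=10.0, tau_p=1000.0, eps=0.01):
    """Illustrative trace cascade (not the paper's exact rule).

    pre_spikes, post_spikes: binary arrays (1 = spike in that time bin).
    Returns a log-odds synaptic weight and an intrinsic bias term.
    """
    z_i = z_j = eps            # fast traces of recent pre/post spiking
    p_i = p_j = p_ij = eps     # slow probability estimates (eps avoids log(0))
    for s_i, s_j in zip(pre_spikes, post_spikes):
        # fast traces: exponential filters of the spike trains
        z_i += dt * (s_i - z_i) / tau_z
        z_j += dt * (s_j - z_j) / tau_z
        # slow traces: running estimates of marginal and joint activation
        p_i += dt * (z_i - p_i) / tau_p
        p_j += dt * (z_j - p_j) / tau_p
        p_ij += dt * (z_i * z_j - p_ij) / tau_p
    w = np.log(p_ij / (p_i * p_j))   # weight: log-odds of co-activation
    bias = np.log(p_j)               # intrinsic excitability of the postsynaptic unit
    return w, bias
```

Correlated pre/post spike trains drive `p_ij` above the product `p_i * p_j`, yielding a positive weight, while independent trains leave the weight near zero; this is the sense in which the traces "locally estimate probabilities" and couple synaptic and nonsynaptic variables.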