Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.

Bibliographic Details
Main Authors: H Francis Song, Guangyu R Yang, Xiao-Jing Wang
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2016-02-01
Series: PLoS Computational Biology
Online Access: http://europepmc.org/articles/PMC4771709?pdf=render
Description: The ability to simultaneously record from large numbers of neurons in behaving animals has ushered in a new era for the study of the neural circuit mechanisms underlying cognitive functions. One promising approach to uncovering the dynamical and computational principles governing population responses is to analyze model recurrent neural networks (RNNs) that have been optimized to perform the same tasks as behaving animals. Because the optimization of network parameters specifies the desired output but not the manner in which to achieve this output, "trained" networks serve as a source of mechanistic hypotheses and a testing ground for data analyses that link neural computation to behavior. Complete access to the activity and connectivity of the circuit, and the ability to manipulate them arbitrarily, make trained networks a convenient proxy for biological circuits and a valuable platform for theoretical investigation. However, existing RNNs lack basic biological features such as the distinction between excitatory and inhibitory units (Dale's principle), which are essential if RNNs are to provide insights into the operation of biological circuits. Moreover, trained networks can achieve the same behavioral performance but differ substantially in their structure and dynamics, highlighting the need for a simple and flexible framework for the exploratory training of RNNs. Here, we describe a framework for gradient descent-based training of excitatory-inhibitory RNNs that can incorporate a variety of biological knowledge. We provide an implementation based on the machine learning library Theano, whose automatic differentiation capabilities facilitate modifications and extensions. We validate this framework by applying it to well-known experimental paradigms such as perceptual decision-making, context-dependent integration, multisensory integration, parametric working memory, and motor sequence generation. Our results demonstrate the wide range of neural activity patterns and behavior that can be modeled, and suggest a unified setting in which diverse cognitive computations and mechanisms can be studied.
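The description mentions excitatory-inhibitory RNNs that respect Dale's principle, i.e. each unit's outgoing connections share a single sign. A common way to enforce this during gradient-based training is to rectify a raw weight matrix and multiply by fixed per-unit signs. The following is a minimal numpy sketch of that idea, not the paper's actual Theano implementation; the function names (`dale_weights`, `rnn_step`), the ReLU rate function, and the excitatory/inhibitory split are illustrative assumptions.

```python
import numpy as np

def dale_weights(w_raw, n_exc, n_inh):
    """Effective recurrent weights obeying Dale's principle.

    Columns index presynaptic units: the first n_exc columns are
    excitatory (nonnegative outgoing weights), the last n_inh are
    inhibitory (nonpositive). Gradients flow through w_raw.
    """
    w_plus = np.maximum(w_raw, 0.0)                      # nonnegative magnitudes
    signs = np.concatenate([np.ones(n_exc), -np.ones(n_inh)])
    return w_plus * signs[np.newaxis, :]                 # flip inhibitory columns

def rnn_step(x, u, w_rec, w_in, b, alpha=0.1):
    """One Euler step of a leaky rate RNN: x integrates recurrent
    input w_rec @ r, external input w_in @ u, and a bias b."""
    r = np.maximum(x, 0.0)                               # ReLU firing rates
    return (1 - alpha) * x + alpha * (w_rec @ r + w_in @ u + b)
```

Because the sign constraint is applied as a differentiable reparameterization rather than a hard projection, any automatic-differentiation library (Theano in the paper) can train `w_raw` by ordinary gradient descent while the effective weights always satisfy Dale's principle.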
ISSN: 1553-734X, 1553-7358
DOI: 10.1371/journal.pcbi.1004792 (PLoS Computational Biology 12(2): e1004792)