Trained recurrent neural networks develop phase-locked limit cycles in a working memory task.

Neural oscillations are observed ubiquitously across brain areas. One proposed functional role of these oscillations is that they serve as an internal clock, or 'frame of reference': information can be encoded by the timing of neural activity relative to the phase of such oscillations. In line with this hypothesis, there have been multiple empirical observations of such phase codes in the brain. Here we ask: what kind of neural dynamics support phase coding of information with neural oscillations? We tackled this question by analyzing recurrent neural networks (RNNs) trained on a working memory task. The networks were given access to an external reference oscillation and tasked with producing an oscillation such that the phase difference between the reference and output oscillations maintains the identity of transient stimuli. We found that the networks converged to stable oscillatory dynamics. Reverse engineering these networks revealed that each phase-coded memory corresponds to a separate limit cycle attractor. We characterized how the stability of the attractor dynamics depends on both the amplitude and the frequency of the reference oscillation, properties that can be observed experimentally. To understand the connectivity structures underlying these dynamics, we showed that the trained networks can be described as two phase-coupled oscillators. Using this insight, we condensed our trained networks into a reduced model consisting of two functional modules: one that generates an oscillation and one that implements a coupling function between the internal oscillation and the external reference.

In summary, by reverse engineering the dynamics and connectivity of trained RNNs, we propose a mechanism by which neural networks can harness reference oscillations for working memory. Specifically, we propose that a phase-coding network generates autonomous oscillations, which it couples to an external reference oscillation in a multi-stable fashion.

Bibliographic details

Main authors: Matthijs Pals, Jakob H Macke, Omri Barak
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2024-02-01
Series: PLoS Computational Biology, vol. 20, no. 2, e1011852
ISSN: 1553-734X, 1553-7358
DOI: 10.1371/journal.pcbi.1011852
Online access: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1011852&type=printable