Weight statistics controls dynamics in recurrent neural networks.

Bibliographic Details
Main Authors: Patrick Krauss, Marc Schuster, Verena Dietrich, Achim Schilling, Holger Schulze, Claus Metzner
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2019-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0214541
collection DOAJ
description Recurrent neural networks are complex non-linear systems, capable of ongoing activity in the absence of driving inputs. The dynamical properties of these systems, in particular their long-time attractor states, are determined on the microscopic level by the connection strengths wij between the individual neurons. However, little is known about the extent to which network dynamics can be tuned on a more coarse-grained level by the statistical features of the weight matrix. In this work, we investigate the dynamics of recurrent networks of Boltzmann neurons. In particular, we study the impact of three statistical parameters: density (the fraction of non-zero connections), balance (the ratio of excitatory to inhibitory connections), and symmetry (the fraction of neuron pairs with wij = wji). By computing a 'phase diagram' of network dynamics, we find that balance is the essential control parameter: its gradual increase from negative to positive values drives the system from oscillatory behavior into a chaotic regime, and eventually into stationary fixed points. Only directly at the border of the chaotic regime do the neural networks display rich but regular dynamics, thus enabling actual information processing. These results suggest that the brain, too, is fine-tuned to the 'edge of chaos' by ensuring a proper balance between excitatory and inhibitory neural connections.
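The setup described in the abstract can be sketched in code. The following is an illustrative reconstruction only: the record contains just the abstract, so the weight distribution, the exact parameterization of balance and symmetry, and all function names here are assumptions, not the paper's actual procedure.

```python
import numpy as np


def random_weight_matrix(n, density, balance, symmetry, rng):
    """Sample an n-by-n weight matrix controlled by three statistics.

    Hypothetical construction (the paper's exact sampling is not given here):
      density  -- fraction of neuron pairs that are connected at all
      balance  -- in [-1, 1]; assumed here to set P(excitatory sign)
                  to (1 + balance) / 2, so it runs from all-inhibitory
                  to all-excitatory
      symmetry -- fraction of connected pairs with w_ij == w_ji
    """
    def draw():
        sign = 1.0 if rng.random() < (1.0 + balance) / 2.0 else -1.0
        return sign * rng.random()

    w = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < density:
                w[i, j] = draw()
                # Mirror the weight with probability `symmetry`,
                # otherwise draw the reverse connection independently.
                w[j, i] = w[i, j] if rng.random() < symmetry else draw()
    return w


def boltzmann_sweep(state, w, rng, temperature=1.0):
    """One asynchronous sweep of binary (0/1) Boltzmann neurons: each
    unit switches on with the logistic probability of its net input."""
    for i in rng.permutation(len(state)):
        h = w[i] @ state
        p_on = 1.0 / (1.0 + np.exp(-h / temperature))
        state[i] = 1.0 if rng.random() < p_on else 0.0
    return state


rng = np.random.default_rng(0)
n = 50
w = random_weight_matrix(n, density=0.5, balance=0.0, symmetry=0.5, rng=rng)
state = rng.integers(0, 2, size=n).astype(float)
for _ in range(200):
    state = boltzmann_sweep(state, w, rng)
```

Sweeping `balance` from -1 toward +1 while recording trajectories of `state` would, in the paper's terms, trace the phase diagram from oscillatory through chaotic to fixed-point behavior.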
issn 1932-6203