Repeated sequential learning increases memory capacity via effective decorrelation in a recurrent neural network
Memories in neural systems are shaped through the interplay of neural and learning dynamics under external inputs. This interplay can result in either overwriting or strengthening of memories as the system is repeatedly exposed to multiple input-output mappings, but it is unclear which effect dominates…
| Main Authors: | Tomoki Kurikawa, Omri Barak, Kunihiko Kaneko |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | American Physical Society, 2020-06-01 |
| Series: | Physical Review Research |
| Online Access: | http://doi.org/10.1103/PhysRevResearch.2.023307 |
Similar Items
- Solvable neural network model for input-output associations: Optimal recall at the onset of chaos
  by: Tomoki Kurikawa, et al.
  Published: (2023-12-01)
- Short-term memory by transient oscillatory dynamics in recurrent neural networks
  by: Kohei Ichikawa, et al.
  Published: (2021-08-01)
- Trained recurrent neural networks develop phase-locked limit cycles in a working memory task.
  by: Matthijs Pals, et al.
  Published: (2024-02-01)
- VHDL design and CPLD implementation of decorrelator /
  by: Tang, Siang Lin
  Published: (1999)
- A Deep Neural Network Regularization Measure: The Class-Based Decorrelation Method
  by: Chenguang Zhang, et al.
  Published: (2023-12-01)