Asymptotic Description of Neural Networks with Correlated Synaptic Weights
We study the asymptotic law of a network of interacting neurons when the number of neurons becomes infinite. Given a completely connected network of neurons in which the synaptic weights are Gaussian correlated random variables, we describe the asymptotic law of the network when the number of neurons goes to infinity.
Main Authors: | Olivier Faugeras, James MacLaurin |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2015-07-01 |
Series: | Entropy |
Subjects: | large deviations; good rate function; stationary gaussian processes; stationary measures; spectral representations; neural networks; firing rate neurons; correlated synaptic weights |
Online Access: | http://www.mdpi.com/1099-4300/17/7/4701 |
_version_ | 1798005444933844992 |
---|---|
author | Olivier Faugeras; James MacLaurin |
author_facet | Olivier Faugeras; James MacLaurin |
author_sort | Olivier Faugeras |
collection | DOAJ |
description | We study the asymptotic law of a completely connected network of interacting neurons in which the synaptic weights are correlated Gaussian random variables, in the limit where the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons, and the law of those trajectories averaged with respect to the synaptic weights. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories. |
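The large deviation principle asserted in the abstract can be stated schematically as follows. This is the generic Dembo-Zeitouni formulation, not a transcription from the paper itself; the symbols \(\hat{\mu}_N\) (empirical measure of the \(N\)-neuron network) and \(H\) (rate function) are illustrative placeholders:

```latex
% Large deviation principle for the empirical measure \hat{\mu}_N,
% with speed N and rate function H:
\[
\limsup_{N\to\infty} \frac{1}{N}\log P\bigl(\hat{\mu}_N \in F\bigr)
  \;\le\; -\inf_{\mu \in F} H(\mu) \qquad \text{for every closed set } F,
\]
\[
\liminf_{N\to\infty} \frac{1}{N}\log P\bigl(\hat{\mu}_N \in O\bigr)
  \;\ge\; -\inf_{\mu \in O} H(\mu) \qquad \text{for every open set } O.
\]
% H is a "good" rate function when its level sets
% \{\mu : H(\mu) \le c\} are compact for every c \ge 0.
```

When such a good rate function has a unique global minimizer \(\mu^*\) with \(H(\mu^*) = 0\), the bounds force \(\hat{\mu}_N \to \mu^*\); this is the sense in which the article's unique-minimum result characterizes the limit measure of the network.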
first_indexed | 2024-04-11T12:39:20Z |
format | Article |
id | doaj.art-e7d1cec045784b8ab9a382928e497c20 |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
last_indexed | 2024-04-11T12:39:20Z |
publishDate | 2015-07-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
spelling | doaj.art-e7d1cec045784b8ab9a382928e497c20 (2022-12-22T04:23:32Z); eng; MDPI AG; Entropy; ISSN 1099-4300; 2015-07-01; vol. 17, no. 7, pp. 4701-4743; doi:10.3390/e17074701; e17074701; Asymptotic Description of Neural Networks with Correlated Synaptic Weights; Olivier Faugeras (INRIA Sophia Antipolis Mediterannee, 2004 Route Des Lucioles, Sophia Antipolis, 06410, France); James MacLaurin (INRIA Sophia Antipolis Mediterannee, 2004 Route Des Lucioles, Sophia Antipolis, 06410, France); http://www.mdpi.com/1099-4300/17/7/4701; large deviations; good rate function; stationary gaussian processes; stationary measures; spectral representations; neural networks; firing rate neurons; correlated synaptic weights |
spellingShingle | Olivier Faugeras; James MacLaurin; Asymptotic Description of Neural Networks with Correlated Synaptic Weights; Entropy; large deviations; good rate function; stationary gaussian processes; stationary measures; spectral representations; neural networks; firing rate neurons; correlated synaptic weights |
title | Asymptotic Description of Neural Networks with Correlated Synaptic Weights |
title_full | Asymptotic Description of Neural Networks with Correlated Synaptic Weights |
title_fullStr | Asymptotic Description of Neural Networks with Correlated Synaptic Weights |
title_full_unstemmed | Asymptotic Description of Neural Networks with Correlated Synaptic Weights |
title_short | Asymptotic Description of Neural Networks with Correlated Synaptic Weights |
title_sort | asymptotic description of neural networks with correlated synaptic weights |
topic | large deviations; good rate function; stationary gaussian processes; stationary measures; spectral representations; neural networks; firing rate neurons; correlated synaptic weights |
url | http://www.mdpi.com/1099-4300/17/7/4701 |
work_keys_str_mv | AT olivierfaugeras asymptoticdescriptionofneuralnetworkswithcorrelatedsynapticweights AT jamesmaclaurin asymptoticdescriptionofneuralnetworkswithcorrelatedsynapticweights |