High-performing neural network models of visual cortex benefit from high latent dimensionality.

Geometric descriptions of deep neural networks (DNNs) have the potential to uncover core representational principles of computational models in neuroscience. Here we examined the geometry of DNN models of visual cortex by quantifying the latent dimensionality of their natural image representations....


Bibliographic Details
Main Authors: Eric Elmoznino, Michael F Bonner
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2024-01-01
Series: PLoS Computational Biology
Online Access: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1011792&type=printable
collection DOAJ
description Geometric descriptions of deep neural networks (DNNs) have the potential to uncover core representational principles of computational models in neuroscience. Here we examined the geometry of DNN models of visual cortex by quantifying the latent dimensionality of their natural image representations. A popular view holds that optimal DNNs compress their representations onto low-dimensional subspaces to achieve invariance and robustness, which suggests that better models of visual cortex should have lower dimensional geometries. Surprisingly, we found a strong trend in the opposite direction: neural networks with high-dimensional image subspaces tended to have better generalization performance when predicting cortical responses to held-out stimuli in both monkey electrophysiology and human fMRI data. Moreover, we found that high dimensionality was associated with better performance when learning new categories of stimuli, suggesting that higher dimensional representations are better suited to generalize beyond their training domains. These findings suggest a general principle whereby high-dimensional geometry confers computational benefits to DNN models of visual cortex.
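The abstract above turns on quantifying the latent dimensionality of a network's image representations. As a minimal sketch (the record does not state the paper's exact estimator), one common measure of effective dimensionality is the participation ratio of the covariance eigenspectrum, computed here for a hypothetical activation matrix:

```python
import numpy as np

def participation_ratio(activations):
    """Effective (latent) dimensionality of a representation.

    activations: (n_stimuli, n_units) array of network responses.
    Participation ratio of the covariance eigenvalues lam_i:
        PR = (sum_i lam_i)^2 / sum_i lam_i^2
    PR ranges from 1 (all variance on one axis) up to the number
    of units (variance spread evenly across all axes).
    """
    centered = activations - activations.mean(axis=0, keepdims=True)
    # Covariance eigenvalues = squared singular values / (n - 1)
    sv = np.linalg.svd(centered, compute_uv=False)
    lam = sv ** 2 / (centered.shape[0] - 1)
    return lam.sum() ** 2 / (lam ** 2).sum()

# Isotropic 10-D Gaussian responses: PR comes out close to 10
rng = np.random.default_rng(0)
print(participation_ratio(rng.normal(size=(1000, 10))))
```

In practice one would pass in layer activations for a stimulus set; a higher participation ratio indicates that image-evoked variance is spread across more latent axes, the property the study associates with better cortical predictivity.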
id doaj.art-d9466ea313b344f58ee5524b74839c77
institution Directory Open Access Journal
issn 1553-734X; 1553-7358
doi 10.1371/journal.pcbi.1011792
citation PLoS Computational Biology 20(1): e1011792 (2024-01-01)