The cortical representation of language timescales is shared between reading and listening
Abstract: Language comprehension involves integrating low-level sensory inputs into a hierarchy of increasingly high-level features. Prior work studied brain representations of different levels of the language hierarchy, but has not determined whether these brain representations are shared between written and spoken language. To address this issue, we analyze fMRI BOLD data that were recorded while participants read and listened to the same narratives in each modality. Levels of the language hierarchy are operationalized as timescales, where each timescale refers to a set of spectral components of a language stimulus. Voxelwise encoding models are used to determine where different timescales are represented across the cerebral cortex, for each modality separately. These models reveal that, between the two modalities, timescale representations are organized similarly across the cortical surface. Our results suggest that, after low-level sensory processing, language integration proceeds similarly regardless of stimulus modality.
Main Authors: | Catherine Chen, Tom Dupré la Tour, Jack L. Gallant, Daniel Klein, Fatma Deniz |
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2024-03-01 |
Series: | Communications Biology |
Online Access: | https://doi.org/10.1038/s42003-024-05909-z |
collection | DOAJ |
issn | 2399-3642 |
affiliations | Catherine Chen and Daniel Klein: Department of Electrical Engineering and Computer Sciences, University of California; Tom Dupré la Tour, Jack L. Gallant, and Fatma Deniz: Helen Wills Neuroscience Institute, University of California
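The abstract's method rests on two technical ideas: operationalizing each level of the language hierarchy as a timescale (a set of spectral components of the stimulus) and fitting voxelwise encoding models to localize those timescales on the cortex. The Python sketch below illustrates the general shape of such an analysis. It is a minimal illustration under stated assumptions, not the authors' published pipeline: the feature matrix, sampling rate, band edges, and variable names (`word_features`, `bold`, `fs`) are hypothetical, and cross-validated ridge regression from scikit-learn stands in for whatever estimator the paper actually used.

```python
# Minimal sketch of a timescale-based voxelwise encoding analysis.
# NOT the authors' pipeline: features, band edges, sampling rate, and
# variable names are hypothetical placeholders for illustration only.
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical shapes: stimulus features resampled to the fMRI acquisition
# rate (n_timepoints x n_features) and BOLD responses (n_timepoints x n_voxels).
n_timepoints, n_features, n_voxels = 2000, 32, 500
fs = 0.5  # assumed sampling rate in Hz (one sample per 2 s TR)
word_features = rng.standard_normal((n_timepoints, n_features))
bold = rng.standard_normal((n_timepoints, n_voxels))

# A "timescale" as a set of spectral components: band-pass the feature
# time series into slow/medium/fast bands (band edges in Hz, made up).
bands = {"slow": (0.005, 0.02), "medium": (0.02, 0.08), "fast": (0.08, 0.2)}

def bandpass(x, low, high, fs, order=4):
    """Zero-phase band-pass filter along the time axis."""
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=0)

# Fit one cross-validated ridge encoding model per timescale band and
# score its held-out predictions in every voxel.
train, test = train_test_split(np.arange(n_timepoints), test_size=0.2,
                               shuffle=False)
scores = {}
for name, (low, high) in bands.items():
    X = bandpass(word_features, low, high, fs)
    model = RidgeCV(alphas=np.logspace(-2, 4, 7))
    model.fit(X[train], bold[train])
    pred = model.predict(X[test])
    # Per-voxel Pearson correlation between predicted and measured BOLD.
    pz = (pred - pred.mean(0)) / pred.std(0)
    bz = (bold[test] - bold[test].mean(0)) / bold[test].std(0)
    scores[name] = (pz * bz).mean(0)

# Label each voxel with the band whose model predicts it best; mapping this
# label over the cortical surface (separately for reading and listening)
# yields the timescale organization the abstract compares across modalities.
stacked = np.stack([scores[name] for name in bands])
preferred = np.array(list(bands))[stacked.argmax(axis=0)]
```

In the study itself, models were fit separately while participants read and listened to the same narratives; comparing the resulting per-voxel timescale preferences across the two modalities is what supports the shared-organization claim.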