Cortical Tracking of Continuous Speech Under Bimodal Divided Attention

Bibliographic Details
Main Authors: Zilong Xie, Christian Brodbeck, Bharath Chandrasekaran
Format: Article
Language: English
Published: The MIT Press 2023-01-01
Series: Neurobiology of Language
Online Access: https://direct.mit.edu/nol/article/4/2/318/114548/Cortical-Tracking-of-Continuous-Speech-Under
author Zilong Xie
Christian Brodbeck
Bharath Chandrasekaran
collection DOAJ
description Speech processing often occurs amid competing inputs from other modalities, for example, listening to the radio while driving. We examined the extent to which dividing attention between auditory and visual modalities (bimodal divided attention) impacts neural processing of natural continuous speech from acoustic to linguistic levels of representation. We recorded electroencephalographic (EEG) responses when human participants performed a challenging primary visual task, imposing low or high cognitive load while listening to audiobook stories as a secondary task. The two dual-task conditions were contrasted with an auditory single-task condition in which participants attended to stories while ignoring visual stimuli. Behaviorally, the high load dual-task condition was associated with lower speech comprehension accuracy relative to the other two conditions. We fitted multivariate temporal response function encoding models to predict EEG responses from acoustic and linguistic speech features at different representation levels, including auditory spectrograms and information-theoretic models of sublexical-, word-form-, and sentence-level representations. Neural tracking of most acoustic and linguistic features remained unchanged with increasing dual-task load, despite unambiguous behavioral and neural evidence of the high load dual-task condition being more demanding. Compared to the auditory single-task condition, dual-task conditions selectively reduced neural tracking of only some acoustic and linguistic features, mainly at latencies >200 ms, while earlier latencies were surprisingly unaffected. These findings indicate that behavioral effects of bimodal divided attention on continuous speech processing occur not because of impaired early sensory representations but likely at later cognitive processing stages. Crossmodal attention-related mechanisms may not be uniform across different speech processing levels.
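
Note: the encoding approach named in the abstract, multivariate temporal response function (mTRF) models that predict EEG from time-lagged speech features, can be illustrated with the minimal sketch below. The ridge-regression estimator, the 0-400 ms lag range, the regularization value, and the simulated data shapes are assumptions made for illustration only; this is not the authors' analysis pipeline.

```python
# Minimal mTRF encoding-model sketch: predict each EEG channel from
# time-lagged stimulus features via ridge regression (illustrative only).
import numpy as np

def lagged_design_matrix(stimulus, min_lag, max_lag):
    """Stack time-shifted copies of each stimulus feature (stimulus: time x features)."""
    n_times, n_features = stimulus.shape
    lags = range(min_lag, max_lag + 1)
    X = np.zeros((n_times, n_features * len(lags)))
    for i, lag in enumerate(lags):
        shifted = np.roll(stimulus, lag, axis=0)
        if lag > 0:
            shifted[:lag] = 0      # zero-pad samples that wrapped around
        elif lag < 0:
            shifted[lag:] = 0
        X[:, i * n_features:(i + 1) * n_features] = shifted
    return X

def fit_mtrf(stimulus, eeg, min_lag, max_lag, alpha=1.0):
    """Ridge estimate of TRF weights, shape (features * lags) x channels."""
    X = lagged_design_matrix(stimulus, min_lag, max_lag)
    XtX = X.T @ X + alpha * np.eye(X.shape[1])
    weights = np.linalg.solve(XtX, X.T @ eeg)
    return weights, X

# Simulated example: 60 s at 100 Hz, 3 speech features, 32 EEG channels,
# lags 0-40 samples (0-400 ms). Real analyses would use measured data.
rng = np.random.default_rng(0)
stim = rng.standard_normal((6000, 3))
eeg = rng.standard_normal((6000, 32))
weights, X = fit_mtrf(stim, eeg, min_lag=0, max_lag=40, alpha=10.0)
predicted = X @ weights

# "Neural tracking" is then typically quantified as the per-channel correlation
# between predicted and recorded EEG (on held-out data in practice).
r = [np.corrcoef(predicted[:, ch], eeg[:, ch])[0, 1] for ch in range(eeg.shape[1])]
```

The zero-padded lagged design matrix and ridge penalty are one common way to keep the lag-wise regression stable; boosting or cross-validated regularization are frequently used alternatives in the EEG literature.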
first_indexed 2024-04-09T15:26:04Z
format Article
id doaj.art-99b414402c984788a52c9aa358168d55
institution Directory Open Access Journal
issn 2641-4368
language English
last_indexed 2024-04-09T15:26:04Z
publishDate 2023-01-01
publisher The MIT Press
record_format Article
series Neurobiology of Language
spelling doaj.art-99b414402c984788a52c9aa358168d55 (2023-04-28T18:13:27Z): Neurobiology of Language, The MIT Press, Vol. 4, No. 2 (2023-01-01), pp. 318-343. ISSN 2641-4368. DOI: 10.1162/nol_a_00100. Cortical Tracking of Continuous Speech Under Bimodal Divided Attention.
Zilong Xie (http://orcid.org/0000-0002-6851-7554): School of Communication Science and Disorders, Florida State University, Tallahassee, FL, USA
Christian Brodbeck (http://orcid.org/0000-0001-8380-639X): Department of Psychological Sciences, University of Connecticut, Storrs, CT, USA
Bharath Chandrasekaran (http://orcid.org/0000-0002-3673-9435): Department of Communication Science and Disorders, University of Pittsburgh, Pittsburgh, PA, USA
https://direct.mit.edu/nol/article/4/2/318/114548/Cortical-Tracking-of-Continuous-Speech-Under
title Cortical Tracking of Continuous Speech Under Bimodal Divided Attention
url https://direct.mit.edu/nol/article/4/2/318/114548/Cortical-Tracking-of-Continuous-Speech-Under