The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs

Bibliographic Details
Main Authors: Natalie Layer, Anna Weglage, Verena Müller, Hartmut Meister, Ruth Lang-Roth, Martin Walger, Micah M. Murray, Pascale Sandmann
Format: Article
Language: English
Published: Elsevier, 2022-01-01
Series: NeuroImage: Clinical, Vol. 34, Article 102982 (ISSN 2213-1582)
Subjects: Cochlear implant; Event-related potential; Cortical plasticity; Multisensory integration; Audiovisual interaction; Audiovisual speech perception
Online Access: http://www.sciencedirect.com/science/article/pii/S221315822200047X
Description: A cochlear implant (CI) is an auditory prosthesis that can partially restore auditory function in patients with severe to profound hearing loss. However, this bionic device provides only limited auditory information, and CI patients may compensate for this limitation through a stronger interaction between the auditory and visual systems. To better understand the electrophysiological correlates of audiovisual speech perception, the present study used electroencephalography (EEG) and a redundant target paradigm. Postlingually deafened CI users and normal-hearing (NH) listeners were compared in auditory, visual and audiovisual speech conditions. The behavioural results revealed multisensory integration in both groups, as indicated by shortened response times for the audiovisual compared to the two unisensory conditions. The analysis of the N1 and P2 event-related potentials (ERPs), including topographic and source analyses, confirmed a multisensory effect in both groups and showed a cortical auditory response that was modulated by the simultaneous processing of the visual stimulus. Nevertheless, the CI users in particular revealed a distinct pattern of N1 topography, pointing to a strong visual impact on auditory speech processing. Beyond these condition effects, the results revealed ERP differences between CI users and NH listeners, not only in the N1/P2 topographies but also in the cortical source configuration. Compared to the NH listeners, the CI users showed additional activation in the visual cortex at N1 latency, which was positively correlated with CI experience, and a delayed auditory-cortex activation with a reversed, rightward functional lateralisation. In sum, our behavioural and ERP findings demonstrate a clear audiovisual benefit for both groups, and a CI-specific alteration in cortical activation at N1 latency when auditory and visual input is combined. These cortical alterations may reflect a compensatory strategy to overcome the limited CI input, which allows CI users to improve their lip-reading skills and to approximate the behavioural performance of NH listeners in audiovisual speech conditions. Our results are clinically relevant, as they highlight the importance of assessing CI outcomes not only in auditory-only but also in audiovisual speech conditions.
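The behavioural marker described in the abstract is the redundant-signals effect: responses to audiovisual targets are faster than to either unisensory target alone. A common way to ask whether this speed-up exceeds mere statistical facilitation is Miller's race-model inequality, which bounds the audiovisual response-time distribution by the sum of the two unisensory distributions. The sketch below, in plain NumPy with simulated response times, is an illustrative assumption of how such a test is typically computed; it is not the authors' analysis pipeline.

```python
import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative RT distribution evaluated on a common time grid."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

# Simulated response times (ms) for one participant -- illustrative only.
rng = np.random.default_rng(0)
rt_a  = rng.normal(520, 60, 200)   # auditory-only
rt_v  = rng.normal(560, 70, 200)   # visual-only
rt_av = rng.normal(470, 55, 200)   # audiovisual (redundant target)

# Redundant-signals effect: AV responses faster than the best unisensory condition.
rse = min(rt_a.mean(), rt_v.mean()) - rt_av.mean()
print(f"redundant-signals effect: {rse:.1f} ms")

# Miller's race-model inequality: F_AV(t) <= F_A(t) + F_V(t) for all t.
# Positive differences (violations) suggest integration beyond statistical facilitation.
t_grid = np.linspace(300, 800, 101)
violation = ecdf(rt_av, t_grid) - np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
print(f"maximal race-model violation: {violation.max():.3f}")
```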
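The ERP analysis centres on the auditory N1 (a fronto-central negativity around 100 ms) and the P2 (a positivity around 200 ms). A minimal sketch of how such peaks are typically quantified from epoched EEG is given below; the synthetic data, sampling parameters and time windows are assumptions for illustration and do not reproduce the authors' topographic or source analyses.

```python
import numpy as np

# Synthetic single-channel epochs: 100 trials x 700 samples at 1000 Hz,
# spanning -100 ms to 600 ms relative to speech onset (illustrative only).
rng = np.random.default_rng(1)
times = np.arange(-100, 600)                             # ms
n1 = -4.0 * np.exp(-0.5 * ((times - 110) / 25) ** 2)     # negative deflection near 110 ms
p2 =  3.0 * np.exp(-0.5 * ((times - 210) / 40) ** 2)     # positive deflection near 210 ms
epochs = (n1 + p2) + rng.normal(0, 5, (100, times.size)) # add trial-by-trial noise (µV)

# Average across trials and baseline-correct to the pre-stimulus interval.
erp = epochs.mean(axis=0)
erp -= erp[times < 0].mean()

def peak(erp, times, lo, hi, polarity):
    """Peak amplitude and latency within [lo, hi] ms; polarity -1 for N1, +1 for P2."""
    window = (times >= lo) & (times <= hi)
    idx = np.argmax(polarity * erp[window])
    return erp[window][idx], times[window][idx]

n1_amp, n1_lat = peak(erp, times, 80, 150, polarity=-1)
p2_amp, p2_lat = peak(erp, times, 150, 280, polarity=+1)
print(f"N1: {n1_amp:.2f} µV at {n1_lat} ms;  P2: {p2_amp:.2f} µV at {p2_lat} ms")
```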
Author Affiliations
Natalie Layer (corresponding author): University of Cologne, Faculty of Medicine and University Hospital Cologne, Department of Otorhinolaryngology, Head and Neck Surgery, Audiology and Pediatric Audiology, Cochlear Implant Center, Germany
Anna Weglage: University of Cologne, Faculty of Medicine and University Hospital Cologne, Department of Otorhinolaryngology, Head and Neck Surgery, Audiology and Pediatric Audiology, Cochlear Implant Center, Germany
Verena Müller: University of Cologne, Faculty of Medicine and University Hospital Cologne, Department of Otorhinolaryngology, Head and Neck Surgery, Audiology and Pediatric Audiology, Cochlear Implant Center, Germany
Hartmut Meister: Jean-Uhrmacher-Institute for Clinical ENT Research, University of Cologne, Germany
Ruth Lang-Roth: University of Cologne, Faculty of Medicine and University Hospital Cologne, Department of Otorhinolaryngology, Head and Neck Surgery, Audiology and Pediatric Audiology, Cochlear Implant Center, Germany
Martin Walger: University of Cologne, Faculty of Medicine and University Hospital Cologne, Department of Otorhinolaryngology, Head and Neck Surgery, Audiology and Pediatric Audiology, Cochlear Implant Center, Germany; Jean-Uhrmacher-Institute for Clinical ENT Research, University of Cologne, Germany
Micah M. Murray: The Sense Innovation and Research Center, Lausanne and Sion, Switzerland; The LINE (The Laboratory for Investigative Neurophysiology), Department of Radiology, Lausanne University Hospital and University of Lausanne, Lausanne, Switzerland; CIBM Center for Biomedical Imaging of Lausanne and Geneva, Lausanne, Switzerland; Department of Hearing and Speech Sciences, Vanderbilt University, Nashville, TN, USA
Pascale Sandmann: University of Cologne, Faculty of Medicine and University Hospital Cologne, Department of Otorhinolaryngology, Head and Neck Surgery, Audiology and Pediatric Audiology, Cochlear Implant Center, Germany