Musical Expertise Affects Audiovisual Speech Perception: Findings From Event-Related Potentials and Inter-trial Phase Coherence

Bibliographic Details
Main Authors: Marzieh Sorati, Dawn Marie Behne
Format: Article
Language:English
Published: Frontiers Media S.A. 2019-11-01
Series:Frontiers in Psychology
Subjects: speech perception; prediction; audiovisual; musical training; event-related potential (ERP); inter-trial phase coherence (ITPC)
Online Access:https://www.frontiersin.org/article/10.3389/fpsyg.2019.02562/full
author Marzieh Sorati
Dawn Marie Behne
collection DOAJ
description In audiovisual speech perception, visual information from a talker's face during mouth articulation is available before the onset of the corresponding audio speech, and thereby allows the perceiver to use visual information to predict the upcoming audio. This prediction from phonetically congruent visual information modulates audiovisual speech perception and leads to a decrease in N1 and P2 amplitudes and latencies compared to the perception of audio speech alone. Whether audiovisual experience, such as with musical training, influences this prediction is unclear, but if so, it may explain some of the variation observed in previous research. The current study addresses whether audiovisual speech perception is affected by musical training, first assessing N1 and P2 event-related potentials (ERPs) and, in addition, inter-trial phase coherence (ITPC). Musicians and non-musicians were presented the syllable /ba/ in audio-only (AO), video-only (VO), and audiovisual (AV) conditions. With the predictive effect of mouth movement isolated from the AV speech (AV−VO), results showed that, compared to audio speech, both groups had a lower N1 latency and lower P2 amplitude and latency. Both groups also showed lower ITPC in the delta, theta, and beta bands in audiovisual speech perception. However, musicians showed significant suppression of N1 amplitude and desynchronization in the alpha band in audiovisual speech, not present for non-musicians. Collectively, the current findings indicate that early sensory processing can be modified by musical experience, which in turn can explain some of the variation in previous AV speech perception research.
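
The AV−VO subtraction and the ITPC measure named in the abstract can be illustrated with a minimal sketch. The code below is not the authors' analysis pipeline; it assumes single-channel epoched EEG already stored as NumPy arrays of shape (n_trials, n_samples), and the sampling rate, band edges, and variable names (av_epochs, vo_epochs) are illustrative placeholders. It band-pass filters each trial, extracts instantaneous phase with the Hilbert transform, and takes the magnitude of the trial-averaged unit phase vectors as ITPC (0 = no phase locking across trials, 1 = perfect phase locking).

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def itpc(epochs, fs, band):
    # Inter-trial phase coherence over time for one channel and one frequency band.
    # epochs: array (n_trials, n_samples); fs: sampling rate in Hz;
    # band: (low_hz, high_hz) edges of the band of interest.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)            # zero-phase band-pass per trial
    phase = np.angle(hilbert(filtered, axis=-1))          # instantaneous phase per trial
    return np.abs(np.mean(np.exp(1j * phase), axis=0))    # length of mean unit vector across trials

# Illustrative use with placeholder data (random noise, assumed 500 Hz sampling rate).
fs = 500
rng = np.random.default_rng(0)
av_epochs = rng.standard_normal((100, 600))   # hypothetical audiovisual (AV) epochs
vo_epochs = rng.standard_normal((100, 600))   # hypothetical video-only (VO) epochs

# AV - VO on trial-averaged waveforms removes the response to lip movement alone.
erp_av_minus_vo = av_epochs.mean(axis=0) - vo_epochs.mean(axis=0)

# ITPC in the theta band (4-7 Hz assumed here) for the AV condition.
theta_itpc_av = itpc(av_epochs, fs, (4, 7))

The same function could be applied per band (delta, theta, alpha, beta) and per group (musicians, non-musicians) to mirror the comparisons described in the abstract.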
format Article
id doaj.art-144049ccf5d24c7da71a6f935097e1e7
institution Directory Open Access Journal
issn 1664-1078
language English
publishDate 2019-11-01
publisher Frontiers Media S.A.
series Frontiers in Psychology
title Musical Expertise Affects Audiovisual Speech Perception: Findings From Event-Related Potentials and Inter-trial Phase Coherence
topic speech perception
prediction
audiovisual
musical training
event-related potential (ERP)
inter-trial phase coherence (ITPC)
url https://www.frontiersin.org/article/10.3389/fpsyg.2019.02562/full