The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception
Recent magneto-encephalographic and electro-encephalographic studies provide evidence for cross-modal integration during audio-visual and audio-haptic speech perception, with speech gestures viewed or felt from manual tactile contact with the speaker’s face. Given the temporal precedence of the haptic and visual signals over the acoustic signal in these studies, the observed modulation of N1/P2 auditory evoked responses during bimodal compared to unimodal speech perception suggests that relevant and predictive visual and haptic cues may facilitate auditory speech processing. To further investigate this hypothesis, auditory evoked potentials were compared here during auditory-only, audio-visual, and audio-haptic speech perception in live dyadic interactions between a listener and a speaker. In line with previous studies, auditory evoked potentials were attenuated and sped up during both audio-haptic and audio-visual compared to auditory-only speech perception. Importantly, the observed latency and amplitude reductions did not significantly depend on the degree of visual and haptic recognition of the speech targets. Altogether, these results further demonstrate cross-modal interactions between the auditory, visual, and haptic speech signals. Although they do not contradict the hypothesis that visual and haptic sensory inputs convey predictive information about the incoming auditory speech input, these results suggest that, at least in live conversational interactions, systematic conclusions on sensory predictability in bimodal speech integration should be drawn with caution, with the extraction of predictive cues likely depending on the variability of the speech stimuli.
Main Authors: | Avril Treille, Coriandre Vilain, Marc Sato |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2014-05-01 |
Series: | Frontiers in Psychology |
Subjects: | auditory evoked potentials; multisensory interactions; audio-visual speech perception; EEG; audio-haptic speech perception |
Online Access: | http://journal.frontiersin.org/Journal/10.3389/fpsyg.2014.00420/full |
author | Avril Treille, Coriandre Vilain, Marc Sato |
collection | DOAJ |
description | Recent magneto-encephalographic and electro-encephalographic studies provide evidence for cross-modal integration during audio-visual and audio-haptic speech perception, with speech gestures viewed or felt from manual tactile contact with the speaker’s face. Given the temporal precedence of the haptic and visual signals over the acoustic signal in these studies, the observed modulation of N1/P2 auditory evoked responses during bimodal compared to unimodal speech perception suggests that relevant and predictive visual and haptic cues may facilitate auditory speech processing. To further investigate this hypothesis, auditory evoked potentials were compared here during auditory-only, audio-visual, and audio-haptic speech perception in live dyadic interactions between a listener and a speaker. In line with previous studies, auditory evoked potentials were attenuated and sped up during both audio-haptic and audio-visual compared to auditory-only speech perception. Importantly, the observed latency and amplitude reductions did not significantly depend on the degree of visual and haptic recognition of the speech targets. Altogether, these results further demonstrate cross-modal interactions between the auditory, visual, and haptic speech signals. Although they do not contradict the hypothesis that visual and haptic sensory inputs convey predictive information about the incoming auditory speech input, these results suggest that, at least in live conversational interactions, systematic conclusions on sensory predictability in bimodal speech integration should be drawn with caution, with the extraction of predictive cues likely depending on the variability of the speech stimuli. |
format | Article |
id | doaj.art-799f11e835354bbe957e0c04d7d941b1 |
institution | Directory Open Access Journal |
issn | 1664-1078 |
language | English |
publishDate | 2014-05-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Psychology |
title | The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception |
topic | auditory evoked potentials; multisensory interactions; audio-visual speech perception; EEG; audio-haptic speech perception |
url | http://journal.frontiersin.org/Journal/10.3389/fpsyg.2014.00420/full |