Multisensory integration of dynamic emotional faces and voices: method for simultaneous EEG-fMRI measurements

Combined EEG-fMRI analysis correlates time courses from single electrodes or independent EEG components with the hemodynamic response. Using information from only one electrode, however, may miss relevant information from complex electrophysiological networks. Component-based analysis, in turn, depends on a priori knowledge of the signal topography. Complex designs such as studies on multisensory integration of emotions investigate subtle differences in distributed networks based on only a few trials per condition. They therefore require a sensitive and comprehensive approach that does not rely on a priori knowledge about the underlying neural processes. In this pilot study, the feasibility and sensitivity of source localization-driven analysis for EEG-fMRI were tested using a multisensory integration paradigm. Dynamic audiovisual stimuli consisting of emotional talking faces and pseudowords with emotional prosody were rated in a delayed response task; the trials comprised affectively congruent and incongruent displays. In addition to event-locked EEG and fMRI analyses, induced oscillatory EEG responses at estimated cortical sources and in specific temporo-spectral windows were correlated with the corresponding BOLD responses. EEG analysis showed high data quality, with less than 10% trial rejection. In an early time window, alpha oscillations were suppressed in bilateral occipital cortices, and fMRI analysis confirmed high data quality with reliable activation in auditory, visual, and frontal areas during the presentation of multisensory stimuli. In line with previous studies, we obtained reliable correlation patterns between event-locked occipital alpha suppression and the BOLD signal time course. Our results suggest that the present source localization-driven method is a valid approach for investigating complex stimuli with EEG-fMRI. This procedure may help to analyze combined EEG-fMRI data from novel complex paradigms with high spatial and temporal resolution.
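The core analysis step described in the abstract — correlating an induced oscillatory EEG response (here, occipital alpha suppression) with the BOLD time course — can be sketched roughly as follows. This is a minimal illustration, not the authors' pipeline: the alpha band limits, HRF parameters, sampling rates, and all variable names (`eeg`, `bold`) are assumptions, and the data are synthetic noise.

```python
# Minimal sketch (not the authors' pipeline): correlate an EEG alpha-power
# envelope with a BOLD time course after convolving with a canonical HRF.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def canonical_hrf(tr, duration=32.0):
    """Double-gamma canonical HRF sampled at the fMRI TR (SPM-style parameters)."""
    from scipy.stats import gamma
    t = np.arange(0, duration, tr)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    return hrf / hrf.sum()

def alpha_power_envelope(eeg, fs, band=(8.0, 12.0)):
    """Band-pass the EEG in the alpha band and take the Hilbert amplitude envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return np.abs(hilbert(filtfilt(b, a, eeg)))

rng = np.random.default_rng(0)
fs, tr, n_scans = 250.0, 2.0, 150          # EEG rate (Hz), fMRI TR (s), volumes
eeg = rng.standard_normal(int(fs * tr * n_scans))   # synthetic source time course

env = alpha_power_envelope(eeg, fs)
# Downsample the envelope to one value per fMRI volume (mean within each TR)
env_tr = env[: int(fs * tr) * n_scans].reshape(n_scans, -1).mean(axis=1)
# Convolve with the HRF to predict the hemodynamic correlate of alpha power
predictor = np.convolve(env_tr, canonical_hrf(tr))[:n_scans]

bold = rng.standard_normal(n_scans)        # placeholder voxel/ROI time course
r = np.corrcoef(predictor, bold)[0, 1]     # Pearson correlation with BOLD
print(f"EEG-fMRI correlation: r = {r:.3f}")
```

In practice the envelope would come from source-localized EEG in a specific temporo-spectral window rather than a raw channel, and the correlation would be computed voxel-wise within a GLM framework.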


Bibliographic Details
Main Authors: Patrick David Schelenz, Martin Klasen, Barbara Reese, Christina Regenbogen, Dhana Wolf, Yutaka Kato, Klaus Mathiak
Format: Article
Language: English
Published: Frontiers Media S.A., 2013-11-01
Series: Frontiers in Human Neuroscience
Subjects: Audiovisual integration; Affective Neuroscience; EEG-fMRI; perceptual processing; emotion re; emotion integration
Online Access: http://journal.frontiersin.org/Journal/10.3389/fnhum.2013.00729/full
Collection: DOAJ (Directory of Open Access Journals)
ISSN: 1662-5161
DOI: 10.3389/fnhum.2013.00729
Author affiliations: University Hospital Aachen; JARA-Translational Brain Medicine; Keio University School of Medicine; Research Center Jülich