Spatial and temporal factors during processing of audiovisual speech: a PET study.


Bibliographic Details
Main Authors: Macaluso, E, George, N, Dolan, R, Spence, C, Driver, J
Format: Journal article
Language: English
Published: 2004
Abstract

Speech perception can use not only auditory signals, but also visual information from seeing the speaker's mouth. The relative timing and relative location of auditory and visual inputs are both known to influence crossmodal integration psychologically, but previous imaging studies of audiovisual speech focused primarily on temporal aspects. Here we used Positron Emission Tomography (PET) during audiovisual speech processing to study how temporal and spatial factors might jointly affect brain activations. In agreement with previous work, synchronous versus asynchronous audiovisual speech yielded increased activity in multisensory association areas (e.g., superior temporal sulcus [STS]), as well as in some unimodal visual areas. Our orthogonal manipulation of relative stimulus position (auditory and visual stimuli presented at the same location vs. on opposite sides) and stimulus synchrony showed that (i) ventral occipital areas and superior temporal sulcus were unaffected by relative location; (ii) lateral and dorsal occipital areas were selectively activated for synchronous bimodal stimulation at the same external location; (iii) right inferior parietal lobule was activated for synchronous auditory and visual stimuli at different locations, that is, in the condition classically associated with the 'ventriloquism effect' (shift of perceived auditory position toward the visual location). Thus, different brain regions are involved in different aspects of audiovisual integration. While ventral areas appear more affected by audiovisual synchrony (which can influence speech identification), more dorsal areas appear to be associated with spatial multisensory interactions.