Comparing audiovisual semantic interactions between linguistic and non-linguistic stimuli


Detailed Description

Bibliographic Details
Main Authors: Chen, Y.; Spence, C.
Format: Conference item
Published: 2017
Other Bibliographic Details
Summary: We examined the time-courses of the crossmodal semantic congruency effects elicited by naturalistic sounds or spoken words on the processing of visual pictures and printed words. Auditory primes were presented at seven stimulus onset asynchronies (SOAs) with respect to the visual targets, ranging from auditory leading by 1000 ms to auditory lagging by 250 ms.