Audiotactile multisensory interactions in human information processing



Bibliographic Details
Main Authors: Kitagawa, N; Spence, C
Format: Journal article
Language: English
Published: 2006
Description
The last few years have seen a very rapid growth of interest in how signals from different sensory modalities are integrated in the brain to form the unified percepts that fill our daily lives. Research on multisensory interactions between vision, touch, and proprioception has revealed the existence of multisensory spatial representations that code the location of external events relative to our own bodies. In this review, we highlight recent converging evidence from both human and animal studies showing that spatially modulated multisensory interactions also occur between hearing and touch, especially in the space immediately surrounding the head. These spatial audiotactile interactions for stimuli presented close to the head can affect not only the spatial aspects of perception, but also various other non-spatial aspects of audiotactile information processing. Finally, we highlight some of the most important questions for future research in this area. © 2006 Japanese Psychological Association.
Institution: University of Oxford