Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval


Bibliographic Details
Main Authors: Viorica Marian, Sayuri Hayakawa, Scott R. Schroeder
Format: Article
Language: English
Published: Frontiers Media S.A., 2021-07-01
Series: Frontiers in Neuroscience
Subjects: multisensory integration; cross-modal interaction; audio-visual processing; auditory experience; visual memory; spatial memory
Online Access: https://www.frontiersin.org/articles/10.3389/fnins.2021.661477/full
Description:
How we perceive and learn about our environment is influenced by our prior experiences and existing representations of the world. Top-down cognitive processes, such as attention and expectations, can alter how we process sensory stimuli, both within a modality (e.g., effects of auditory experience on auditory perception) and across modalities (e.g., effects of visual feedback on sound localization). Here, we demonstrate that experience with different types of auditory input (spoken words vs. environmental sounds) modulates how humans remember concurrently presented visual objects. Participants viewed a series of line drawings (e.g., a picture of a cat) displayed in one of four quadrants while listening to a word or sound that was congruent (e.g., “cat” or <meow>), incongruent (e.g., “motorcycle” or <vroom–vroom>), or neutral (e.g., a meaningless pseudoword or a tonal beep) relative to the picture. Following the encoding phase, participants were presented with the original drawings plus new drawings and asked to indicate whether each one was “old” or “new.” If a drawing was designated as “old,” participants then reported where it had been displayed. We find that words and sounds both elicit more accurate memory for what objects were previously seen, but only congruent environmental sounds enhance memory for where objects were positioned – this, despite the fact that the auditory stimuli were not meaningful spatial cues to the objects’ locations on the screen. Given that under real-world listening conditions environmental sounds, but not words, reliably originate from the location of their referents, listening to sounds may attune the visual dorsal pathway to facilitate attention and memory for objects’ locations. We propose that audio-visual associations in the environment and in our previous experience jointly contribute to visual memory, strengthening visual memory through exposure to auditory input.
ISSN: 1662-453X
Author Affiliations:
Viorica Marian: Department of Communication Sciences and Disorders, Northwestern University, Evanston, IL, United States
Sayuri Hayakawa: Department of Communication Sciences and Disorders, Northwestern University, Evanston, IL, United States
Scott R. Schroeder: Department of Communication Sciences and Disorders, Northwestern University, Evanston, IL, United States; Department of Speech-Language-Hearing Sciences, Hofstra University, Hempstead, NY, United States