Cross-modal auditory priors drive the perception of bistable visual stimuli with reliable differences between individuals


Bibliographic Details
Main Authors: Zsófia Pálffy, Kinga Farkas, Gábor Csukly, Szabolcs Kéri, Bertalan Polner
Format: Article
Language:English
Published: Nature Portfolio 2021-08-01
Series:Scientific Reports
Online Access:https://doi.org/10.1038/s41598-021-96198-7
author Zsófia Pálffy
Kinga Farkas
Gábor Csukly
Szabolcs Kéri
Bertalan Polner
collection DOAJ
description Abstract It is a widely held assumption that the brain performs perceptual inference by combining sensory information with prior expectations, weighted by their uncertainty. A distinction can be made between higher- and lower-level priors, which can be manipulated with associative learning and sensory priming, respectively. Here, we simultaneously investigate priming and the differential effect of auditory vs. visual associative cues on visual perception, and we also examine the reliability of individual differences. Healthy individuals (N = 29) performed a perceptual inference task twice, with a one-week delay. They reported the perceived direction of motion of dot pairs, which were preceded by a probabilistic visuo-acoustic cue. In 30% of the trials, motion direction was ambiguous, and in half of these trials, the auditory and the visual cue predicted opposing directions. Cue-stimulus contingency could change every 40 trials. On ambiguous trials where the visual and the auditory cue predicted conflicting directions of motion, participants made more decisions consistent with the prediction of the acoustic cue. Slower responses to ambiguous (vs. non-ambiguous) stimuli indicated increased predictive processing under stimulus uncertainty. Furthermore, priming effects were also observed: perception of ambiguous stimuli was influenced by perceptual decisions on previous ambiguous and unambiguous trials alike. Critically, the behavioural effects showed substantial inter-individual variability with high test–retest reliability (intraclass correlation coefficient (ICC) > 0.78). Overall, higher-level priors based on auditory (vs. visual) information had a greater influence on visual perception, and lower-level priors were also at work. Importantly, we observed large and stable individual differences in various aspects of task performance. Computational modelling combined with neuroimaging could allow testing hypotheses about the potential mechanisms behind these behavioural effects. The reliability of these behavioural differences suggests that such perceptual inference tasks could be valuable tools in large-scale biomarker and neuroimaging studies.
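The abstract reports test–retest reliability as ICC > 0.78 without specifying the ICC variant. As an illustrative sketch only (not the authors' analysis code), a two-way consistency ICC(3,1) for an n-subjects by k-sessions score matrix can be computed as follows; the function name and the choice of variant are assumptions:

```python
import numpy as np

def icc_3_1(ratings):
    """Two-way mixed-effects, consistency ICC(3,1) for an
    n_subjects x k_sessions score matrix (one score per cell)."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-session means
    # Partition the total sum of squares into subject, session,
    # and residual components (standard two-way ANOVA decomposition).
    ss_total = ((ratings - grand) ** 2).sum()
    ss_subj = k * ((row_means - grand) ** 2).sum()
    ss_sess = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_subj - ss_sess
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    # Consistency form: an additive shift between sessions does not
    # lower the coefficient.
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)
```

With k = 2 sessions, as in the one-week test–retest design described above, an ICC near 1 indicates that the rank ordering of individuals is stable across sessions.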
format Article
id doaj.art-68a2c85578e749ca9167ec732d2fea8c
institution Directory Open Access Journal
issn 2045-2322
language English
publishDate 2021-08-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj.art-68a2c85578e749ca9167ec732d2fea8c
Author affiliations:
Zsófia Pálffy: Department of Cognitive Science, Budapest University of Technology and Economics
Kinga Farkas: Department of Psychiatry and Psychotherapy, Semmelweis University
Gábor Csukly: Department of Psychiatry and Psychotherapy, Semmelweis University
Szabolcs Kéri: Department of Cognitive Science, Budapest University of Technology and Economics
Bertalan Polner: Department of Cognitive Science, Budapest University of Technology and Economics
title Cross-modal auditory priors drive the perception of bistable visual stimuli with reliable differences between individuals
url https://doi.org/10.1038/s41598-021-96198-7