Abstract: | <p>Humans have evolved to fuse information across senses by weighing the reliability of each modality (e.g., a viewer relies more on the picture to follow a drama if the TV audio is faulty; Ernst & Banks, 2002), and to infer the unseen causes of sensory events by inverting internal generative models of the world (causal inference, or CI: “Was the film dubbed or not?”). These multisensory strategies resemble Bayesian inference, but the mechanism by which such inference unfolds step by step in a biologically plausible fashion remains poorly understood. Moreover, reliability-weighted fusion and CI have complementary costs and benefits, yet both sometimes seem to describe human choices under experimental conditions that presumably satisfy the assumptions of only one of them (e.g., De Winkel, Katliar, & Bülthoff, 2015; Meijer, Veselič, Calafiore, & Noppeney, 2019).</p> <p>In this thesis, I present converging evidence from a series of human neuroimaging and psychophysics studies suggesting that these rival accounts of multisensory inference can be reconciled by a time-resolved mechanism: the brain rapidly derives a fused sensory estimate for computational expediency and later, if required, filters out irrelevant signals based on the inferred sensory cause(s).</p> <p>First, analysing time- and source-resolved human magnetoencephalographic data (<b>chapter 2</b>), I unveil a systematic spatiotemporal cascade of the computations relevant for multisensory inference, starting with early segregated unisensory representations, continuing with sensory fusion in parietal-temporal regions, and culminating in CI in the frontal lobe.
This suggests that the distinct computations required for flexible multisensory behaviour (fusion and CI) coexist, but each dominates at different times and in distinct regions (<b>chapter 3</b>).</p> <p>Furthermore, using behavioural paradigms such as speeded judgements under deadline-induced time pressure, as well as psychophysics leveraging sensory masking (<b>chapter 4</b>), I explicitly quantified how the unfolding of neural representations can be read out to drive behaviour. Collectively, slower behavioural choices reflect CI to a greater extent, whereas faster choices reflect sensory fusion, indicating that the mapping from neural representational activity to behavioural choice corroborates the temporal hierarchy revealed in the neural data.</p>
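<p>The reliability-weighted fusion invoked above has a standard closed form: the fused estimate is an inverse-variance-weighted average of the unisensory estimates (Ernst & Banks, 2002). A minimal sketch in Python, using illustrative numbers of my own choosing rather than values from the thesis:</p>

```python
import numpy as np

def fuse(estimates, sigmas):
    """Reliability-weighted (inverse-variance) fusion of unisensory estimates."""
    # Reliability of each cue is its inverse variance (1 / sigma^2).
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    w /= w.sum()  # normalise weights to sum to 1
    return float(np.dot(w, np.asarray(estimates, dtype=float)))

# Hypothetical example: a visual location estimate of 10 deg (sigma = 1)
# and an auditory estimate of 14 deg (sigma = 2). The visual cue is four
# times as reliable, so the fused estimate lies much closer to it.
fused = fuse([10.0, 14.0], [1.0, 2.0])  # ≈ 10.8 deg
```

<p>Note that this formula presupposes a single common cause; CI generalises it by also weighing the possibility that the signals arose from independent causes, in which case the irrelevant cue should be discounted rather than fused.</p>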
|