Visual features are processed before navigational affordances in the human brain


Bibliographic Details
Main Authors: Kshitij Dwivedi, Sari Sadiya, Marta P. Balode, Gemma Roig, Radoslaw M. Cichy
Format: Article
Language: English
Published: Nature Portfolio, 2024-03-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-024-55652-y
Description
Summary: To navigate through their immediate environment, humans process scene information rapidly. How does the cascade of neural processing elicited by scene viewing unfold over time to facilitate navigational planning? To investigate, we recorded human brain responses to visual scenes with electroencephalography (EEG) and related them to computational models that operationalize three aspects of scene processing (2D, 3D, and semantic information), as well as to a behavioral model capturing navigational affordances. We found a temporal processing hierarchy: navigational affordances are processed later than the other scene features investigated (2D, 3D, and semantic). This reveals the temporal order in which the human brain computes complex scene information and suggests that the brain leverages these pieces of information to plan navigation.
ISSN: 2045-2322
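
The summary describes a time-resolved comparison of EEG responses with model-based scene representations. A common way to set up such an analysis is representational similarity analysis (RSA); the record does not specify the authors' exact pipeline, so the following is only a minimal sketch under that assumption, with all array shapes, names, and random data purely illustrative.

# Hedged sketch: time-resolved RSA relating EEG data to model predictions.
# Assumption: the comparison is done via representational dissimilarity
# matrices (RDMs); dimensions and feature spaces below are hypothetical.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_scenes, n_channels, n_times = 50, 64, 200  # hypothetical dimensions
# EEG responses: one channel-by-time pattern per scene.
eeg = rng.standard_normal((n_scenes, n_channels, n_times))

# Hypothetical model representations: one feature vector per scene for each
# of the four models named in the summary.
models = {
    "2D": rng.standard_normal((n_scenes, 100)),
    "3D": rng.standard_normal((n_scenes, 100)),
    "semantic": rng.standard_normal((n_scenes, 100)),
    "affordance": rng.standard_normal((n_scenes, 8)),  # e.g. path directions
}

# Model RDMs: pairwise dissimilarities between scenes in each feature space.
model_rdms = {name: pdist(feat, metric="correlation")
              for name, feat in models.items()}

# Time-resolved RSA: at each time point, build an EEG RDM across scenes and
# correlate it (Spearman) with each model RDM.
time_courses = {name: np.empty(n_times) for name in models}
for t in range(n_times):
    eeg_rdm = pdist(eeg[:, :, t], metric="correlation")
    for name, mrdm in model_rdms.items():
        time_courses[name][t] = spearmanr(eeg_rdm, mrdm)[0]

# Peak latency per model. The paper's claim is that the affordance peak
# arrives later than the 2D/3D/semantic peaks (random data will not show this).
for name, tc in time_courses.items():
    print(f"{name}: peak correlation at time index {int(np.argmax(tc))}")

Spearman correlation over RDM entries is a standard choice here because it only assumes a monotonic, not linear, relationship between model and EEG dissimilarities.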