Activity in perirhinal and entorhinal cortex predicts perceived visual similarities among category exemplars with highest precision

Vision neuroscience has made great strides in understanding the hierarchical organization of object representations along the ventral visual stream (VVS). How VVS representations capture fine-grained visual similarities between objects that observers subjectively perceive has received limited examination so far. In the current study, we addressed this question by focussing on perceived visual similarities among subordinate exemplars of real-world categories. We hypothesized that these perceived similarities are reflected with highest fidelity in neural activity patterns downstream from inferotemporal regions, namely in perirhinal (PrC) and anterolateral entorhinal cortex (alErC) in the medial temporal lobe. To address this issue with functional magnetic resonance imaging (fMRI), we administered a modified 1-back task that required discrimination between category exemplars as well as categorization. Further, we obtained observer-specific ratings of perceived visual similarities, which predicted behavioural discrimination performance during scanning. As anticipated, we found that activity patterns in PrC and alErC predicted the structure of perceived visual similarity relationships among category exemplars, including its observer-specific component, with higher precision than any other VVS region. Our findings provide new evidence that subjective aspects of object perception that rely on fine-grained visual differentiation are reflected with highest fidelity in the medial temporal lobe.

Bibliographic Details
Main Authors: Kayla M Ferko, Anna Blumenthal, Chris B Martin, Daria Proklova, Alexander N Minos, Lisa M Saksida, Timothy J Bussey, Ali R Khan, Stefan Köhler
Format: Article
Language: English
Published: eLife Sciences Publications Ltd, 2022-03-01
Series: eLife
Subjects: medial temporal lobe; ventral visual pathway; visual discrimination; fMRI; object recognition
DOI: 10.7554/eLife.66884
ISSN: 2050-084X
Collection: Directory of Open Access Journals (DOAJ)
Online Access: https://elifesciences.org/articles/66884

Author Affiliations
Kayla M Ferko (https://orcid.org/0000-0003-4362-7295): Brain and Mind Institute, University of Western Ontario, London, Canada; Robarts Research Institute, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Canada
Anna Blumenthal (https://orcid.org/0000-0002-8768-0189): Brain and Mind Institute, University of Western Ontario, London, Canada; Cervo Brain Research Center, University of Laval, Quebec, Canada
Chris B Martin (https://orcid.org/0000-0002-7014-4371): Department of Psychology, Florida State University, Tallahassee, United States
Daria Proklova: Brain and Mind Institute, University of Western Ontario, London, Canada
Alexander N Minos: Brain and Mind Institute, University of Western Ontario, London, Canada
Lisa M Saksida: Brain and Mind Institute, University of Western Ontario, London, Canada; Robarts Research Institute, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Canada; Department of Physiology and Pharmacology, University of Western Ontario, London, Canada
Timothy J Bussey: Brain and Mind Institute, University of Western Ontario, London, Canada; Robarts Research Institute, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Canada; Department of Physiology and Pharmacology, University of Western Ontario, London, Canada
Ali R Khan (https://orcid.org/0000-0002-0760-8647): Brain and Mind Institute, University of Western Ontario, London, Canada; Robarts Research Institute, Schulich School of Medicine and Dentistry, University of Western Ontario, London, Canada; School of Biomedical Engineering, University of Western Ontario, London, Canada; Department of Medical Biophysics, University of Western Ontario, London, Canada
Stefan Köhler (https://orcid.org/0000-0003-1905-6453): Brain and Mind Institute, University of Western Ontario, London, Canada; Department of Psychology, University of Western Ontario, London, Canada