Social-affective features drive human representations of observed actions

Humans observe actions performed by others in many different visual and social settings. What features do we extract and attend when we view such complex scenes, and how are they processed in the brain? To answer these questions, we curated two large-scale sets of naturalistic videos of everyday actions and estimated their perceived similarity in two behavioral experiments. We normed and quantified a large range of visual, action-related, and social-affective features across the stimulus sets. Using a cross-validated variance partitioning analysis, we found that social-affective features predicted similarity judgments better than, and independently of, visual and action features in both behavioral experiments. Next, we conducted an electroencephalography experiment, which revealed a sustained correlation between neural responses to videos and their behavioral similarity. Visual, action, and social-affective features predicted neural patterns at early, intermediate, and late stages, respectively, during this behaviorally relevant time window. Together, these findings show that social-affective features are important for perceiving naturalistic actions and are extracted at the final stage of a temporal gradient in the brain.

Bibliographic Details
Main Authors: Diana C Dima, Tyler M Tomita, Christopher J Honey, Leyla Isik
Author Affiliations: Diana C Dima and Leyla Isik, Department of Cognitive Science, Johns Hopkins University, Baltimore, United States; Tyler M Tomita and Christopher J Honey, Department of Psychological and Brain Sciences, Johns Hopkins University, Baltimore, United States
ORCID: Diana C Dima, 0000-0002-9612-5574; Christopher J Honey, 0000-0002-0745-5089
Format: Article
Language: English
Published: eLife Sciences Publications Ltd, 2022-05-01
Series: eLife
ISSN: 2050-084X
DOI: 10.7554/eLife.75027
Subjects: action perception; temporal dynamics; behavioral similarity
Online Access: https://elifesciences.org/articles/75027