Quantifying dynamic facial expressions under naturalistic conditions
Facial affect is expressed dynamically – a giggle, grimace, or an agitated frown. However, the characterisation of human affect has relied almost exclusively on static images. This approach cannot capture the nuances of human communication or support the naturalistic assessment of affective disorders...
Main Authors: | Jayson Jeganathan, Megan Campbell, Matthew Hyett, Gordon Parker, Michael Breakspear |
---|---|
Format: | Article |
Language: | English |
Published: | eLife Sciences Publications Ltd, 2022-08-01 |
Series: | eLife |
Subjects: | facial expression; major depressive disorder; naturalistic |
Online Access: | https://elifesciences.org/articles/79581 |
_version_ | 1818027451541880832 |
---|---|
author | Jayson Jeganathan; Megan Campbell; Matthew Hyett; Gordon Parker; Michael Breakspear |
author_facet | Jayson Jeganathan; Megan Campbell; Matthew Hyett; Gordon Parker; Michael Breakspear |
author_sort | Jayson Jeganathan |
collection | DOAJ |
description | Facial affect is expressed dynamically – a giggle, grimace, or an agitated frown. However, the characterisation of human affect has relied almost exclusively on static images. This approach cannot capture the nuances of human communication or support the naturalistic assessment of affective disorders. Using the latest in machine vision and systems modelling, we studied dynamic facial expressions of people viewing emotionally salient film clips. We found that the apparent complexity of dynamic facial expressions can be captured by a small number of simple spatiotemporal states – composites of distinct facial actions, each expressed with a unique spectral fingerprint. Sequential expression of these states is common across individuals viewing the same film stimuli but varies in those with the melancholic subtype of major depressive disorder. This approach provides a platform for translational research, capturing dynamic facial expressions under naturalistic conditions and enabling new quantitative tools for the study of affective disorders and related mental illnesses. |
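The description above summarises the method only in outline: facial action time series are reduced to a small number of composite spatiotemporal states. As a purely illustrative sketch (the record does not specify the paper's actual modelling; the synthetic data and variable names below are hypothetical), the frame-by-frame state-labelling step could look like k-means clustering of action-unit intensities:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical stand-in for per-frame facial action unit intensities
# extracted by a machine-vision tool (300 frames, 6 action units).
rng = np.random.default_rng(0)
n_frames, n_aus = 300, 6
au = rng.random((n_frames, n_aus))

# Cluster frames into a small number of composite "states"; each cluster
# centroid is a composite of distinct facial actions.
k = 3
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(au)
states = km.labels_  # one discrete state label per video frame
```

The resulting `states` sequence is the kind of object whose ordering could then be compared across individuals viewing the same film stimuli.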
first_indexed | 2024-12-10T04:48:07Z |
format | Article |
id | doaj.art-b99d561404864a21929f5c49e3df5cb9 |
institution | Directory Open Access Journal |
issn | 2050-084X |
language | English |
last_indexed | 2024-12-10T04:48:07Z |
publishDate | 2022-08-01 |
publisher | eLife Sciences Publications Ltd |
record_format | Article |
series | eLife |
spelling | doaj.art-b99d561404864a21929f5c49e3df5cb9 2022-12-22T02:01:41Z eng eLife Sciences Publications Ltd eLife 2050-084X 2022-08-01 11 10.7554/eLife.79581 Quantifying dynamic facial expressions under naturalistic conditions. Authors: Jayson Jeganathan (https://orcid.org/0000-0003-4175-918X); Megan Campbell (https://orcid.org/0000-0003-4051-1529); Matthew Hyett; Gordon Parker; Michael Breakspear (https://orcid.org/0000-0003-4943-3969). Affiliations: School of Psychology, College of Engineering, Science and the Environment, University of Newcastle, Newcastle, Australia and Hunter Medical Research Institute, Newcastle, Australia (Jeganathan; Campbell); School of Psychological Sciences, University of Western Australia, Perth, Australia (Hyett); School of Psychiatry, University of New South Wales, Kensington, Australia (Parker); School of Psychology, College of Engineering, Science and the Environment, University of Newcastle, Newcastle, Australia; Hunter Medical Research Institute, Newcastle, Australia; School of Medicine and Public Health, College of Medicine, Health and Wellbeing, University of Newcastle, Newcastle, Australia (Breakspear). https://elifesciences.org/articles/79581. Subjects: facial expression; major depressive disorder; naturalistic |
spellingShingle | Jayson Jeganathan; Megan Campbell; Matthew Hyett; Gordon Parker; Michael Breakspear. Quantifying dynamic facial expressions under naturalistic conditions. eLife. facial expression; major depressive disorder; naturalistic |
title | Quantifying dynamic facial expressions under naturalistic conditions |
title_full | Quantifying dynamic facial expressions under naturalistic conditions |
title_fullStr | Quantifying dynamic facial expressions under naturalistic conditions |
title_full_unstemmed | Quantifying dynamic facial expressions under naturalistic conditions |
title_short | Quantifying dynamic facial expressions under naturalistic conditions |
title_sort | quantifying dynamic facial expressions under naturalistic conditions |
topic | facial expression; major depressive disorder; naturalistic |
url | https://elifesciences.org/articles/79581 |
work_keys_str_mv | AT jaysonjeganathan quantifyingdynamicfacialexpressionsundernaturalisticconditions AT megancampbell quantifyingdynamicfacialexpressionsundernaturalisticconditions AT matthewhyett quantifyingdynamicfacialexpressionsundernaturalisticconditions AT gordonparker quantifyingdynamicfacialexpressionsundernaturalisticconditions AT michaelbreakspear quantifyingdynamicfacialexpressionsundernaturalisticconditions |