Prediction of Continuous Emotional Measures through Physiological and Visual Data

The affective state of a person can be measured using arousal and valence values. In this article, we contribute to the prediction of arousal and valence values from various data sources. Our goal is to later use such predictive models to adaptively adjust virtual reality (VR) environments and help facilitate cognitive remediation exercises for users with mental health disorders, such as schizophrenia, while avoiding discouragement. Building on our previous work on physiological data, namely electrodermal activity (EDA) and electrocardiogram (ECG) recordings, we improve the preprocessing and add novel feature selection and decision fusion processes. We also use video recordings as an additional data source for predicting affective states. We implement a solution based on a combination of machine learning models alongside a series of preprocessing steps. We test our approach on RECOLA, a publicly available dataset. The best results are obtained with a concordance correlation coefficient (CCC) of 0.996 for arousal and 0.998 for valence using physiological data. Related work in the literature reported lower CCCs on the same data modality; thus, our approach outperforms the state-of-the-art approaches on RECOLA. Our study underscores the potential of using advanced machine learning techniques with diverse data sources to enhance the personalization of VR environments.
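The headline results above are concordance correlation coefficients (CCC), the usual agreement measure for continuous arousal/valence prediction on RECOLA. For readers unfamiliar with the metric, the following is a minimal NumPy sketch of the standard CCC definition; it is illustrative only and not code from the article.

```python
import numpy as np

def concordance_correlation_coefficient(y_true, y_pred):
    """Concordance correlation coefficient (CCC) between two 1-D sequences."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_true, mean_pred = y_true.mean(), y_pred.mean()
    var_true, var_pred = y_true.var(), y_pred.var()  # population variances (ddof=0)
    covariance = np.mean((y_true - mean_true) * (y_pred - mean_pred))
    # CCC = 2*cov / (var_x + var_y + (mean_x - mean_y)^2)
    return 2 * covariance / (var_true + var_pred + (mean_true - mean_pred) ** 2)

# Example: perfect agreement yields CCC = 1.0
gold = np.array([0.1, 0.2, 0.3, 0.4])
print(concordance_correlation_coefficient(gold, gold))  # -> 1.0
```

Unlike the Pearson correlation, the CCC also penalizes differences in mean and scale between predictions and gold annotations, so a CCC near 1 implies the predicted trace closely matches the annotated arousal or valence trace.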

Bibliographic Details
Main Authors: Itaf Omar Joudeh, Ana-Maria Cretu, Stéphane Bouchard, Synthia Guimond
Format: Article
Language: English
Published: MDPI AG, 2023-06-01
Series: Sensors
ISSN: 1424-8220
DOI: 10.3390/s23125613
Author Affiliations: Department of Computer Science and Engineering, University of Quebec in Outaouais, Gatineau, QC J8Y 3G5, Canada (Joudeh, Cretu); Department of Psychoeducation and Psychology, University of Quebec in Outaouais, Gatineau, QC J8X 3X7, Canada (Bouchard, Guimond)
Subjects: affect recognition; affective state; signal processing; image processing; face detection; machine learning
Online Access: https://www.mdpi.com/1424-8220/23/12/5613
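The abstract also mentions decision fusion of predictions obtained from the physiological and visual models. The record does not describe the exact fusion scheme, so the snippet below is only a generic late-fusion sketch (weighted averaging of per-modality predictions, with hypothetical weights such as per-modality validation CCC scores), not the authors' method.

```python
import numpy as np

def late_fusion(predictions, weights=None):
    """Fuse per-modality prediction arrays (e.g., physiological and visual)
    into a single arousal/valence estimate by (weighted) averaging.

    predictions: list of 1-D arrays, one per modality, all the same length.
    weights: optional per-modality weights (e.g., validation CCC scores).
    """
    stacked = np.vstack(predictions)          # shape: (n_modalities, n_frames)
    if weights is None:
        return stacked.mean(axis=0)           # plain average fusion
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * stacked).sum(axis=0) / w.sum()

# Hypothetical example: fuse physiological and video-based arousal predictions.
physio_pred = np.array([0.30, 0.35, 0.40])
video_pred = np.array([0.20, 0.25, 0.45])
print(late_fusion([physio_pred, video_pred], weights=[0.7, 0.5]))
```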