Prediction of Emotional Measures via Electrodermal Activity (EDA) and Electrocardiogram (ECG)

Bibliographic Details
Main Authors: Itaf Omar Joudeh, Ana-Maria Cretu, Synthia Guimond, Stéphane Bouchard
Format: Article
Language: English
Published: MDPI AG 2022-11-01
Series: Engineering Proceedings
Online Access: https://www.mdpi.com/2673-4591/27/1/47
Description
Summary: Affect recognition is a signal and pattern recognition problem that plays a major role in affective computing. The affective state of a person reflects their emotional state, which can be measured along the arousal and valence dimensions, as per the circumplex model. We attempt to predict arousal and valence values by exploiting the Remote Collaborative and Affective Interactions (RECOLA) data set. RECOLA is a publicly available data set of spontaneous and natural interactions that represent various human emotional and social behaviours, recorded as audio, video, electrodermal activity (EDA), and electrocardiogram (ECG) biomedical signals. In this work, we focus on the biomedical signal recordings contained in RECOLA. The signals are processed together with their pre-extracted features and labelled with the corresponding arousal or valence annotations. EDA and ECG features are fused at the feature level. Ensemble regressors are then trained and tested to predict arousal and valence values. The best performance is achieved by an optimizable ensemble regressor, with a testing root mean squared error (RMSE) of 0.0154 for arousal and 0.0139 for valence predictions. Our solution achieves good prediction performance for the arousal and valence measures using EDA and ECG features. Future work will integrate visual data into the solution.
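The pipeline described in the summary (feature-level fusion of EDA and ECG features followed by an ensemble regressor evaluated with RMSE) can be sketched as follows. This is an illustrative sketch only, not the authors' exact pipeline: the feature counts and target are synthetic placeholders, and scikit-learn's `GradientBoostingRegressor` stands in for whatever optimizable ensemble was used on RECOLA.

```python
# Illustrative sketch (assumed, not the authors' implementation):
# feature-level fusion of EDA and ECG features, then an ensemble
# regressor scored with root mean squared error (RMSE).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
eda_features = rng.normal(size=(n, 8))    # placeholder EDA features
ecg_features = rng.normal(size=(n, 10))   # placeholder ECG features
# Synthetic arousal target loosely dependent on both modalities
arousal = (0.1 * eda_features[:, 0]
           + 0.05 * ecg_features[:, 0]
           + rng.normal(scale=0.01, size=n))

# Feature-level fusion: concatenate per-sample feature vectors
X = np.hstack([eda_features, ecg_features])

X_tr, X_te, y_tr, y_te = train_test_split(
    X, arousal, test_size=0.2, random_state=0)

model = GradientBoostingRegressor(random_state=0)
model.fit(X_tr, y_tr)

rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
print(f"test RMSE: {rmse:.4f}")
```

A valence model would follow the same steps with valence annotations as the target; the paper's reported RMSEs (0.0154 arousal, 0.0139 valence) come from the real RECOLA features, not from this toy data.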
ISSN:2673-4591