The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction
In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based HCI-relevant emotional and cognitive load states.

Main Authors: | Dilana Hazer-Rau, Sascha Meudt, Andreas Daucher, Jennifer Spohrs, Holger Hoffmann, Friedhelm Schwenker, Harald C. Traue |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-04-01 |
Series: | Sensors |
Subjects: | affective corpus; multimodal sensors; overload; underload; interest; frustration |
Online Access: | https://www.mdpi.com/1424-8220/20/8/2308 |
_version_ | 1797570304911867904 |
author | Dilana Hazer-Rau; Sascha Meudt; Andreas Daucher; Jennifer Spohrs; Holger Hoffmann; Friedhelm Schwenker; Harald C. Traue |
author_facet | Dilana Hazer-Rau; Sascha Meudt; Andreas Daucher; Jennifer Spohrs; Holger Hoffmann; Friedhelm Schwenker; Harald C. Traue |
author_sort | Dilana Hazer-Rau |
collection | DOAJ |
description | In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based HCI-relevant emotional and cognitive load states. It consists of six experimental sequences, inducing <i>Interest, Overload, Normal, Easy, Underload</i>, and <i>Frustration</i>. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to level off the physiological reactions, and a summary of results. Further, prior to the experiment, three questionnaires related to emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI) were collected from each subject to evaluate the stability of the induction paradigm. Based on this HCI scenario, the <i>University of Ulm Multimodal Affective Corpus (uulmMAC)</i>, consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, including 4 × video, 3 × audio, and 7 × biophysiological, depth, and pose streams. In addition, labels and annotations were collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final <i>uulmMAC</i> dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our <i>uulmMAC</i> database is a valuable contribution to the field of affective computing and multimodal data analysis: Acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations. Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications. |
first_indexed | 2024-03-10T20:23:03Z |
format | Article |
id | doaj.art-41f3a0674fdf4bb0a70d5d23ef5621f6 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-03-10T20:23:03Z |
publishDate | 2020-04-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-41f3a0674fdf4bb0a70d5d23ef5621f6 | 2023-11-19T21:58:33Z | eng | MDPI AG | Sensors | ISSN 1424-8220 | 2020-04-01 | Vol. 20, No. 8, Art. 2308 | DOI 10.3390/s20082308 | The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction | Dilana Hazer-Rau: Section Medical Psychology, University of Ulm, Frauensteige 6, 89075 Ulm, Germany; Sascha Meudt: Institute of Neural Information Processing, University of Ulm, James-Frank-Ring, 89081 Ulm, Germany; Andreas Daucher: Section Medical Psychology, University of Ulm, Frauensteige 6, 89075 Ulm, Germany; Jennifer Spohrs: Section Medical Psychology, University of Ulm, Frauensteige 6, 89075 Ulm, Germany; Holger Hoffmann: Section Medical Psychology, University of Ulm, Frauensteige 6, 89075 Ulm, Germany; Friedhelm Schwenker: Institute of Neural Information Processing, University of Ulm, James-Frank-Ring, 89081 Ulm, Germany; Harald C. Traue: Section Medical Psychology, University of Ulm, Frauensteige 6, 89075 Ulm, Germany | https://www.mdpi.com/1424-8220/20/8/2308 | affective corpus; multimodal sensors; overload; underload; interest; frustration |
spellingShingle | Dilana Hazer-Rau; Sascha Meudt; Andreas Daucher; Jennifer Spohrs; Holger Hoffmann; Friedhelm Schwenker; Harald C. Traue; The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction; Sensors; affective corpus; multimodal sensors; overload; underload; interest; frustration |
title | The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction |
title_full | The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction |
title_fullStr | The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction |
title_full_unstemmed | The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction |
title_short | The uulmMAC Database—A Multimodal Affective Corpus for Affective Computing in Human-Computer Interaction |
title_sort | uulmmac database a multimodal affective corpus for affective computing in human computer interaction |
topic | affective corpus; multimodal sensors; overload; underload; interest; frustration
url | https://www.mdpi.com/1424-8220/20/8/2308 |
work_keys_str_mv | AT dilanahazerrau theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT saschameudt theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT andreasdaucher theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT jenniferspohrs theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT holgerhoffmann theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT friedhelmschwenker theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT haraldctraue theuulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT dilanahazerrau uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT saschameudt uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT andreasdaucher uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT jenniferspohrs uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT holgerhoffmann uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT friedhelmschwenker uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction AT haraldctraue uulmmacdatabaseamultimodalaffectivecorpusforaffectivecomputinginhumancomputerinteraction |