A Comparative Study of Window Size and Channel Arrangement on EEG-Emotion Recognition Using Deep CNN

Emotion recognition based on electroencephalograms (EEG) has become an active research area. Yet identifying emotions from brainwaves alone remains very challenging, especially in the subject-independent setting. Numerous studies have proposed recognition methods, including machine learning techniques such as the convolutional neural network (CNN). Since CNNs have shown potential for generalizing to unseen subjects, manipulating hyperparameters such as the window size and electrode order might be beneficial. To our knowledge, this is the first work to extensively examine the effect of these parameter selections on the CNN. The temporal information carried by distinct window sizes was found to significantly affect recognition performance, and the CNN was more responsive to changes in window size than the support vector machine. Arousal classification performed best with a window size of ten seconds, reaching 56.85% accuracy and a Matthews correlation coefficient (MCC) of 0.1369. Valence recognition performed best with a window length of eight seconds, at 73.34% accuracy and an MCC of 0.4669. Spatial information from varying the electrode order had only a small effect on classification. Overall, the valence results were markedly better than the arousal results, perhaps influenced by features related to brain-activity asymmetry between the left and right hemispheres.
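As a rough illustration of the two factors compared in the abstract, the sketch below segments a multichannel EEG array into fixed-length windows, permutes the electrode (channel) order, and computes the Matthews correlation coefficient used as the evaluation metric. This is a minimal sketch only: the sampling rate, channel count, dummy data, and helper names (segment_windows, reorder_channels) are assumptions for demonstration; the record does not describe the authors' actual preprocessing pipeline or CNN architecture.

# Illustrative sketch only; sampling rate, channel count, and data are assumed.
import numpy as np

FS = 128            # assumed sampling rate in Hz (not stated in the record)
N_CHANNELS = 32     # assumed electrode count (not stated in the record)


def segment_windows(eeg, fs, window_sec):
    """Split a (channels, samples) EEG array into non-overlapping windows.

    Returns an array of shape (n_windows, channels, window_sec * fs).
    """
    win_len = int(window_sec * fs)
    n_windows = eeg.shape[1] // win_len
    trimmed = eeg[:, : n_windows * win_len]
    # (channels, n_windows, win_len) -> (n_windows, channels, win_len)
    return trimmed.reshape(eeg.shape[0], n_windows, win_len).transpose(1, 0, 2)


def reorder_channels(windows, order):
    """Rearrange the channel axis, e.g. to test a different electrode arrangement."""
    return windows[:, order, :]


def matthews_corrcoef_binary(y_true, y_pred):
    """MCC for binary labels: (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom


if __name__ == "__main__":
    # Fake one-minute recording: 32 channels of random "EEG".
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((N_CHANNELS, 60 * FS))

    for window_sec in (8, 10):                    # the two best window sizes in the abstract
        windows = segment_windows(eeg, FS, window_sec)
        print(window_sec, "s ->", windows.shape)  # e.g. (7, 32, 1024) for 8 s windows

    # A reversed electrode order stands in for one alternative channel arrangement.
    reordered = reorder_channels(segment_windows(eeg, FS, 10),
                                 list(range(N_CHANNELS))[::-1])

    # MCC on dummy binary predictions, matching the metric reported in the abstract.
    print(matthews_corrcoef_binary([0, 1, 1, 0, 1], [0, 1, 0, 0, 1]))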

Bibliographic Details
Main Authors: Panayu Keelawat (Department of Computer Science and Engineering, University of California San Diego, La Jolla, CA 92093-0404, USA); Nattapong Thammasan (Human Media Interaction, Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, 7522 NB Enschede, The Netherlands); Masayuki Numao (The Institute of Scientific and Industrial Research, Osaka University, Mihogaoka, Ibaraki, Osaka 567-0047, Japan); Boonserm Kijsirikul (Department of Computer Engineering, Chulalongkorn University, Pathum Wan, Bangkok 10330, Thailand)
Format: Article
Language: English
Published: MDPI AG, 2021-03-01
Series: Sensors, Vol. 21, Iss. 5, Art. No. 1678
ISSN: 1424-8220
DOI: 10.3390/s21051678
Subjects: emotion recognition; EEG; machine learning; CNN; spatiotemporal data; brainwave
Online Access: https://www.mdpi.com/1424-8220/21/5/1678