Emotion Classification from Multi-Band Electroencephalogram Data Using Dynamic Simplifying Graph Convolutional Network and Channel Style Recalibration Module


Bibliographic Details
Main Authors: Xiaoliang Zhu, Gendong Liu, Liang Zhao, Wenting Rong, Junyi Sun, Ran Liu
Format: Article
Language:English
Published: MDPI AG 2023-02-01
Series:Sensors
Subjects: EEG; emotion classification; graph neural network; SRM; channel selection
Online Access:https://www.mdpi.com/1424-8220/23/4/1917
collection DOAJ
description Because of its ability to reflect people’s emotional states objectively, the electroencephalogram (EEG) has attracted increasing research attention for emotion classification. Classification methods based on spatial-domain analysis are a research hotspot. However, most previous studies have ignored the complementarity of information across frequency bands, and the information within a single band is not fully mined, which increases computational time and makes it harder to improve classification accuracy. To address these problems, this study proposes an emotion classification method based on dynamic simplifying graph convolutional (SGC) networks and a style recalibration module (SRM) for channels, termed SGC-SRM, which takes multi-band EEG data as input. Specifically, first, the graph structure is constructed from the differential entropy features of each sub-band, and the internal relationships between channels are learned dynamically through SGC networks. Second, a convolution layer based on the SRM is introduced to recalibrate channel features and extract more emotion-related features. Third, the extracted sub-band features are fused at the feature level and classified. In addition, to reduce redundant information between EEG channels and the computational time, (1) only 12 channels suitable for emotion classification are adopted to optimize the recognition algorithm, saving approximately 90.5% of the time cost compared with using all channels; and (2) information in the θ, α, β, and γ bands is adopted, saving 23.3% of the time consumed compared with using all bands while maintaining almost the same classification accuracy. Finally, a subject-independent experiment is conducted on the public SEED dataset using the leave-one-subject-out cross-validation strategy. According to the experimental results, SGC-SRM improves classification accuracy by 5.51–15.43% compared with existing methods.
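The differential entropy features the abstract mentions are commonly computed per channel and per band under a Gaussianity assumption, giving DE = ½·ln(2πe·σ²). A minimal sketch of that feature extraction follows; the function names, the 200 Hz sampling rate, and the exact band edges for θ, α, β, and γ are illustrative assumptions, not taken from the authors' code.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def differential_entropy(segment):
    """DE of a band-limited EEG segment, assuming it is approximately
    Gaussian: DE = 0.5 * ln(2 * pi * e * variance)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(segment))

def band_de_features(eeg, fs=200, bands=((4, 8), (8, 14), (14, 31), (31, 50))):
    """eeg: (channels, samples) array. Returns a (channels, n_bands) matrix
    of DE features for the theta, alpha, beta, and gamma sub-bands."""
    feats = np.empty((eeg.shape[0], len(bands)))
    for j, (lo, hi) in enumerate(bands):
        # 4th-order Butterworth band-pass, zero-phase filtered
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=1)
        feats[:, j] = [differential_entropy(ch) for ch in filtered]
    return feats
```

With 12 selected channels and 4 bands, each EEG segment reduces to a compact 12 × 4 feature matrix that serves as the node features of the graph.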
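The SGC networks in the abstract follow the simplified graph convolution idea (Wu et al., 2019), which collapses K propagation steps into Y = SᴷXW over a normalized adjacency S. A sketch under the assumption that the adjacency A is a learnable matrix (the "dynamic" part) rather than a fixed electrode-distance graph:

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization S = D^{-1/2} (A + I) D^{-1/2},
    with self-loops added so every node keeps its own features."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def sgc_forward(X, A, W, K=2):
    """Simplified graph convolution: Y = S^K X W.
    X: (nodes, in_feats) DE features per EEG channel;
    A: (nodes, nodes) non-negative adjacency, learnable in training;
    W: (in_feats, out_feats) weight matrix."""
    S = normalize_adj(A)
    H = X
    for _ in range(K):          # K-hop feature propagation, no nonlinearity
        H = S @ H
    return H @ W                # single linear transform at the end
```

Because the propagation S @ H contains no trainable parameters, an SGC layer is considerably cheaper than a stack of standard GCN layers, which fits the paper's emphasis on reducing computational time.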
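The style recalibration module (SRM, Lee et al., 2019) referenced in the abstract gates each channel by a weight derived from its own "style" statistics (mean and standard deviation). A minimal NumPy sketch of that recalibration step; the batch-normalization stage of the original SRM is omitted here, and the function and parameter names are illustrative:

```python
import numpy as np

def srm_recalibrate(x, w):
    """Style-based channel recalibration sketch.
    x: (channels, length) feature maps;
    w: (channels, 2) learnable per-channel weights that combine the
    mean/std style statistics into one scalar per channel."""
    mu = x.mean(axis=1)                 # style pooling: per-channel mean
    sigma = x.std(axis=1)               # style pooling: per-channel std
    z = w[:, 0] * mu + w[:, 1] * sigma  # channel-wise style integration
    g = 1.0 / (1.0 + np.exp(-z))        # sigmoid gate in (0, 1)
    return x * g[:, None]               # re-weight each channel's features
```

The gate g suppresses channels whose statistics carry little emotion-related information and emphasizes the rest, which is the recalibration effect the paper exploits before feature-level fusion.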
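The leave-one-subject-out (LOSO) protocol used for the subject-independent experiment can be sketched as follows; `fit` and `predict` stand in for any trainable classifier and are placeholders, not the authors' API:

```python
import numpy as np

def leave_one_subject_out(features, labels, subjects, fit, predict):
    """Subject-independent evaluation: each subject in turn becomes the
    test set while all remaining subjects form the training set.
    Returns the per-subject classification accuracies."""
    accs = []
    for s in np.unique(subjects):
        test = subjects == s                       # held-out subject's trials
        model = fit(features[~test], labels[~test])
        pred = predict(model, features[test])
        accs.append(float(np.mean(pred == labels[test])))
    return accs
```

Because no trial from the test subject is ever seen during training, LOSO accuracy reflects generalization to entirely new people, which is why it is the standard protocol for subject-independent EEG emotion classification on SEED.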
id doaj.art-f351c72d002f4fd3a81afa37265efcfa
issn 1424-8220
citation Sensors, vol. 23, no. 4, article 1917, 2023-02-01; doi 10.3390/s23041917
affiliations Xiaoliang Zhu, Gendong Liu, Liang Zhao, Wenting Rong, Junyi Sun: National Engineering Research Center of Educational Big Data, Central China Normal University, Wuhan 430079, China
Ran Liu: Engineering Product Development Pillar, Singapore University of Technology and Design, Singapore 487372, Singapore
topic EEG
emotion classification
graph neural network
SRM
channel selection
url https://www.mdpi.com/1424-8220/23/4/1917