A Novel DE-CNN-BiLSTM Multi-Fusion Model for EEG Emotion Recognition

As a long-standing research topic in the field of brain–computer interfaces, emotion recognition still suffers from low recognition accuracy. In this research, we present a novel model named DE-CNN-BiLSTM that deeply integrates the complexity of EEG signals, the spatial structure of the brain, and the temporal context of emotion formation. First, we extract the complexity properties of the EEG signal by calculating Differential Entropy (DE) over different time slices of different frequency bands, obtaining 4D feature tensors organized according to brain location. Subsequently, the 4D tensors are fed into a Convolutional Neural Network to learn the brain's spatial structure and output time sequences; a Bidirectional Long Short-Term Memory network is then used to learn past and future information from those time sequences. Compared with existing emotion recognition models, the new model decodes the EEG signal more deeply and extracts key emotional features, improving accuracy. Simulation results show that the algorithm achieves an average accuracy of 94% on the DEAP dataset and 94.82% on the SEED dataset, confirming its high accuracy and strong robustness.
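The feature-extraction step described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the band limits, the 1 s slice length, and the 128 Hz sampling rate (DEAP's preprocessed rate) are illustrative assumptions, and the DE formula 0.5*ln(2*pi*e*var) assumes each band-filtered slice is approximately Gaussian, as is standard for DE features.

    # Sketch of DE feature extraction: band-pass filter each EEG channel into
    # standard frequency bands, cut the signal into short time slices, and
    # compute DE per (slice, channel, band). Illustrative parameters only.
    import numpy as np
    from scipy.signal import butter, filtfilt

    BANDS = {"theta": (4, 8), "alpha": (8, 14), "beta": (14, 31), "gamma": (31, 45)}

    def bandpass(x, low, high, fs, order=4):
        """Zero-phase Butterworth band-pass filter for one channel."""
        b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    def differential_entropy(segment):
        """DE of a Gaussian-distributed segment: 0.5 * ln(2*pi*e*var)."""
        return 0.5 * np.log(2 * np.pi * np.e * np.var(segment))

    def de_features(eeg, fs=128, slice_sec=1.0):
        """eeg: (n_channels, n_samples) -> DE tensor (n_slices, n_channels, n_bands)."""
        n_ch, n_samp = eeg.shape
        step = int(fs * slice_sec)
        n_slices = n_samp // step
        feats = np.zeros((n_slices, n_ch, len(BANDS)))
        for c in range(n_ch):
            for b, (lo, hi) in enumerate(BANDS.values()):
                filtered = bandpass(eeg[c], lo, hi, fs)
                for s in range(n_slices):
                    feats[s, c, b] = differential_entropy(filtered[s * step:(s + 1) * step])
        return feats

    # Example: 32 channels, 60 s of simulated EEG at 128 Hz -> (60, 32, 4).
    rng = np.random.default_rng(0)
    print(de_features(rng.standard_normal((32, 60 * 128))).shape)

In the paper's pipeline the per-channel DE values are additionally arranged on a 2D electrode grid, so each trial becomes a 4D tensor (time slices x bands x grid height x grid width); the channel-to-grid mapping depends on the electrode montage. The spatial-temporal classifier can likewise be sketched as a CNN feeding a bidirectional LSTM. This is only an assumption about the architecture family named in the abstract, not the published layer configuration; the 9x9 grid, channel counts, and hidden size are placeholders.

    # Sketch of a CNN + BiLSTM classifier over DE tensors: the CNN learns spatial
    # patterns from each 2D electrode map, and the BiLSTM reads the resulting
    # per-slice embeddings forward and backward in time.
    import torch
    import torch.nn as nn

    class CNNBiLSTM(nn.Module):
        def __init__(self, n_bands=4, hidden=64, n_classes=2):
            super().__init__()
            self.cnn = nn.Sequential(
                nn.Conv2d(n_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (batch*T, 64)
            )
            self.bilstm = nn.LSTM(64, hidden, batch_first=True, bidirectional=True)
            self.head = nn.Linear(2 * hidden, n_classes)

        def forward(self, x):                 # x: (batch, T, n_bands, grid_h, grid_w)
            b, t = x.shape[:2]
            z = self.cnn(x.flatten(0, 1))     # spatial features per time slice
            z = z.view(b, t, -1)
            out, _ = self.bilstm(z)           # past and future context per slice
            return self.head(out[:, -1])      # classify from the last time step

    # Example forward pass: 8 trials, 60 slices, 4 bands, 9x9 grid -> (8, 2) logits.
    print(CNNBiLSTM()(torch.randn(8, 60, 4, 9, 9)).shape)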

Bibliographic Details
Main Authors: Fachang Cui, Ruqing Wang, Weiwei Ding, Yao Chen, Liya Huang
Format: Article
Language: English
Published: MDPI AG, 2022-02-01
Series: Mathematics
ISSN: 2227-7390
Subjects: emotion recognition; DE; temporal and spatial feature; DE-CNN-BiLSTM
Online Access: https://www.mdpi.com/2227-7390/10/4/582
DOI: 10.3390/math10040582
Author affiliations: College of Electronic and Optical Engineering, Nanjing University of Posts and Telecommunications, Nanjing 210023, China (all five authors)