Dual attentive fusion for EEG-based brain-computer interfaces

Bibliographic Details
Main Authors: Yuanhua Du, Jian Huang, Xiuyu Huang, Kaibo Shi, Nan Zhou
Format: Article
Language: English
Published: Frontiers Media S.A. 2022-11-01
Series: Frontiers in Neuroscience
Subjects: brain-computer interface; electroencephalography; P300; motor imagery; dual attentive fusion
Online Access: https://www.frontiersin.org/articles/10.3389/fnins.2022.1044631/full
collection DOAJ
description Electroencephalogram (EEG) classification is a challenging task in the brain-computer interface (BCI) field because EEG data have a low signal-to-noise ratio. Most current deep-learning studies address this challenge by designing a convolutional neural network (CNN) to learn from and classify the raw EEG signals. However, a CNN alone may not capture the highly discriminative patterns of EEG, because it does not explicitly model attentive spatial and temporal dynamics. To improve information utilization, this study proposes a Dual Attentive Fusion Model (DAFM) for EEG-based BCI. DAFM captures spatial and temporal information by modeling the interdependencies between the features extracted from the EEG signals. To the best of our knowledge, our method is the first to fuse the spatial and temporal dimensions in an interactive attention module, which improves the expressive power of the extracted features. Extensive experiments on four publicly available datasets demonstrate that our method outperforms state-of-the-art methods and confirm the effectiveness of the Dual Attentive Fusion Module.
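The record does not describe DAFM's actual architecture, so the following is only an illustrative sketch of the general idea the abstract names: weighting an EEG trial jointly along its spatial (channel) and temporal (sample) dimensions and fusing the two attention maps. The function name, shapes, and scoring rules below are all assumptions for illustration, not the authors' published model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dual_attentive_fusion(eeg):
    """Toy dual (spatial + temporal) attention over one EEG trial.

    eeg: array of shape (n_channels, n_samples).
    Returns a reweighted array of the same shape.
    """
    n_channels, n_samples = eeg.shape

    # Spatial attention: score each channel by its mean absolute
    # activity, then softmax into per-channel weights.
    spatial_w = softmax(np.abs(eeg).mean(axis=1))    # (n_channels,)

    # Temporal attention: score each time step by its mean absolute
    # activity across channels, then softmax into per-sample weights.
    temporal_w = softmax(np.abs(eeg).mean(axis=0))   # (n_samples,)

    # Fusion: the outer product of the two weight vectors gives a joint
    # spatial-temporal mask; rescale so the mask averages to 1 and the
    # overall signal energy is roughly preserved.
    mask = np.outer(spatial_w, temporal_w) * (n_channels * n_samples)
    return eeg * mask

rng = np.random.default_rng(0)
trial = rng.standard_normal((8, 128))   # e.g. 8 channels, 128 samples
out = dual_attentive_fusion(trial)
print(out.shape)                        # (8, 128)
```

In a learned model the channel and time scores would come from trainable projections rather than mean absolute activity, but the fusion step, combining both weight maps into one joint mask, is the part the abstract emphasizes.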
id doaj.art-55492afbfd324e759fe5a7ba7023b30f
institution Directory Open Access Journal
issn 1662-453X
doi 10.3389/fnins.2022.1044631
affiliations Yuanhua Du: College of Applied Mathematics, Chengdu University of Information Technology, Chengdu, China
Jian Huang: College of Applied Mathematics, Chengdu University of Information Technology, Chengdu, China
Xiuyu Huang: Centre for Smart Health, The Hong Kong Polytechnic University, Hong Kong, Hong Kong SAR, China
Kaibo Shi: School of Electronic Information and Electronic Engineering, Chengdu University, Chengdu, China
Nan Zhou: School of Electronic Information and Electronic Engineering, Chengdu University, Chengdu, China
topic brain-computer interface
electroencephalography
P300
motor imagery
dual attentive fusion