ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement
A brain-computer interface (BCI), which provides an advanced direct human-machine interaction, has gained substantial research interest in the last decade for its great potential in various applications including rehabilitation and communication. Among them, the P300-based BCI speller is a typical application that is capable of identifying the expected stimulated characters.
Main Authors: | Zehui Wang, Chuangquan Chen, Junhua Li, Feng Wan, Yu Sun, Hongtao Wang |
Format: | Article |
Language: | English |
Published: | IEEE, 2023-01-01 |
Series: | IEEE Transactions on Neural Systems and Rehabilitation Engineering |
Subjects: | Brain-computer interfaces (BCIs); capsule network; P300; attention |
Online Access: | https://ieeexplore.ieee.org/document/10018278/ |
_version_ | 1797805048587091968 |
author | Zehui Wang; Chuangquan Chen; Junhua Li; Feng Wan; Yu Sun; Hongtao Wang
author_facet | Zehui Wang; Chuangquan Chen; Junhua Li; Feng Wan; Yu Sun; Hongtao Wang
author_sort | Zehui Wang |
collection | DOAJ |
description | A brain-computer interface (BCI), which provides an advanced, direct form of human-machine interaction, has gained substantial research interest in the last decade for its great potential in various applications, including rehabilitation and communication. Among them, the P300-based BCI speller is a typical application capable of identifying the expected stimulated characters. However, the applicability of the P300 speller is hampered by its low recognition rate, partially attributable to the complex spatio-temporal characteristics of EEG signals. Here, we developed a deep-learning analysis framework named ST-CapsNet to improve P300 detection using a capsule network with both spatial and temporal attention modules. Specifically, we first employed the spatial and temporal attention modules to obtain refined EEG signals by capturing event-related information. The refined signals were then fed into the capsule network for discriminative feature extraction and P300 detection. To quantitatively assess the performance of the proposed ST-CapsNet, two publicly available datasets (i.e., Dataset IIb of BCI Competition 2003 and Dataset II of BCI Competition III) were used. A new metric, averaged symbols under repetitions (ASUR), was adopted to evaluate the cumulative effect of symbol recognition under different repetitions. In comparison with several widely used methods (i.e., LDA, ERP-CapsNet, CNN, MCNN, SWFP, and MsCNN-TL-ESVM), the proposed ST-CapsNet framework significantly outperformed the state of the art in terms of ASUR. More interestingly, the absolute values of the spatial filters learned by ST-CapsNet are higher over the parietal lobe and occipital region, which is consistent with the generation mechanism of the P300. |
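The abstract describes refining EEG epochs with spatial attention (reweighting electrodes) and temporal attention (reweighting time samples) before classification. The record gives no implementation details, so the following is only a minimal illustrative sketch in numpy of what such attention reweighting could look like; the scoring heuristics, function names, and shapes here are assumptions, not the authors' learned modules.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatial_attention(eeg):
    # eeg: (channels, samples). Weight each channel by the softmax of its
    # mean absolute amplitude -- a crude, fixed stand-in for a learned
    # spatial attention module that emphasises informative electrodes.
    scores = np.abs(eeg).mean(axis=1)        # one score per channel
    weights = softmax(scores)                # (channels,), sums to 1
    return eeg * weights[:, None]

def temporal_attention(eeg):
    # Weight each time sample by the softmax of its cross-channel energy,
    # emphasising latencies with strong event-related activity
    # (for the P300, roughly 300 ms post-stimulus).
    scores = (eeg ** 2).mean(axis=0)         # one score per sample
    weights = softmax(scores)                # (samples,), sums to 1
    return eeg * weights[None, :]

rng = np.random.default_rng(0)
epoch = rng.standard_normal((64, 240))       # 64 channels, 240 samples
refined = temporal_attention(spatial_attention(epoch))
print(refined.shape)                         # (64, 240)
```

In the paper's pipeline the refined tensor would then feed a capsule network for feature extraction and P300 detection; here the attention weights are handcrafted rather than trained, which is the key simplification of this sketch.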
first_indexed | 2024-03-13T05:46:10Z |
format | Article |
id | doaj.art-cfea1edd203042c183830681aaaba886 |
institution | Directory Open Access Journal |
issn | 1558-0210 |
language | English |
last_indexed | 2024-03-13T05:46:10Z |
publishDate | 2023-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Transactions on Neural Systems and Rehabilitation Engineering |
spelling | doaj.art-cfea1edd203042c183830681aaaba886 (updated 2023-06-13T20:10:29Z); eng; IEEE; IEEE Transactions on Neural Systems and Rehabilitation Engineering; ISSN 1558-0210; 2023-01-01; vol. 31, pp. 991–1000; DOI 10.1109/TNSRE.2023.3237319; IEEE document 10018278. ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement. Authors: Zehui Wang (Faculty of Intelligent Manufacturing, Wuyi University, Jiangmen, China); Chuangquan Chen (https://orcid.org/0000-0002-3811-296X; Faculty of Intelligent Manufacturing, Wuyi University, Jiangmen, China); Junhua Li (https://orcid.org/0000-0001-5078-1712; Faculty of Intelligent Manufacturing, Wuyi University, Jiangmen, China); Feng Wan (https://orcid.org/0000-0002-9359-0737; Department of Electrical and Computer Engineering, Faculty of Science and Engineering, and the Centre for Cognitive and Brain Sciences, Institute of Collaborative Innovation, University of Macau, Macau, China); Yu Sun (https://orcid.org/0000-0002-6666-8586; Department of Biomedical Engineering, Key Laboratory for Biomedical Engineering of Ministry of Education of China, Zhejiang University, Hangzhou, China); Hongtao Wang (https://orcid.org/0000-0002-6564-5753; Faculty of Intelligent Manufacturing, Wuyi University, Jiangmen, China). Abstract as in the description field. https://ieeexplore.ieee.org/document/10018278/ Brain-computer interfaces (BCIs); capsule network; P300; attention |
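The record defines ASUR only as "the cumulative effect of symbol recognition under different repetitions". One plausible reading, sketched below, is the mean number of correctly recognized symbols across repetition counts; the function name and this formula are assumptions, and the paper's exact definition may differ.

```python
def asur(correct_by_repetition):
    """Averaged symbols under repetitions (assumed formulation).

    correct_by_repetition[r-1] is the number of symbols recognized
    correctly when the classifier averages the first r stimulus
    repetitions. ASUR summarizes the cumulative effect as the mean
    across all repetition counts, rewarding methods that already
    decode well at low repetition numbers.
    """
    return sum(correct_by_repetition) / len(correct_by_repetition)

# Hypothetical session: accuracy improves as more repetitions are averaged.
print(asur([4, 6, 8, 9, 10]))   # 7.4
```

Under this reading, two classifiers reaching the same final accuracy are separated by how quickly they get there, which matches the stated purpose of comparing methods across repetitions.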
spellingShingle | Zehui Wang; Chuangquan Chen; Junhua Li; Feng Wan; Yu Sun; Hongtao Wang; ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement; IEEE Transactions on Neural Systems and Rehabilitation Engineering; Brain-computer interfaces (BCIs); capsule network; P300; attention |
title | ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement |
title_full | ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement |
title_fullStr | ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement |
title_full_unstemmed | ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement |
title_short | ST-CapsNet: Linking Spatial and Temporal Attention With Capsule Network for P300 Detection Improvement |
title_sort | st capsnet linking spatial and temporal attention with capsule network for p300 detection improvement |
topic | Brain-computer interfaces (BCIs); capsule network; P300; attention |
url | https://ieeexplore.ieee.org/document/10018278/ |
work_keys_str_mv | AT zehuiwang stcapsnetlinkingspatialandtemporalattentionwithcapsulenetworkforp300detectionimprovement AT chuangquanchen stcapsnetlinkingspatialandtemporalattentionwithcapsulenetworkforp300detectionimprovement AT junhuali stcapsnetlinkingspatialandtemporalattentionwithcapsulenetworkforp300detectionimprovement AT fengwan stcapsnetlinkingspatialandtemporalattentionwithcapsulenetworkforp300detectionimprovement AT yusun stcapsnetlinkingspatialandtemporalattentionwithcapsulenetworkforp300detectionimprovement AT hongtaowang stcapsnetlinkingspatialandtemporalattentionwithcapsulenetworkforp300detectionimprovement |