A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli
To date, traditional visual-based event-related potential brain-computer interface (ERP-BCI) systems continue to dominate mainstream BCI research. However, these conventional BCIs are unsuitable for individuals who have partly or completely lost their vision. Considering the poor performance...
Main Authors: | Boyang Zhang, Zongtan Zhou, Jing Jiang |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-08-01 |
Series: | Brain Sciences |
Subjects: | BCI; ERP; auditory; electro-tactile; bimodal stimulus; location-congruent |
Online Access: | https://www.mdpi.com/2076-3425/10/8/524 |
author | Boyang Zhang; Zongtan Zhou; Jing Jiang |
collection | DOAJ |
description | To date, traditional visual-based event-related potential brain-computer interface (ERP-BCI) systems continue to dominate mainstream BCI research. However, these conventional BCIs are unsuitable for individuals who have partly or completely lost their vision. Given the poor performance of gaze-independent ERP-BCIs, techniques that improve the performance of these systems merit study. In this paper, we developed a novel 36-class bimodal ERP-BCI system based on tactile and auditory stimuli, in which six-virtual-direction audio files produced via head-related transfer functions (HRTFs) were delivered through headphones while location-congruent electro-tactile stimuli were simultaneously delivered to the corresponding positions using electrodes placed on the abdomen and waist. We selected the eight best channels, trained a Bayesian linear discriminant analysis (BLDA) classifier, and determined the optimal number of trials for target selection in the online process. The average online information transfer rate (ITR) of the bimodal ERP-BCI reached 11.66 bit/min, improvements of 35.11% and 36.69% over the auditory (8.63 bit/min) and tactile (8.53 bit/min) approaches, respectively. These results demonstrate that the performance of the bimodal system is superior to that of each unimodal system and indicate that the proposed bimodal system has potential utility as a gaze-independent BCI in future real-world applications. |
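The abstract reports the online information transfer rate (11.66 bit/min for the bimodal system versus 8.63 and 8.53 bit/min for the unimodal conditions) without stating which ITR definition was used. A minimal sketch assuming the standard Wolpaw formula for an N-class selection task; the accuracy and selection-time values below are hypothetical placeholders, since the record gives only the resulting ITRs:

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, seconds_per_selection: float) -> float:
    """Wolpaw ITR in bit/min for an N-class selection task.

    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)),
    scaled by the number of selections per minute.
    """
    n, p = n_classes, accuracy
    bits = math.log2(n)
    if 0.0 < p < 1.0:  # at P=1 the error terms vanish; at P=0 the formula degenerates
        bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n - 1))
    return bits * 60.0 / seconds_per_selection

# Hypothetical example for the 36-class paradigm: 80% accuracy,
# 18 s per selection (both values assumed, not from the record).
print(round(wolpaw_itr(36, 0.80, 18.0), 2))  # prints 11.41
```

Note that Wolpaw's formula assumes uniformly distributed targets and equal error probabilities across the remaining N-1 classes; published ITRs can also depend on whether inter-selection pauses are counted in the selection time.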
format | Article |
id | doaj.art-5f04394f5ec042b7b26852cd7f250855 |
institution | Directory Open Access Journal |
issn | 2076-3425 |
language | English |
publishDate | 2020-08-01 |
publisher | MDPI AG |
record_format | Article |
series | Brain Sciences |
spelling | doaj.art-5f04394f5ec042b7b26852cd7f250855 2023-11-20T09:19:49Z eng MDPI AG Brain Sciences 2076-3425 2020-08-01 Vol. 10, Iss. 8, Art. 524 doi:10.3390/brainsci10080524 A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli. Boyang Zhang (College of Intelligence Science and Technology, National University of Defense Technology, Changsha 410073, China); Zongtan Zhou (College of Intelligence Science and Technology, National University of Defense Technology, Changsha 410073, China); Jing Jiang (National Key Laboratory of Human Factors Engineering, China Astronaut Research and Training Center, Beijing 100094, China). https://www.mdpi.com/2076-3425/10/8/524 BCI; ERP; auditory; electro-tactile; bimodal stimulus; location-congruent |
title | A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli |
topic | BCI; ERP; auditory; electro-tactile; bimodal stimulus; location-congruent |
url | https://www.mdpi.com/2076-3425/10/8/524 |