Enhancing motor imagery detection efficacy using multisensory virtual reality priming

Brain-computer interfaces (BCI) have been developed to allow users to communicate with the external world by translating brain activity into control signals. Motor imagery (MI) has been a popular paradigm in BCI control, in which the user imagines movements of, e.g., their left and right limbs, and classifiers are then trained to detect such intent directly from electroencephalography (EEG) signals. For some users, however, it is difficult to elicit patterns in the EEG signal that can be detected with existing features and classifiers. As such, new user control strategies and training paradigms have been highly sought after to help improve motor imagery performance. Virtual reality (VR) has emerged as one potential tool, where improvements in user engagement and level of immersion have been shown to improve BCI accuracy. Motor priming in VR, in turn, has been shown to further enhance BCI accuracy. In this pilot study, we take the first steps to explore whether multisensory VR motor priming, in which haptic and olfactory stimuli are present, can improve motor imagery detection efficacy in terms of both improved accuracy and faster detection. Experiments with 10 participants equipped with a biosensor-embedded VR headset, an off-the-shelf scent diffusion device, and a haptic glove with force feedback showed that significant improvements in motor imagery detection could be achieved. Increased activity in the six common spatial pattern filters used was also observed, and peak accuracy could be achieved with analysis windows that were 2 s shorter. Combined, the results suggest that multisensory motor priming prior to motor imagery could improve detection efficacy.
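
The article itself does not ship analysis code; the following is a minimal, illustrative sketch (not the authors' pipeline) of the kind of common spatial pattern (CSP) plus linear-classifier approach the abstract alludes to, using six CSP filters and 2 s analysis windows to mirror the values quoted above. It assumes an MNE-Python/scikit-learn stack and uses synthetic EEG-shaped data; channel count, sampling rate, and trial count are hypothetical.

```python
# Illustrative sketch: left- vs. right-hand motor imagery classification from
# EEG epochs using CSP spatial filters and LDA. Synthetic data stands in for
# real recordings; all dimensions below are assumptions, not from the paper.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
n_trials, n_channels, sfreq, win_s = 80, 8, 250, 2.0   # hypothetical setup
X = rng.standard_normal((n_trials, n_channels, int(sfreq * win_s)))
y = rng.integers(0, 2, n_trials)                        # 0 = left MI, 1 = right MI

# CSP learns spatial filters that maximize the variance difference between the
# two classes; the log-variance of the filtered signals feeds a linear classifier.
clf = Pipeline([
    ("csp", CSP(n_components=6, log=True)),
    ("lda", LinearDiscriminantAnalysis()),
])
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

With real MI data, shortening the epoch length (the `win_s` value here) is one way to probe the "faster detection" question the abstract raises, since accuracy can be compared across analysis-window sizes.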

Bibliographic Details
Main Authors: Reza Amini Gougeh, Tiago H. Falk
Format: Article
Language: English
Published: Frontiers Media S.A., 2023-04-01
Series: Frontiers in Neuroergonomics
ISSN: 2673-6195
DOI: 10.3389/fnrgo.2023.1080200
Subjects: brain-computer interface; motor imagery; multisensory priming; virtual reality; haptics; force feedback
Online Access: https://www.frontiersin.org/articles/10.3389/fnrgo.2023.1080200/full