See, Hear, or Feel – to Speak: A Versatile Multiple-Choice Functional Near-Infrared Spectroscopy-Brain-Computer Interface Feasible With Visual, Auditory, or Tactile Instructions
Main Authors: Laurien Nagels-Coune, Lars Riecke, Amaia Benitez-Andonegui, Simona Klinkhammer, Rainer Goebel, Peter De Weerd, Michael Lührs, Bettina Sorger
Format: Article
Language: English
Published: Frontiers Media S.A., 2021-11-01
Series: Frontiers in Human Neuroscience
Subjects: functional near-infrared spectroscopy (fNIRS); brain-computer interface (BCI); motor imagery (MI); mental drawing; sensory encoding modality; four-choice communication
Online Access: https://www.frontiersin.org/articles/10.3389/fnhum.2021.784522/full
_version_ | 1819039487873777664 |
author | Laurien Nagels-Coune; Lars Riecke; Amaia Benitez-Andonegui; Simona Klinkhammer; Rainer Goebel; Peter De Weerd; Michael Lührs; Bettina Sorger
author_sort | Laurien Nagels-Coune |
collection | DOAJ |
description | Severely motor-disabled patients, such as those suffering from the so-called “locked-in” syndrome, cannot communicate naturally. They may benefit from brain-computer interfaces (BCIs) exploiting brain signals for communication and thereby circumventing the muscular system. One BCI technique that has gained attention recently is functional near-infrared spectroscopy (fNIRS). Typically, fNIRS-based BCIs allow for brain-based communication via voluntary modulation of brain activity through mental task performance guided by visual or auditory instructions. While the development of fNIRS-BCIs has made great progress, the reliability of fNIRS-BCIs across time and environments has rarely been assessed. In the present fNIRS-BCI study, we tested six healthy participants across three consecutive days using a straightforward four-choice fNIRS-BCI communication paradigm that allows answer encoding based on instructions using various sensory modalities. To encode an answer, participants performed a motor imagery task (mental drawing) in one out of four time periods. Answer encoding was guided by either the visual, auditory, or tactile sensory modality. Two participants were tested outside the laboratory in a cafeteria. Answers were decoded from the time course of the most-informative fNIRS channel-by-chromophore combination. Across the three testing days, we obtained mean single- and multi-trial (joint analysis of four consecutive trials) accuracies of 62.5 and 85.19%, respectively. Obtained multi-trial accuracies were 86.11% for visual, 80.56% for auditory, and 88.89% for tactile sensory encoding. The two participants who used the fNIRS-BCI in a cafeteria obtained the best single-trial (72.22 and 77.78%) and multi-trial accuracies (100 and 94.44%). Communication was reliable over the three recording sessions, with multi-trial accuracies of 86.11% on day 1, 86.11% on day 2, and 83.33% on day 3. To gauge the trade-off between the number of optodes and decoding accuracy, averaging across two and three promising fNIRS channels was compared to the one-channel approach. Multi-trial accuracy increased from 85.19% (one-channel approach) to 91.67% (two-/three-channel approach). In sum, the presented fNIRS-BCI yielded robust decoding results using three alternative sensory encoding modalities. Further, fNIRS-BCI communication was stable over the course of three consecutive days, even in a natural (social) environment. The developed fNIRS-BCI thereby demonstrated high flexibility, reliability, and robustness, which are crucial requirements for future clinical applicability.
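The description above only summarizes the decoding approach: the answer is read out from the time course of the most-informative channel-by-chromophore combination, with single- and multi-trial analyses. As an illustration of how such a four-choice, time-period-based encoding scheme can be decoded in principle, the sketch below correlates a channel time course with an expected hemodynamic template for each of the four answer periods and picks the best match. The double-gamma HRF, the boxcar encoding model, trial averaging for the multi-trial case, and all names and parameter values are assumptions for this example, not the procedure used in the article (see the full text at the link above).

```python
# Minimal, illustrative sketch only (not the authors' pipeline): decoding a
# four-choice answer from a single fNIRS channel/chromophore time course by
# template matching. HRF shape, boxcar model, trial averaging, and all
# parameters are assumptions made for this example.
import numpy as np
from scipy.stats import gamma


def canonical_hrf(t, peak=6.0, undershoot=16.0, ratio=1.0 / 6.0):
    """Rough double-gamma hemodynamic response function (assumed shape)."""
    return gamma.pdf(t, peak) - ratio * gamma.pdf(t, undershoot)


def decode_single_trial(signal, fs, period_onsets, period_dur):
    """Return the index (0..3) of the encoding period whose expected
    hemodynamic template correlates best with the measured signal."""
    t = np.arange(len(signal)) / fs
    kernel = canonical_hrf(np.arange(0.0, 30.0, 1.0 / fs))
    scores = []
    for onset in period_onsets:
        # Boxcar marking "mental drawing" during this period, convolved
        # with the HRF to obtain the expected response shape.
        boxcar = ((t >= onset) & (t < onset + period_dur)).astype(float)
        template = np.convolve(boxcar, kernel)[: len(signal)]
        scores.append(np.corrcoef(signal, template)[0, 1])
    return int(np.argmax(scores))


def decode_multi_trial(trials, fs, period_onsets, period_dur):
    """One simple form of joint analysis: average four consecutive trials
    of the same answer before template matching."""
    return decode_single_trial(np.mean(trials, axis=0), fs,
                               period_onsets, period_dur)


if __name__ == "__main__":
    # Simulated example: four 10-s encoding periods at 10 Hz sampling;
    # the simulated "participant" encodes answer 2.
    fs, period_dur = 10.0, 10.0
    onsets = [0.0, 15.0, 30.0, 45.0]
    t = np.arange(0.0, 60.0, 1.0 / fs)
    kernel = canonical_hrf(np.arange(0.0, 30.0, 1.0 / fs))
    truth = ((t >= onsets[2]) & (t < onsets[2] + period_dur)).astype(float)
    signal = np.convolve(truth, kernel)[: len(t)] + 0.05 * np.random.randn(len(t))
    print(decode_single_trial(signal, fs, onsets, period_dur))  # expected: 2
```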
first_indexed | 2024-12-21T08:54:00Z |
format | Article |
id | doaj.art-065239256e3040dbab5cf004d4b85dfc |
institution | Directory Open Access Journal |
issn | 1662-5161 |
language | English |
last_indexed | 2024-12-21T08:54:00Z |
publishDate | 2021-11-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Human Neuroscience |
spelling | DOI 10.3389/fnhum.2021.784522; Frontiers in Human Neuroscience, vol. 15, article 784522 (2021-11-01). Author affiliations:
Laurien Nagels-Coune: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Brain Imaging Center, Maastricht, Netherlands; Zorggroep Sint-Kamillus, Bierbeek, Belgium
Lars Riecke: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Brain Imaging Center, Maastricht, Netherlands
Amaia Benitez-Andonegui: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Brain Imaging Center, Maastricht, Netherlands; MEG Core Facility, National Institutes of Mental Health, Bethesda, MD, United States
Simona Klinkhammer: Department of Psychiatry and Neuropsychology, Faculty of Health Medicine and Life Sciences, Maastricht University, Maastricht, Netherlands; School for Mental Health and Neuroscience, Maastricht University, Maastricht, Netherlands
Rainer Goebel: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Brain Imaging Center, Maastricht, Netherlands; Brain Innovation B.V., Maastricht, Netherlands
Peter De Weerd: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Brain Imaging Center, Maastricht, Netherlands; Maastricht Centre for Systems Biology, Maastricht University, Maastricht, Netherlands
Michael Lührs: Brain Innovation B.V., Maastricht, Netherlands
Bettina Sorger: Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands; Maastricht Brain Imaging Center, Maastricht, Netherlands
title | See, Hear, or Feel – to Speak: A Versatile Multiple-Choice Functional Near-Infrared Spectroscopy-Brain-Computer Interface Feasible With Visual, Auditory, or Tactile Instructions |
topic | functional near-infrared spectroscopy (fNIRS); brain-computer interface (BCI); motor imagery (MI); mental drawing; sensory encoding modality; four-choice communication
url | https://www.frontiersin.org/articles/10.3389/fnhum.2021.784522/full |