A deep learning-based brain-computer interaction system for speech and motor impairment

Abstract Accidents, strokes, or diseases can leave a person with both motor and speech disabilities, making it difficult to communicate with others. People with paralysis face daily challenges in meeting their basic needs, particularly if they also have difficulty speaking. Individuals with dysarthria, amyotrophic lateral sclerosis, and similar conditions may produce speech that is difficult for others to understand. The proposed system for automatic recognition of basic daily needs aims to improve the quality of life of individuals with dysarthria and quadriplegic paralysis. It does so by recognizing and analyzing brain signals and converting them into either audible voice commands or text messages sent to a healthcare provider's mobile phone, depending on the system settings. The system uses a convolutional neural network (CNN) model to detect event-related potentials (ERPs) in the EEG signal and thereby select one of six basic daily needs while their images are displayed in random order. Ten volunteers participated in the study, contributing to the dataset used for training, testing, and validation. The proposed approach achieved an accuracy of 78.41%.
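The record does not include the article's code. As a rough illustration of the approach the abstract outlines (scoring EEG epochs for an ERP and choosing the need whose flashed image elicited the strongest response), the sketch below assumes a Keras/TensorFlow implementation, a hypothetical 8-channel montage, 128-sample post-stimulus epochs, and a binary target/non-target formulation; the helper names (build_erp_cnn, select_need) are illustrative and the paper's actual architecture, preprocessing, and electrode setup may differ.

from tensorflow.keras import layers, models

# Hypothetical recording parameters -- not stated in this record.
N_CHANNELS = 8    # assumed electrode count
N_SAMPLES = 128   # assumed samples per post-stimulus epoch

def build_erp_cnn(n_channels=N_CHANNELS, n_samples=N_SAMPLES):
    """Small CNN that scores a single EEG epoch for the presence of an ERP."""
    model = models.Sequential([
        layers.Input(shape=(n_channels, n_samples, 1)),
        # Temporal convolution: learn band-limited filters along the time axis.
        layers.Conv2D(8, (1, 32), padding="same"),
        layers.BatchNormalization(),
        # Spatial convolution: combine all channels into learned montages.
        layers.Conv2D(16, (n_channels, 1), activation="elu"),
        layers.BatchNormalization(),
        layers.AveragePooling2D((1, 4)),
        layers.Dropout(0.5),
        layers.Conv2D(16, (1, 16), padding="same", activation="elu"),
        layers.AveragePooling2D((1, 4)),
        layers.Dropout(0.5),
        layers.Flatten(),
        layers.Dense(1, activation="sigmoid"),  # P(ERP present in this epoch)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

def select_need(model, epochs_by_need):
    """Pick the daily need whose flashed image elicited the strongest ERP response.

    epochs_by_need maps each of the six needs to an array of shape
    (n_repetitions, N_CHANNELS, N_SAMPLES, 1) holding the epochs recorded
    after that need's image was shown.
    """
    scores = {need: float(model.predict(x, verbose=0).mean())
              for need, x in epochs_by_need.items()}
    return max(scores, key=scores.get)

In a P300-style setup, averaging the model's score over several repetitions of each image, as select_need does, makes the selection more robust to single-trial noise; the winning need could then be routed either to a text-to-speech engine or to a text message, as the abstract describes.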

Bibliographic Details
Main Author: Nader A. Rahman Mohamed
Format: Article
Language: English
Published: SpringerOpen, 2023-05-01
Series: Journal of Engineering and Applied Science
Subjects: Electroencephalogram (EEG); Brain-Computer Interface (BCI); Event Related Potential (ERP); Convolutional Neural Network (CNN)
Online Access: https://doi.org/10.1186/s44147-023-00212-w
_version_ 1797827501781680128
author Nader A. Rahman Mohamed
author_facet Nader A. Rahman Mohamed
author_sort Nader A. Rahman Mohamed
collection DOAJ
description Abstract Accidents, strokes, or diseases can leave a person with both motor and speech disabilities, making it difficult to communicate with others. People with paralysis face daily challenges in meeting their basic needs, particularly if they also have difficulty speaking. Individuals with dysarthria, amyotrophic lateral sclerosis, and similar conditions may produce speech that is difficult for others to understand. The proposed system for automatic recognition of basic daily needs aims to improve the quality of life of individuals with dysarthria and quadriplegic paralysis. It does so by recognizing and analyzing brain signals and converting them into either audible voice commands or text messages sent to a healthcare provider's mobile phone, depending on the system settings. The system uses a convolutional neural network (CNN) model to detect event-related potentials (ERPs) in the EEG signal and thereby select one of six basic daily needs while their images are displayed in random order. Ten volunteers participated in the study, contributing to the dataset used for training, testing, and validation. The proposed approach achieved an accuracy of 78.41%.
first_indexed 2024-04-09T12:49:20Z
format Article
id doaj.art-50f7c419c9b744dabfdc6e9c876beee1
institution Directory Open Access Journal
issn 1110-1903
2536-9512
language English
last_indexed 2024-04-09T12:49:20Z
publishDate 2023-05-01
publisher SpringerOpen
record_format Article
series Journal of Engineering and Applied Science
spelling doaj.art-50f7c419c9b744dabfdc6e9c876beee1 2023-05-14T11:18:15Z eng SpringerOpen, Journal of Engineering and Applied Science, ISSN 1110-1903, 2536-9512, 2023-05-01, 70(1):1-18, doi:10.1186/s44147-023-00212-w. A deep learning-based brain-computer interaction system for speech and motor impairment. Nader A. Rahman Mohamed (Biomedical Engineering Department, Faculty of Engineering, Misr University for Science and Technology (MUST)). Online access: https://doi.org/10.1186/s44147-023-00212-w. Keywords: Electroencephalogram (EEG); Brain-Computer Interface (BCI); Event Related Potential (ERP); Convolutional Neural Network (CNN)
title A deep learning-based brain-computer interaction system for speech and motor impairment
title_sort deep learning based brain computer interaction system for speech and motor impairment
topic Electroencephalogram (EEG)
Brain-Computer Interface (BCI)
Event Related Potential (ERP)
Convolutional Neural Network (CNN)
url https://doi.org/10.1186/s44147-023-00212-w