Implementation of a Brain-Computer Interface on a Lower-Limb Exoskeleton
In this paper, we propose to use a brain-computer interface (BCI) to control a lower-limb exoskeleton. The exoskeleton follows the wearer's motion intention through decoding of electroencephalography (EEG) signals and multi-modal cognition. Motion patterns such as standing up, sitting down, and walking forward can be performed...
Main Authors: | Can Wang, Xinyu Wu, Zhouyang Wang, Yue Ma |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2018-01-01 |
Series: | IEEE Access |
Subjects: | Brain-computer interface, human robot interaction, multi-modal robotic cognition |
Online Access: | https://ieeexplore.ieee.org/document/8405532/ |
_version_ | 1818558499013001216 |
---|---|
author | Can Wang, Xinyu Wu, Zhouyang Wang, Yue Ma
author_facet | Can Wang, Xinyu Wu, Zhouyang Wang, Yue Ma
author_sort | Can Wang |
collection | DOAJ |
description | In this paper, we propose to use a brain-computer interface (BCI) to control a lower-limb exoskeleton. The exoskeleton follows the wearer's motion intention through decoding of electroencephalography (EEG) signals and multi-modal cognition. Motion patterns such as standing up, sitting down, and walking forward can be performed. We implemented two types of BCIs: one based on steady-state visual evoked potentials, which used canonical correlation analysis to extract the frequency the subject focused on; the other based on motor imagery, where the common spatial patterns method was employed to extract features from the EEG signal. The features were then classified by a support vector machine to recognize the subject's intention. We invited four healthy subjects to participate in the experiments, which included both off-line and online sessions. The off-line experiments were used to train the classifiers, which were then tested online to evaluate the performance of the BCI-controlled exoskeleton system. The results showed high accuracy in the motion intention classification tasks for both BCIs. |
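The description names two decoding pipelines: canonical correlation analysis (CCA) for the SSVEP-based BCI, and common spatial patterns (CSP) plus a support vector machine for the motor-imagery BCI. As a rough illustration of the first, here is a minimal sketch of CCA-based SSVEP frequency detection, assuming multi-channel EEG epochs and a known set of stimulus frequencies; the function name, parameters, and use of scikit-learn are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (not the paper's code): pick the stimulus frequency whose
# sine/cosine reference signals correlate most strongly with an EEG epoch.
import numpy as np
from sklearn.cross_decomposition import CCA

def detect_ssvep_frequency(eeg, fs, candidate_freqs, n_harmonics=2):
    """eeg: array of shape (channels, samples); fs: sampling rate in Hz."""
    t = np.arange(eeg.shape[1]) / fs
    scores = []
    for f in candidate_freqs:
        # Reference set: sine/cosine pairs at f and its harmonics.
        ref = np.vstack([np.vstack((np.sin(2 * np.pi * h * f * t),
                                    np.cos(2 * np.pi * h * f * t)))
                         for h in range(1, n_harmonics + 1)])
        x_c, y_c = CCA(n_components=1).fit_transform(eeg.T, ref.T)
        # First canonical correlation between the EEG and the references.
        scores.append(np.corrcoef(x_c[:, 0], y_c[:, 0])[0, 1])
    return candidate_freqs[int(np.argmax(scores))]
```

For the motor-imagery BCI, a comparable sketch of the CSP-plus-SVM pipeline, using MNE's CSP and scikit-learn's SVC as stand-in implementations (the paper does not specify which libraries were used):

```python
# Minimal sketch (assumed libraries): CSP log-variance features fed to an SVM.
from mne.decoding import CSP
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def train_mi_classifier(epochs, labels):
    """epochs: array (trials, channels, samples); labels: intention per trial."""
    clf = make_pipeline(
        CSP(n_components=4, log=True),  # spatial filtering + log-variance features
        SVC(kernel="rbf"),              # classifies the motion intention
    )
    clf.fit(epochs, labels)  # fit on the off-line calibration data
    return clf
```

In the setup the abstract describes, the off-line sessions would supply the labeled epochs for fitting, and the fitted pipeline's predictions on new epochs would then drive the exoskeleton online.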
first_indexed | 2024-12-14T00:13:17Z |
format | Article |
id | doaj.art-58b217ca868f4ec1b7e09920156f1bdc |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-14T00:13:17Z |
publishDate | 2018-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-58b217ca868f4ec1b7e09920156f1bdc (2022-12-21T23:25:40Z); eng; IEEE; IEEE Access; ISSN 2169-3536; 2018-01-01; vol. 6, pp. 38524-38534; doi: 10.1109/ACCESS.2018.2853628; article 8405532; Implementation of a Brain-Computer Interface on a Lower-Limb Exoskeleton; Can Wang (https://orcid.org/0000-0002-0914-3994), Guangdong Provincial Key Laboratory of Robotics and Intelligent System, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Xinyu Wu (https://orcid.org/0000-0001-6130-7821), Guangdong Provincial Key Laboratory of Robotics and Intelligent System, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; Zhouyang Wang, Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Beijing, China; Yue Ma (https://orcid.org/0000-0003-0828-8371), Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Beijing, China. In this paper, we propose to use a brain-computer interface (BCI) to control a lower-limb exoskeleton. The exoskeleton follows the wearer's motion intention through decoding of electroencephalography (EEG) signals and multi-modal cognition. Motion patterns such as standing up, sitting down, and walking forward can be performed. We implemented two types of BCIs: one based on steady-state visual evoked potentials, which used canonical correlation analysis to extract the frequency the subject focused on; the other based on motor imagery, where the common spatial patterns method was employed to extract features from the EEG signal. The features were then classified by a support vector machine to recognize the subject's intention. We invited four healthy subjects to participate in the experiments, which included both off-line and online sessions. The off-line experiments were used to train the classifiers, which were then tested online to evaluate the performance of the BCI-controlled exoskeleton system. The results showed high accuracy in the motion intention classification tasks for both BCIs. https://ieeexplore.ieee.org/document/8405532/; Brain-computer interface; human robot interaction; multi-modal robotic cognition |
spellingShingle | Can Wang; Xinyu Wu; Zhouyang Wang; Yue Ma; Implementation of a Brain-Computer Interface on a Lower-Limb Exoskeleton; IEEE Access; Brain-computer interface; human robot interaction; multi-modal robotic cognition |
title | Implementation of a Brain-Computer Interface on a Lower-Limb Exoskeleton |
title_full | Implementation of a Brain-Computer Interface on a Lower-Limb Exoskeleton |
title_fullStr | Implementation of a Brain-Computer Interface on a Lower-Limb Exoskeleton |
title_full_unstemmed | Implementation of a Brain-Computer Interface on a Lower-Limb Exoskeleton |
title_short | Implementation of a Brain-Computer Interface on a Lower-Limb Exoskeleton |
title_sort | implementation of a brain computer interface on a lower limb exoskeleton |
topic | Brain-computer interface; human robot interaction; multi-modal robotic cognition |
url | https://ieeexplore.ieee.org/document/8405532/ |
work_keys_str_mv | AT canwang implementationofabraincomputerinterfaceonalowerlimbexoskeleton AT xinyuwu implementationofabraincomputerinterfaceonalowerlimbexoskeleton AT zhouyangwang implementationofabraincomputerinterfaceonalowerlimbexoskeleton AT yuema implementationofabraincomputerinterfaceonalowerlimbexoskeleton |