Improving Automatic Control of Upper-Limb Prosthesis Wrists Using Gaze-Centered Eye Tracking and Deep Learning
Many upper-limb prostheses lack proper wrist rotation functionality, which leads users to adopt poor compensatory strategies, resulting in overuse injuries or device abandonment. In this study, we investigate the validity of creating and implementing a data-driven predictive control strategy in object grasping tasks...
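The abstract describes using gaze-centered vision and deep learning to predict a prosthesis user's wrist rotation during reaching and grasping, but the record does not specify the model itself. The following is only a minimal illustrative sketch of that general idea, assuming a small convolutional regressor over an image crop centered on the gaze point; the class name, layer sizes, crop resolution, and single-angle output are hypothetical choices for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn

class GazeWristRegressor(nn.Module):
    """Hypothetical sketch: regress wrist pronation/supination from a gaze-centered image crop."""
    def __init__(self):
        super().__init__()
        # Convolutional feature extractor over the image patch around the gaze point.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global average pool -> 64 features
        )
        # Single regression output: predicted wrist rotation angle (radians, assumed).
        self.head = nn.Linear(64, 1)

    def forward(self, crop: torch.Tensor) -> torch.Tensor:
        x = self.features(crop).flatten(1)    # (batch, 64)
        return self.head(x)                   # (batch, 1)

if __name__ == "__main__":
    model = GazeWristRegressor()
    # Batch of 8 gaze-centered RGB crops; 96x96 is an illustrative crop size.
    crops = torch.randn(8, 3, 96, 96)
    predicted_angle = model(crops)
    print(predicted_angle.shape)              # torch.Size([8, 1])
```

In a complete controller, the predicted angle would drive the automatic wrist rotation that the study evaluates in virtual reality; this sketch stops at the angle estimate.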
Main Authors: | Maxim Karrenbach, David Boe, Astrini Sie, Rob Bennett, Eric Rombokas |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2022-01-01 |
Series: | IEEE Transactions on Neural Systems and Rehabilitation Engineering |
Subjects: | Gaze-centered vision, predictive control, deep learning, compensatory strategies |
Online Access: | https://ieeexplore.ieee.org/document/9698069/ |
_version_ | 1797805094639501312 |
---|---|
author | Maxim Karrenbach David Boe Astrini Sie Rob Bennett Eric Rombokas |
author_facet | Maxim Karrenbach David Boe Astrini Sie Rob Bennett Eric Rombokas |
author_sort | Maxim Karrenbach |
collection | DOAJ |
description | Many upper-limb prostheses lack proper wrist rotation functionality, which leads users to adopt poor compensatory strategies, resulting in overuse injuries or device abandonment. In this study, we investigate the validity of creating and implementing a data-driven predictive control strategy in object grasping tasks performed in virtual reality. We propose the idea of using gaze-centered vision to predict the wrist rotations of a user and implement a user study to investigate the impact of using this predictive control. We demonstrate that using this vision-based predictive system leads to a decrease in compensatory movement in the shoulder, as well as in task completion time. We discuss the cases in which the virtual prosthesis with the predictive model implemented did and did not yield a physical improvement in various arm movements. We also discuss the cognitive value of implementing such predictive control strategies in prosthetic controllers. We find that gaze-centered vision provides information about the intent of the user when performing object reaching and that the performance of prosthetic hands improves greatly when wrist prediction is implemented. Lastly, we address the limitations of this study, with respect both to the study itself and to future physical implementations. |
first_indexed | 2024-03-13T05:47:00Z |
format | Article |
id | doaj.art-83c7614f9d1f44d98ffa6de89a2edc87 |
institution | Directory Open Access Journal |
issn | 1558-0210 |
language | English |
last_indexed | 2024-03-13T05:47:00Z |
publishDate | 2022-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Transactions on Neural Systems and Rehabilitation Engineering |
spelling | doaj.art-83c7614f9d1f44d98ffa6de89a2edc87; indexed 2023-06-13T20:09:04Z; eng; IEEE; IEEE Transactions on Neural Systems and Rehabilitation Engineering, ISSN 1558-0210, vol. 30, pp. 340-349, 2022-01-01; DOI: 10.1109/TNSRE.2022.3147772; IEEE document 9698069. Authors: Maxim Karrenbach (https://orcid.org/0000-0003-1854-1879; Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, USA); David Boe (Department of Mechanical Engineering, University of Washington, Seattle, WA, USA); Astrini Sie (Department of Electrical and Computer Engineering, University of Washington, Seattle, WA, USA); Rob Bennett (https://orcid.org/0000-0003-3913-9491); Eric Rombokas (https://orcid.org/0000-0001-8523-1913; Department of Mechanical Engineering, University of Washington, Seattle, WA, USA). |
spellingShingle | Maxim Karrenbach David Boe Astrini Sie Rob Bennett Eric Rombokas Improving Automatic Control of Upper-Limb Prosthesis Wrists Using Gaze-Centered Eye Tracking and Deep Learning IEEE Transactions on Neural Systems and Rehabilitation Engineering Gaze-centered vision predictive control deep learning compensatory strategies |
title | Improving Automatic Control of Upper-Limb Prosthesis Wrists Using Gaze-Centered Eye Tracking and Deep Learning |
title_full | Improving Automatic Control of Upper-Limb Prosthesis Wrists Using Gaze-Centered Eye Tracking and Deep Learning |
title_fullStr | Improving Automatic Control of Upper-Limb Prosthesis Wrists Using Gaze-Centered Eye Tracking and Deep Learning |
title_full_unstemmed | Improving Automatic Control of Upper-Limb Prosthesis Wrists Using Gaze-Centered Eye Tracking and Deep Learning |
title_short | Improving Automatic Control of Upper-Limb Prosthesis Wrists Using Gaze-Centered Eye Tracking and Deep Learning |
title_sort | improving automatic control of upper limb prosthesis wrists using gaze centered eye tracking and deep learning |
topic | Gaze-centered vision predictive control deep learning compensatory strategies |
url | https://ieeexplore.ieee.org/document/9698069/ |
work_keys_str_mv | AT maximkarrenbach improvingautomaticcontrolofupperlimbprosthesiswristsusinggazecenteredeyetrackinganddeeplearning AT davidboe improvingautomaticcontrolofupperlimbprosthesiswristsusinggazecenteredeyetrackinganddeeplearning AT astrinisie improvingautomaticcontrolofupperlimbprosthesiswristsusinggazecenteredeyetrackinganddeeplearning AT robbennett improvingautomaticcontrolofupperlimbprosthesiswristsusinggazecenteredeyetrackinganddeeplearning AT ericrombokas improvingautomaticcontrolofupperlimbprosthesiswristsusinggazecenteredeyetrackinganddeeplearning |