Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot Control
New technologies for monitoring grip forces during hand and finger movements in non-standard task contexts have provided unprecedented functional insights into somatosensory cognition. Somatosensory cognition is the basis of our ability to manipulate and transform objects of the physical world and to grasp them with the right amount of force...
Main Authors: | Rongrong Liu, John Wandeto, Florent Nageotte, Philippe Zanne, Michel de Mathelin, Birgitta Dresp-Langley |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2023-01-01 |
Series: | Bioengineering |
Subjects: | wearable biosensors; human grip force; spatiotemporal analysis; somatosensory neurons; motor control; robotic task expertise |
Online Access: | https://www.mdpi.com/2306-5354/10/1/59 |
_version_ | 1797445976994086912 |
---|---|
author | Rongrong Liu John Wandeto Florent Nageotte Philippe Zanne Michel de Mathelin Birgitta Dresp-Langley |
author_facet | Rongrong Liu John Wandeto Florent Nageotte Philippe Zanne Michel de Mathelin Birgitta Dresp-Langley |
author_sort | Rongrong Liu |
collection | DOAJ |
description | New technologies for monitoring grip forces during hand and finger movements in non-standard task contexts have provided unprecedented functional insights into somatosensory cognition. Somatosensory cognition is the basis of our ability to manipulate and transform objects of the physical world and to grasp them with the right amount of force. In previous work, the wireless tracking of grip-force signals recorded from biosensors in the palm of the human hand has permitted us to unravel some of the functional synergies that underlie perceptual and motor learning under conditions of non-standard and essentially unreliable sensory input. This paper builds on this previous work and discusses further, functionally motivated, analyses of individual grip-force data in manual robot control. Grip forces were recorded from various loci in the dominant and non-dominant hands of individuals with wearable wireless sensor technology. Statistical analyses bring to the fore skill-specific temporal variations in thousands of grip forces of a complete novice and a highly proficient expert in manual robot control. A brain-inspired neural network model that uses the output metric of a self-organizing map with unsupervised winner-take-all learning was run on the sensor output from both hands of each user. The neural network metric expresses the difference between an input representation and its model representation at any given moment in time and reliably captures the differences between novice and expert performance in terms of grip-force variability. Functionally motivated spatiotemporal analysis of individual average grip forces, computed for time windows of constant size in the output of a restricted number of task-relevant sensors in the dominant (preferred) hand, reveals finger-specific synergies reflecting robotic task skill. The analyses lead the way towards grip-force monitoring in real time. 
This will permit tracking task skill evolution in trainees, or identifying individual proficiency levels in human-robot interaction, which presents unprecedented challenges for perceptual and motor adaptation in environmental contexts of high sensory uncertainty. Cross-disciplinary insights from systems neuroscience and cognitive behavioral science, and the predictive modeling of operator skills using parsimonious Artificial Intelligence (AI), will contribute towards improving the outcome of new types of surgery, in particular single-port approaches such as NOTES (Natural Orifice Transluminal Endoscopic Surgery) and SILS (Single-Incision Laparoscopic Surgery). |
first_indexed | 2024-03-09T13:33:54Z |
format | Article |
id | doaj.art-d419840500dc49b3816ea2c4d41f1de8 |
institution | Directory Open Access Journal |
issn | 2306-5354 |
language | English |
last_indexed | 2024-03-09T13:33:54Z |
publishDate | 2023-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Bioengineering |
spelling | doaj.art-d419840500dc49b3816ea2c4d41f1de82023-11-30T21:14:53ZengMDPI AGBioengineering2306-53542023-01-011015910.3390/bioengineering10010059Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot ControlRongrong Liu0John Wandeto1Florent Nageotte2Philippe Zanne3Michel de Mathelin4Birgitta Dresp-Langley5ICube UMR 7357, University of Strasbourg, 67000 Strasbourg, FranceDepartment of Information Technology, Dedan Kimathi University of Technology, Nyeri 10143, KenyaICube UMR 7357, University of Strasbourg, 67000 Strasbourg, FranceICube UMR 7357, University of Strasbourg, 67000 Strasbourg, FranceICube UMR 7357, University of Strasbourg, 67000 Strasbourg, FranceICube UMR 7357, University of Strasbourg, 67000 Strasbourg, FranceNew technologies for monitoring grip forces during hand and finger movements in non-standard task contexts have provided unprecedented functional insights into somatosensory cognition. Somatosensory cognition is the basis of our ability to manipulate and transform objects of the physical world and to grasp them with the right amount of force. In previous work, the wireless tracking of grip-force signals recorded from biosensors in the palm of the human hand has permitted us to unravel some of the functional synergies that underlie perceptual and motor learning under conditions of non-standard and essentially unreliable sensory input. This paper builds on this previous work and discusses further, functionally motivated, analyses of individual grip-force data in manual robot control. Grip forces were recorded from various loci in the dominant and non-dominant hands of individuals with wearable wireless sensor technology. Statistical analyses bring to the fore skill-specific temporal variations in thousands of grip forces of a complete novice and a highly proficient expert in manual robot control. 
A brain-inspired neural network model that uses the output metric of a self-organizing map with unsupervised winner-take-all learning was run on the sensor output from both hands of each user. The neural network metric expresses the difference between an input representation and its model representation at any given moment in time and reliably captures the differences between novice and expert performance in terms of grip-force variability. Functionally motivated spatiotemporal analysis of individual average grip forces, computed for time windows of constant size in the output of a restricted number of task-relevant sensors in the dominant (preferred) hand, reveals finger-specific synergies reflecting robotic task skill. The analyses lead the way towards grip-force monitoring in real time. This will permit tracking task skill evolution in trainees, or identifying individual proficiency levels in human-robot interaction, which presents unprecedented challenges for perceptual and motor adaptation in environmental contexts of high sensory uncertainty. Cross-disciplinary insights from systems neuroscience and cognitive behavioral science, and the predictive modeling of operator skills using parsimonious Artificial Intelligence (AI), will contribute towards improving the outcome of new types of surgery, in particular single-port approaches such as NOTES (Natural Orifice Transluminal Endoscopic Surgery) and SILS (Single-Incision Laparoscopic Surgery).https://www.mdpi.com/2306-5354/10/1/59wearable biosensorshuman grip forcespatiotemporal analysissomatosensory neuronsmotor controlrobotic task expertise |
spellingShingle | Rongrong Liu John Wandeto Florent Nageotte Philippe Zanne Michel de Mathelin Birgitta Dresp-Langley Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot Control Bioengineering wearable biosensors human grip force spatiotemporal analysis somatosensory neurons motor control robotic task expertise |
title | Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot Control |
title_full | Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot Control |
title_fullStr | Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot Control |
title_full_unstemmed | Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot Control |
title_short | Spatiotemporal Modeling of Grip Forces Captures Proficiency in Manual Robot Control |
title_sort | spatiotemporal modeling of grip forces captures proficiency in manual robot control |
topic | wearable biosensors human grip force spatiotemporal analysis somatosensory neurons motor control robotic task expertise |
url | https://www.mdpi.com/2306-5354/10/1/59 |
work_keys_str_mv | AT rongrongliu spatiotemporalmodelingofgripforcescapturesproficiencyinmanualrobotcontrol AT johnwandeto spatiotemporalmodelingofgripforcescapturesproficiencyinmanualrobotcontrol AT florentnageotte spatiotemporalmodelingofgripforcescapturesproficiencyinmanualrobotcontrol AT philippezanne spatiotemporalmodelingofgripforcescapturesproficiencyinmanualrobotcontrol AT micheldemathelin spatiotemporalmodelingofgripforcescapturesproficiencyinmanualrobotcontrol AT birgittadresplangley spatiotemporalmodelingofgripforcescapturesproficiencyinmanualrobotcontrol |
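The abstract describes a self-organizing map (SOM) with winner-take-all learning whose output metric is the distance between an input and its best-matching model representation (the quantization error). The following is a minimal illustrative sketch of that kind of metric; the node count, learning-rate schedule, and toy data are assumptions for demonstration, not the authors' actual configuration.

```python
# Minimal sketch of a winner-take-all SOM and its quantization-error metric.
# All parameters (16 nodes, 20 epochs, lr decay) and the toy data are
# illustrative assumptions, not the published model's configuration.
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_nodes=16, epochs=20, lr=0.5):
    """Winner-take-all training: only the best-matching unit moves toward the input."""
    weights = data[rng.choice(len(data), n_nodes, replace=False)].copy()
    for _ in range(epochs):
        for x in data:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # winning node
            weights[bmu] += lr * (x - weights[bmu])               # update winner only
        lr *= 0.9  # decay learning rate each epoch
    return weights

def quantization_error(data, weights):
    """Mean distance between each input and its best-matching model vector."""
    d = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

# Toy grip-force-like signals: rows = time windows, columns = sensor channels.
data = rng.normal(size=(200, 4))
w = train_som(data)
qe = quantization_error(data, w)
```

Tracked over successive time windows, a metric like `qe` could, in principle, expose the grip-force variability differences between novice and expert performance that the record describes.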