Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events
Representing objects in space is difficult because sensorimotor events are anchored in different reference frames, which can be eye-, arm-, or target-centered. In the brain, gain-field (GF) neurons in the parietal cortex are involved in computing the spatial transformations necessary for aligning tactile, visual, and proprioceptive signals.
Main Authors: | Ganna Pugach, Alexandre Pitti, Olga Tolochko, Philippe Gaussier |
---|---|
Format: | Article |
Language: | English |
Published: | Frontiers Media S.A., 2019-03-01 |
Series: | Frontiers in Neurorobotics |
Subjects: | body schema; multimodal integration; artificial skin; parietal cortex; gain-field neurons; peri-personal space |
Online Access: | https://www.frontiersin.org/article/10.3389/fnbot.2019.00005/full |
author | Ganna Pugach; Alexandre Pitti; Olga Tolochko; Philippe Gaussier
author_facet | Ganna Pugach; Alexandre Pitti; Olga Tolochko; Philippe Gaussier
author_sort | Ganna Pugach |
collection | DOAJ |
description | Representing objects in space is difficult because sensorimotor events are anchored in different reference frames, which can be eye-, arm-, or target-centered. In the brain, gain-field (GF) neurons in the parietal cortex are involved in computing the spatial transformations necessary for aligning tactile, visual, and proprioceptive signals. In reaching tasks, these GF neurons exploit a multiplicative interaction mechanism for binding simultaneously touched events on the hand with visual and proprioceptive information. By doing so, they can infer new reference frames to dynamically represent the location of body parts in visual space (i.e., the body schema) and of nearby targets (i.e., the peripersonal space). In this vein, we propose a neural model based on GF neurons that integrates tactile events with arm postures and visual locations to construct hand- and target-centered receptive fields in visual space. In robotic experiments using an artificial skin, we show how our neural architecture reproduces the behaviors of parietal neurons (1) by dynamically encoding the body schema of our robotic arm without any visual tags on it and (2) by estimating the relative orientation and distance of targets to it. We demonstrate how tactile information facilitates the integration of visual and proprioceptive signals for constructing the body space. |
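The abstract's core mechanism, multiplicative gain-field interaction, can be illustrated in a few lines. The sketch below is not the authors' implementation: it is a toy basis-function model (in the spirit of classic gain-field accounts of parietal cortex) in which a proprioceptive population multiplicatively gain-modulates a retinotopic visual population, and a simple centroid readout recovers an arm-relative target coordinate. All population sizes, tuning widths, and the readout scheme are illustrative assumptions.

```python
# Toy sketch of gain-field (multiplicative) binding of a visual map with
# proprioception. Illustrative only -- not the paper's code; all names,
# sizes, and tuning parameters are assumptions.
import numpy as np

def gaussian_population(x, centers, sigma):
    """Population of Gaussian tuning curves evaluated at scalar input x."""
    return np.exp(-((x - centers) ** 2) / (2.0 * sigma ** 2))

# Retinotopic (eye-centered) position of a seen/touched target, in degrees.
retinal_pos = 10.0
# Arm joint angle (proprioception), in degrees.
joint_angle = -25.0

vis_centers = np.linspace(-40, 40, 41)   # visual units' preferred positions
prop_centers = np.linspace(-60, 60, 41)  # proprioceptive preferred angles

vis = gaussian_population(retinal_pos, vis_centers, sigma=5.0)
prop = gaussian_population(joint_angle, prop_centers, sigma=7.0)

# Gain-field layer: each unit's visual response is multiplicatively
# gain-modulated by proprioception (outer product = multiplicative
# interaction between the two populations).
gf = np.outer(vis, prop)

# A downstream readout can recover a body-centered coordinate. With this
# toy encoding, eye-centered position + joint angle ~ target position
# relative to the arm; a centroid readout over the GF layer estimates it.
sums = vis_centers[:, None] + prop_centers[None, :]
body_centered_estimate = np.sum(gf * sums) / np.sum(gf)
print(f"estimated arm-relative target position: {body_centered_estimate:.1f} deg")
```

Per the abstract, touch supplies the binding signal in the paper itself: contact events on the artificial skin indicate when the visual and proprioceptive streams refer to the same location on the body and should therefore be associated.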
first_indexed | 2024-12-20T19:25:05Z |
format | Article |
id | doaj.art-65800bc91d094dd88bdabed04abe7662 |
institution | Directory Open Access Journal |
issn | 1662-5218 |
language | English |
last_indexed | 2024-12-20T19:25:05Z |
publishDate | 2019-03-01 |
publisher | Frontiers Media S.A. |
record_format | Article |
series | Frontiers in Neurorobotics |
author_affiliations | Ganna Pugach: ETIS Laboratory, University Paris-Seine, CNRS UMR 8051, University of Cergy-Pontoise, ENSEA, Cergy-Pontoise, France; Alexandre Pitti: ETIS Laboratory, University Paris-Seine, CNRS UMR 8051, University of Cergy-Pontoise, ENSEA, Cergy-Pontoise, France; Olga Tolochko: Faculty of Electric Power Engineering and Automation, National Technical University of Ukraine Kyiv Polytechnic Institute, Kyiv, Ukraine; Philippe Gaussier: ETIS Laboratory, University Paris-Seine, CNRS UMR 8051, University of Cergy-Pontoise, ENSEA, Cergy-Pontoise, France |
title | Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events |
topic | body schema multimodal integration artificial skin parietal cortex gain-field neurons peri-personal space |
url | https://www.frontiersin.org/article/10.3389/fnbot.2019.00005/full |