Multi-scopic neuro-cognitive adaptation for legged locomotion robots

Abstract Dynamic locomotion is realized through a simultaneous integration of adaptability and optimality. This article proposes a neuro-cognitive model for a multi-legged locomotion robot that can seamlessly integrate multi-modal sensing, ecological perception, and cognition through the coordinatio...

Bibliographic Details
Main Authors: Azhar Aulia Saputra, Kazuyoshi Wada, Shiro Masuda, Naoyuki Kubota
Format: Article
Language: English
Published: Nature Portfolio 2022-09-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-022-19599-2
_version_ 1811200634138918912
author Azhar Aulia Saputra
Kazuyoshi Wada
Shiro Masuda
Naoyuki Kubota
author_facet Azhar Aulia Saputra
Kazuyoshi Wada
Shiro Masuda
Naoyuki Kubota
author_sort Azhar Aulia Saputra
collection DOAJ
description Abstract Dynamic locomotion is realized through a simultaneous integration of adaptability and optimality. This article proposes a neuro-cognitive model for a multi-legged locomotion robot that can seamlessly integrate multi-modal sensing, ecological perception, and cognition through the coordination of interoceptive and exteroceptive sensory information. Importantly, cognitive models can be discussed as micro-, meso-, and macro-scopic; these concepts correspond to sensing, perception, and cognition, and to short-, medium-, and long-term adaptation (in terms of ecological psychology). The proposed neuro-cognitive model integrates these intelligent functions from a multi-scopic point of view. The microscopic level presents an attention mechanism with short-term adaptive locomotion control conducted by a lower-level sensorimotor coordination-based model. The macroscopic level serves an environmental cognitive map featuring higher-level behavior planning. The mesoscopic level shows the integration between the microscopic and macroscopic approaches, enabling the model to reconstruct a map and conduct localization using bottom-up facial environmental information and top-down map information, generating intention towards the ultimate goal at the macroscopic level. The experiments demonstrated that the adaptability and optimality of multi-legged locomotion could be achieved using the proposed multi-scale neuro-cognitive model, from short- to long-term adaptation, with efficient computational usage. Future research directions lie not only in robotics contexts but also in interdisciplinary studies incorporating cognitive science and ecological psychology.
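The description above outlines an architecture rather than code. The following is a minimal, hypothetical Python sketch (not from the article) of how the three levels named in the abstract — microscopic sensorimotor control with attention, mesoscopic map reconstruction and localization fusing bottom-up and top-down information, and macroscopic cognitive-map-based planning — could be coordinated in a single control loop. All class names, update rules, and numbers are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: the class names and update rules are assumptions,
# chosen to show how the micro/meso/macro levels described in the abstract
# could exchange information in one loop.

import random


class MicroscopicLevel:
    """Short-term adaptation: sensorimotor coordination modulated by attention."""

    def step(self, sensor_readings, attention):
        # Weight interoceptive/exteroceptive readings by attention and
        # produce an immediate motor-level correction (placeholder rule).
        weighted = [a * s for a, s in zip(attention, sensor_readings)]
        return [0.1 * w for w in weighted]


class MesoscopicLevel:
    """Medium-term adaptation: fuse bottom-up sensing with top-down map knowledge."""

    def __init__(self):
        self.local_map = {}

    def update(self, sensor_readings, global_map):
        # Bottom-up: register newly perceived cells; top-down: blend in what
        # the cognitive map already expects. Localization is stubbed out as a
        # simple average over the reconstructed local map.
        for i, value in enumerate(sensor_readings):
            self.local_map[i] = 0.5 * value + 0.5 * global_map.get(i, value)
        estimated_pose = sum(self.local_map.values()) / max(len(self.local_map), 1)
        return self.local_map, estimated_pose


class MacroscopicLevel:
    """Long-term adaptation: environmental cognitive map and behavior planning."""

    def __init__(self, goal):
        self.goal = goal
        self.cognitive_map = {}

    def plan(self, local_map, pose):
        # Accumulate the long-term cognitive map and generate an attention
        # signal whose strength reflects the remaining distance to the goal.
        self.cognitive_map.update(local_map)
        attention_gain = min(1.0, abs(self.goal - pose))
        attention = [attention_gain] * len(local_map)
        return attention, self.cognitive_map


if __name__ == "__main__":
    micro, meso, macro = MicroscopicLevel(), MesoscopicLevel(), MacroscopicLevel(goal=5.0)
    attention, global_map = [1.0] * 4, {}
    for t in range(3):
        readings = [random.random() for _ in range(4)]
        motor_cmd = micro.step(readings, attention)          # microscopic: react
        local_map, pose = meso.update(readings, global_map)  # mesoscopic: perceive
        attention, global_map = macro.plan(local_map, pose)  # macroscopic: plan
        print(t, [round(m, 3) for m in motor_cmd], round(pose, 3))
```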
first_indexed 2024-04-12T02:06:54Z
format Article
id doaj.art-5a74f1bff67c4c7e9460cbd58c784b3e
institution Directory Open Access Journal
issn 2045-2322
language English
last_indexed 2024-04-12T02:06:54Z
publishDate 2022-09-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
spelling doaj.art-5a74f1bff67c4c7e9460cbd58c784b3e2022-12-22T03:52:30ZengNature PortfolioScientific Reports2045-23222022-09-0112111210.1038/s41598-022-19599-2Multi-scopic neuro-cognitive adaptation for legged locomotion robotsAzhar Aulia Saputra0Kazuyoshi Wada1Shiro Masuda2Naoyuki Kubota3Graduate School of Systems Design, Tokyo Metropolitan UniversityGraduate School of Systems Design, Tokyo Metropolitan UniversityGraduate School of Systems Design, Tokyo Metropolitan UniversityGraduate School of Systems Design, Tokyo Metropolitan UniversityAbstract Dynamic locomotion is realized through a simultaneous integration of adaptability and optimality. This article proposes a neuro-cognitive model for a multi-legged locomotion robot that can seamlessly integrate multi-modal sensing, ecological perception, and cognition through the coordination of interoceptive and exteroceptive sensory information. Importantly, cognitive models can be discussed as micro-, meso-, and macro-scopic; these concepts correspond to sensing, perception, and cognition, and to short-, medium-, and long-term adaptation (in terms of ecological psychology). The proposed neuro-cognitive model integrates these intelligent functions from a multi-scopic point of view. The microscopic level presents an attention mechanism with short-term adaptive locomotion control conducted by a lower-level sensorimotor coordination-based model. The macroscopic level serves an environmental cognitive map featuring higher-level behavior planning. The mesoscopic level shows the integration between the microscopic and macroscopic approaches, enabling the model to reconstruct a map and conduct localization using bottom-up facial environmental information and top-down map information, generating intention towards the ultimate goal at the macroscopic level. The experiments demonstrated that the adaptability and optimality of multi-legged locomotion could be achieved using the proposed multi-scale neuro-cognitive model, from short- to long-term adaptation, with efficient computational usage. Future research directions lie not only in robotics contexts but also in interdisciplinary studies incorporating cognitive science and ecological psychology.https://doi.org/10.1038/s41598-022-19599-2
spellingShingle Azhar Aulia Saputra
Kazuyoshi Wada
Shiro Masuda
Naoyuki Kubota
Multi-scopic neuro-cognitive adaptation for legged locomotion robots
Scientific Reports
title Multi-scopic neuro-cognitive adaptation for legged locomotion robots
title_full Multi-scopic neuro-cognitive adaptation for legged locomotion robots
title_fullStr Multi-scopic neuro-cognitive adaptation for legged locomotion robots
title_full_unstemmed Multi-scopic neuro-cognitive adaptation for legged locomotion robots
title_short Multi-scopic neuro-cognitive adaptation for legged locomotion robots
title_sort multi scopic neuro cognitive adaptation for legged locomotion robots
url https://doi.org/10.1038/s41598-022-19599-2
work_keys_str_mv AT azharauliasaputra multiscopicneurocognitiveadaptationforleggedlocomotionrobots
AT kazuyoshiwada multiscopicneurocognitiveadaptationforleggedlocomotionrobots
AT shiromasuda multiscopicneurocognitiveadaptationforleggedlocomotionrobots
AT naoyukikubota multiscopicneurocognitiveadaptationforleggedlocomotionrobots