An adaptive control framework based multi-modal information-driven dance composition model for musical robots


Bibliographic Details
Main Authors: Fumei Xu, Yu Xia, Xiaorun Wu
Format: Article
Language: English
Published: Frontiers Media S.A., 2023-10-01
Series: Frontiers in Neurorobotics
Subjects: CMAC; robot trajectory; multimodal information; robot dance; robot simulation
Online Access: https://www.frontiersin.org/articles/10.3389/fnbot.2023.1270652/full
collection DOAJ
description Most robot dances are pre-compiled: adapting a dance to a different type of music requires manually adjusting the relevant parameters and meta-actions, which greatly limits the robot's usefulness. To close this gap, this study proposes a dance composition model for mobile robots based on multimodal information. The model consists of three parts. (1) Extraction of multimodal information. A temporal-structure feature method within a structure analysis framework divides an audio music file into musical structures; a hierarchical emotion detection framework then extracts information (rhythm, emotion, tension, etc.) from each segmented structure; the motion safety of the robot with respect to surrounding objects is computed; finally, the stage color at the robot's location is extracted and mapped to the corresponding atmosphere emotion. (2) Initialization of the dance library. Dance compositions are divided into four categories according to the classification of musical emotions; in addition, each category is divided into skilled compositions and general compositions. (3) Trajectory planning and tracking. The total path length is obtained by combining the multimodal information according to the emotion, initial speed, and period of each music structure; target points are then planned for the specific dance composition selected. An adaptive control framework based on the Cerebellar Model Articulation Controller (CMAC) and compensation controllers tracks the target-point trajectory, and the selected dance composition is finally performed. Mobile robot dance composition provides a new method and concept for humanoid robot dance composition.
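The CMAC named in part (3) of the abstract can be illustrated with a toy sketch. The class below is a hypothetical minimal 1-D CMAC, not the authors' controller: the input is quantized onto overlapping memory cells, the output is the sum of the active cells' weights, and LMS training adjusts only the active cells. All names and parameters (tile counts, learning rate, the sinusoidal reference) are illustrative assumptions.

```python
import math

class CMAC:
    """Minimal 1-D Cerebellar Model Articulation Controller sketch."""

    def __init__(self, n_tiles=64, n_active=8, x_min=0.0, x_max=1.0, lr=0.2):
        # One weight per cell; n_active extra cells cover the last bin's overlap.
        self.w = [0.0] * (n_tiles + n_active)
        self.n_tiles = n_tiles
        self.n_active = n_active
        self.x_min = x_min
        self.x_max = x_max
        self.lr = lr

    def _active_cells(self, x):
        # Quantize x into one of n_tiles bins, then activate
        # n_active consecutive cells starting at that bin.
        span = self.x_max - self.x_min
        i = int((x - self.x_min) / span * self.n_tiles)
        i = max(0, min(self.n_tiles - 1, i))
        return range(i, i + self.n_active)

    def predict(self, x):
        # Output is the sum of the weights of the active cells.
        return sum(self.w[j] for j in self._active_cells(x))

    def train(self, x, target):
        # LMS update: distribute the error over the active cells only.
        error = target - self.predict(x)
        for j in self._active_cells(x):
            self.w[j] += self.lr * error / self.n_active
        return error

# Train the CMAC to reproduce a reference trajectory y(t) = sin(2*pi*t)
# sampled over one period.
cmac = CMAC()
samples = [t / 200.0 for t in range(201)]
for _ in range(50):  # training epochs
    for t in samples:
        cmac.train(t, math.sin(2 * math.pi * t))

errors = [abs(cmac.predict(t) - math.sin(2 * math.pi * t)) for t in samples]
```

Because each update touches only a handful of weights and prediction is a table lookup, a CMAC is cheap enough for real-time trajectory tracking, which is presumably why the paper pairs it with compensation controllers.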
id doaj.art-fcfbbf2374d8492dbf5b152041040689
institution Directory Open Access Journal
issn 1662-5218
DOI: 10.3389/fnbot.2023.1270652
Author affiliations:
Fumei Xu: School of Music, Jiangxi Normal University, Nanchang, Jiangxi, China
Yu Xia: School of Aviation Services and Music, Nanchang Hangkong University, Nanchang, Jiangxi, China
Xiaorun Wu: School of Information Engineering, Nanchang Hangkong University, Nanchang, Jiangxi, China
topic CMAC
robot trajectory
multimodal information
robot dance
robot simulation