Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer
Main Authors: | Jin Kuroda, Gou Koutaki |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-03-01 |
Series: | Sensors |
Subjects: | parameter estimation; flute-playing robot; neural network; multilayer perceptron; learning to rank |
Online Access: | https://www.mdpi.com/1424-8220/22/5/2074 |
author | Jin Kuroda, Gou Koutaki |
collection | DOAJ |
description | When learning to play a musical instrument, it is important to improve the quality of self-practice, and many systems have been developed to assist practice. Some practice-assistance systems use special sensors (pressure, flow, and motion sensors) to acquire the control parameters of the instrument and provide specific guidance. However, for wind instruments such as the saxophone or flute, control parameters such as the air flow and the angle between the player and the instrument are difficult to acquire, since sensors cannot be placed inside the mouth. In this paper, we propose a sensorless control parameter estimation system that uses only machine learning on the recorded sound of the wind instrument. A machine learning framework requires many training samples with both sound and correct labels, so we generated training samples using a robotic performer. This has two advantages: (1) it is easy to obtain many training samples that cover the control parameters exhaustively, and (2) the control parameters given to the robot can be used directly as the correct labels. In addition to the samples generated by the robot, some human performance data were also used for training, to construct an estimation model that accounts for the feature differences between robot and human performance. Finally, a flute control parameter estimation system was developed, and its estimation accuracy was evaluated on eight novice flute players using Spearman’s rank correlation coefficient. The experimental results showed that the proposed system estimates human control parameters with high accuracy. |
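As a rough illustration of the approach described in the abstract (a minimal sketch, not the authors' implementation), the following Python example trains a small multilayer perceptron to map audio features of flute sound to a single control parameter, using robot-generated recordings as labeled training data, and then measures ranking agreement on human recordings with Spearman's rank correlation coefficient. The feature dimension, network size, and synthetic placeholder data are assumptions for illustration only; the paper additionally uses a learning-to-rank formulation not shown here.

```python
# Hypothetical sketch: MLP regression from sound features to one control
# parameter, trained on robot-labeled data, evaluated with Spearman's rho.
import numpy as np
from scipy.stats import spearmanr
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder data: in the paper, features would be computed from recorded
# flute sound, and labels would be the control parameters commanded to the
# robotic performer (e.g., air flow, angle). Shapes are illustrative only.
n_robot, n_human, n_features = 2000, 200, 128
X_robot = rng.normal(size=(n_robot, n_features))   # robot-performance features
y_robot = rng.uniform(0.0, 1.0, size=n_robot)      # known robot control parameter
X_human = rng.normal(size=(n_human, n_features))   # human-performance features
y_human = rng.uniform(0.0, 1.0, size=n_human)      # reference values for evaluation

scaler = StandardScaler().fit(X_robot)
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(scaler.transform(X_robot), y_robot)

# Spearman's rank correlation compares orderings only, so it tolerates
# systematic offsets between robot-trained predictions and human ground truth.
rho, p_value = spearmanr(model.predict(scaler.transform(X_human)), y_human)
print(f"Spearman's rho = {rho:.3f} (p = {p_value:.3g})")
```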
format | Article |
id | doaj.art-f7c9c400886c467382b49c394e4ca552 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
publishDate | 2022-03-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
title | Sensing Control Parameters of Flute from Microphone Sound Based on Machine Learning from Robotic Performer |
topic | parameter estimation flute-playing robot neural network multilayer perceptron learning to rank |
url | https://www.mdpi.com/1424-8220/22/5/2074 |