Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song

Bibliographic Details
Main Authors: Chu Yi Yu, Anne Cabildo, Jessica A. Grahn, Christina M. Vanden Bosch der Nederlanden
Format: Article
Language: English
Published: Frontiers Media S.A. 2023-05-01
Series: Frontiers in Psychology
Subjects: rhythmic regularity; beat; speech; song; music information retrieval; periodicity
Online Access: https://www.frontiersin.org/articles/10.3389/fpsyg.2023.1167003/full
_version_ 1797819797734424576
author Chu Yi Yu
Anne Cabildo
Jessica A. Grahn
Christina M. Vanden Bosch der Nederlanden
author_facet Chu Yi Yu
Anne Cabildo
Jessica A. Grahn
Christina M. Vanden Bosch der Nederlanden
author_sort Chu Yi Yu
collection DOAJ
description Rhythm is a key feature of music and language, but the way rhythm unfolds within each domain differs. Music induces perception of a beat, a regular repeating pulse spaced by roughly equal durations, whereas speech does not have the same isochronous framework. Although rhythmic regularity is a defining feature of music and language, it is difficult to derive acoustic indices of the differences in rhythmic regularity between domains. The current study examined whether participants could provide subjective ratings of rhythmic regularity for acoustically matched (syllable-, tempo-, and contour-matched) and acoustically unmatched (varying in tempo, syllable number, semantics, and contour) exemplars of speech and song. We used subjective ratings to index the presence or absence of an underlying beat and correlated ratings with stimulus features to identify acoustic metrics of regularity. Experiment 1 highlighted that ratings based on the term “rhythmic regularity” did not result in consistent definitions of regularity across participants, with opposite ratings for participants who adopted a beat-based definition (song greater than speech), a normal-prosody definition (speech greater than song), or an unclear definition (no difference). Experiment 2 defined rhythmic regularity as how easy it would be to tap or clap to the utterances. Participants rated song as easier to clap or tap to than speech for both acoustically matched and unmatched datasets. Subjective regularity ratings from Experiment 2 illustrated that stimuli with longer syllable durations and with less spectral flux were rated as more rhythmically regular across domains. Our findings demonstrate that rhythmic regularity distinguishes speech from song and that several key acoustic features can be used to predict listeners’ perception of rhythmic regularity both within and across domains.
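The description notes that listeners' regularity ratings were correlated with acoustic stimulus features such as syllable duration and spectral flux. Below is a minimal sketch of how one such predictor could be computed and related to per-stimulus ratings; it is not the authors' analysis code, and the file names, rating values, and the use of librosa and scipy are illustrative assumptions.

# Sketch (assumed, not the authors' code): mean spectral flux per stimulus,
# correlated with mean "easy to tap or clap to" ratings.
import numpy as np
import librosa                      # assumed audio-analysis library
from scipy.stats import pearsonr

def mean_spectral_flux(path, sr=22050):
    """Mean rectified frame-to-frame change in the magnitude spectrum."""
    y, _ = librosa.load(path, sr=sr)
    S = np.abs(librosa.stft(y))                  # magnitude spectrogram
    diff = np.maximum(0.0, np.diff(S, axis=1))   # keep only energy increases
    return np.sqrt((diff ** 2).sum(axis=0)).mean()

# Hypothetical per-stimulus mean ratings (higher = easier to tap or clap to).
ratings = {"speech_01.wav": 3.2, "song_01.wav": 5.9,
           "speech_02.wav": 2.8, "song_02.wav": 5.4}

flux = [mean_spectral_flux(f) for f in ratings]
r, p = pearsonr(flux, list(ratings.values()))
print(f"spectral flux vs. perceived regularity: r = {r:.2f}, p = {p:.3f}")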
first_indexed 2024-03-13T09:27:54Z
format Article
id doaj.art-1b21354f7bc44eeba5217fe94dbb8bdc
institution Directory Open Access Journal
issn 1664-1078
language English
last_indexed 2024-03-13T09:27:54Z
publishDate 2023-05-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Psychology
spelling doaj.art-1b21354f7bc44eeba5217fe94dbb8bdc (record dated 2023-05-26T04:33:26Z)
eng; Frontiers Media S.A.; Frontiers in Psychology; ISSN 1664-1078; published 2023-05-01; volume 14; article 1167003; DOI 10.3389/fpsyg.2023.1167003
Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song
Chu Yi Yu (The Brain and Mind Institute, Western University, London, ON, Canada; Department of Psychology, Western University, London, ON, Canada)
Anne Cabildo (Department of Psychology, University of Toronto, Mississauga, ON, Canada)
Jessica A. Grahn (The Brain and Mind Institute, Western University, London, ON, Canada; Department of Psychology, Western University, London, ON, Canada)
Christina M. Vanden Bosch der Nederlanden (The Brain and Mind Institute, Western University, London, ON, Canada; Department of Psychology, Western University, London, ON, Canada; Department of Psychology, University of Toronto, Mississauga, ON, Canada)
https://www.frontiersin.org/articles/10.3389/fpsyg.2023.1167003/full
Keywords: rhythmic regularity; beat; speech; song; music information retrieval; periodicity
spellingShingle Chu Yi Yu
Anne Cabildo
Jessica A. Grahn
Christina M. Vanden Bosch der Nederlanden
Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song
Frontiers in Psychology
rhythmic regularity
beat
speech
song
music information retrieval
periodicity
title Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song
title_full Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song
title_fullStr Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song
title_full_unstemmed Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song
title_short Perceived rhythmic regularity is greater for song than speech: examining acoustic correlates of rhythmic regularity in speech and song
title_sort perceived rhythmic regularity is greater for song than speech examining acoustic correlates of rhythmic regularity in speech and song
topic rhythmic regularity
beat
speech
song
music information retrieval
periodicity
url https://www.frontiersin.org/articles/10.3389/fpsyg.2023.1167003/full
work_keys_str_mv AT chuyiyu perceivedrhythmicregularityisgreaterforsongthanspeechexaminingacousticcorrelatesofrhythmicregularityinspeechandsong
AT annecabildo perceivedrhythmicregularityisgreaterforsongthanspeechexaminingacousticcorrelatesofrhythmicregularityinspeechandsong
AT jessicaagrahn perceivedrhythmicregularityisgreaterforsongthanspeechexaminingacousticcorrelatesofrhythmicregularityinspeechandsong
AT christinamvandenboschdernederlanden perceivedrhythmicregularityisgreaterforsongthanspeechexaminingacousticcorrelatesofrhythmicregularityinspeechandsong