Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model
Abstract: Electroencephalography (EEG) recordings taken during the perception of music tempo contain information from which the tempo of a music piece can be estimated. If information about this tempo stimulus in EEG recordings can be extracted and classified, it can be effectively used to construct a music‐based...
Main Authors: | Gi Yong Lee, Min‐Soo Kim, Hyoung‐Gook Kim |
---|---|
Format: | Article |
Language: | English |
Published: | Electronics and Telecommunications Research Institute (ETRI), 2021-11-01 |
Series: | ETRI Journal |
Subjects: | attention mechanism; convolutional recurrent neural network; electroencephalography; spatiotemporal features; tempo stimuli classification |
Online Access: | https://doi.org/10.4218/etrij.2021-0174 |
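The abstract describes a two-stage attention architecture: a sample encoder followed by sample-level attention within each segment, a segment encoder followed by segment-level attention across segments, and a final softmax layer. Below is a minimal NumPy sketch of that attention-pooling hierarchy; the dimensions, the dot-product scoring function, and all parameter names are illustrative assumptions, not the paper's actual model, and the convolutional/GRU encoders are replaced here by random feature tensors.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(feats, w):
    # feats: (n, d) encoder outputs; w: (d,) scoring vector (assumed form).
    # Returns the attention-weighted sum over the n time steps.
    scores = softmax(feats @ w)        # (n,) attention weights
    return scores @ feats              # (d,) pooled vector

# Toy dimensions (hypothetical; the paper does not specify these here)
n_segments, n_samples, d, n_classes = 4, 10, 8, 3
# Stand-in for the CNN-BiGRU sample-encoder outputs per segment
eeg_feats = rng.normal(size=(n_segments, n_samples, d))

w_sample = rng.normal(size=d)             # sample-level attention params
w_segment = rng.normal(size=d)            # segment-level attention params
W_cls = rng.normal(size=(d, n_classes))   # softmax classification layer

# Sample-level attention: pool samples within each segment
seg_vecs = np.stack([attention_pool(s, w_sample) for s in eeg_feats])
# Segment-level attention: pool segments into one recording-level vector
rec_vec = attention_pool(seg_vecs, w_segment)
# Softmax layer over tempo-stimulus classes
probs = softmax(rec_vec @ W_cls)
```

The two pooling stages mirror the sample-level and segment-level "intuitive attention" modules named in the abstract: each stage collapses one temporal axis into a single weighted vector before classification.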
_version_ | 1819102609097621504 |
---|---|
author | Gi Yong Lee Min‐Soo Kim Hyoung‐Gook Kim |
author_facet | Gi Yong Lee Min‐Soo Kim Hyoung‐Gook Kim |
author_sort | Gi Yong Lee |
collection | DOAJ |
description | Abstract: Electroencephalography (EEG) recordings taken during the perception of music tempo contain information from which the tempo of a music piece can be estimated. If information about this tempo stimulus in EEG recordings can be extracted and classified, it can be effectively used to construct a music‐based brain–computer interface. This study proposes a novel convolutional recurrent attention model (CRAM) to extract and classify features corresponding to tempo stimuli from EEG recordings of listeners who concentrated on the tempo of music pieces. The proposed CRAM is composed of six modules, namely, network inputs, a two‐dimensional convolutional bidirectional gated recurrent unit‐based sample encoder, sample‐level intuitive attention, a segment encoder, segment‐level intuitive attention, and a softmax layer, to effectively model spatiotemporal features and improve the classification accuracy of tempo stimuli. To evaluate the proposed method's performance, we conducted experiments on two benchmark datasets. The proposed method achieves promising results, outperforming recent methods. |
first_indexed | 2024-12-22T01:37:17Z |
format | Article |
id | doaj.art-9cdd21d924184e1f9eb0390001402d37 |
institution | Directory Open Access Journal |
issn | 1225-6463 |
language | English |
last_indexed | 2024-12-22T01:37:17Z |
publishDate | 2021-11-01 |
publisher | Electronics and Telecommunications Research Institute (ETRI) |
record_format | Article |
series | ETRI Journal |
spelling | doaj.art-9cdd21d924184e1f9eb0390001402d372022-12-21T18:43:20ZengElectronics and Telecommunications Research Institute (ETRI)ETRI Journal1225-64632021-11-014361081109210.4218/etrij.2021-017410.4218/etrij.2021-0174Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention modelGi Yong LeeMin‐Soo KimHyoung‐Gook KimAbstractElectroencephalography (EEG) recordings taken during the perception of music tempo contain information that estimates the tempo of a music piece. If information about this tempo stimulus in EEG recordings can be extracted and classified, it can be effectively used to construct a music‐based brain–computer interface. This study proposes a novel convolutional recurrent attention model (CRAM) to extract and classify features corresponding to tempo stimuli from EEG recordings of listeners who listened with concentration to the tempo of musics. The proposed CRAM is composed of six modules, namely, network inputs, two‐dimensional convolutional bidirectional gated recurrent unit‐based sample encoder, sample‐level intuitive attention, segment encoder, segment‐level intuitive attention, and softmax layer, to effectively model spatiotemporal features and improve the classification accuracy of tempo stimuli. To evaluate the proposed method's performance, we conducted experiments on two benchmark datasets. The proposed method achieves promising results, outperforming recent methods.https://doi.org/10.4218/etrij.2021-0174attention mechanismconvolutional recurrent neural networkelectroencephalographyspatiotemporal featurestempo stimuli classification |
spellingShingle | Gi Yong Lee Min‐Soo Kim Hyoung‐Gook Kim Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model ETRI Journal attention mechanism convolutional recurrent neural network electroencephalography spatiotemporal features tempo stimuli classification |
title | Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model |
title_full | Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model |
title_fullStr | Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model |
title_full_unstemmed | Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model |
title_short | Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model |
title_sort | extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model |
topic | attention mechanism convolutional recurrent neural network electroencephalography spatiotemporal features tempo stimuli classification |
url | https://doi.org/10.4218/etrij.2021-0174 |
work_keys_str_mv | AT giyonglee extractionandclassificationoftempostimulifromelectroencephalographyrecordingsusingconvolutionalrecurrentattentionmodel AT minsookim extractionandclassificationoftempostimulifromelectroencephalographyrecordingsusingconvolutionalrecurrentattentionmodel AT hyounggookkim extractionandclassificationoftempostimulifromelectroencephalographyrecordingsusingconvolutionalrecurrentattentionmodel |