Improving emotion perception in cochlear implant users: insights from machine learning analysis of EEG signals

Abstract

Background: Although cochlear implants can restore auditory inputs to deafferented auditory cortices, the quality of the sound signal transmitted to the brain is severely degraded, limiting functional outcomes in terms of speech perception and emotion perception. The latter deficit negatively impacts cochlear implant users' social integration and quality of life; however, emotion perception is not currently part of rehabilitation. Developing rehabilitation programs that incorporate emotional cognition requires a deeper understanding of cochlear implant users' residual emotion perception abilities.

Methods: To identify the neural underpinnings of these residual abilities, we investigated whether machine learning techniques could be used to identify emotion-specific patterns of neural activity in cochlear implant users. Using existing electroencephalography (EEG) data from 22 cochlear implant users, we employed a random forest classifier to establish whether we could model, and subsequently predict from participants' brain responses, the auditory emotions (vocal and musical) presented to them.

Results: Our findings suggest that consistent emotion-specific biomarkers exist in cochlear implant users, which could be used to develop effective rehabilitation programs incorporating emotion perception training.

Conclusions: This study highlights the potential of machine learning techniques to improve outcomes for cochlear implant users, particularly in terms of emotion perception.
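
The decoding approach described in the Methods (a random forest classifier predicting auditory emotions from EEG responses) can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' pipeline: the placeholder data, the three-class label set, the hyperparameters, and the 5-fold cross-validation setup are all assumptions for demonstration.

```python
# Minimal sketch of emotion decoding from EEG epochs with a random forest.
# Assumption (not from the paper): epochs are already preprocessed and shaped
# (n_trials, n_channels, n_times); random data stands in for real recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 64, 128            # illustrative sizes
epochs = rng.standard_normal((n_trials, n_channels, n_times))
labels = rng.integers(0, 3, size=n_trials)               # e.g., 3 emotion classes

# Flatten each epoch (channels x time) into one feature vector per trial.
X = epochs.reshape(n_trials, -1)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, labels, cv=cv)
print(f"Mean decoding accuracy: {scores.mean():.2f} (chance ~ 0.33)")
```

On real data, above-chance cross-validated accuracy is what would support the "consistent emotion-specific biomarkers" referred to in the Results.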

Bibliographic Details
Main Authors: Sebastien Paquette (Psychology Department, Faculty of Arts and Science, Trent University); Samir Gouin (Centre for Research On Brain, Language, and Music (CRBLM), International Laboratory for Brain, Music & Sound Research (BRAMS), Psychology Department, University of Montreal); Alexandre Lehmann (Research Institute of the McGill University Health Centre (RI-MUHC))
Format: Article
Language: English
Published: BMC, 2024-04-01
Series: BMC Neurology
ISSN: 1471-2377
Subjects: Cochlear implant; Emotion perception; Machine learning
Online Access: https://doi.org/10.1186/s12883-024-03616-0