EEG-based emotion recognition for real-time applications

Bibliographic Details
Main Author: Liu, Yisi
Other Authors: Olga Sourina
Format: Thesis
Language: English
Published: 2014
Subjects: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Pattern recognition; DRNTU::Engineering::Electrical and electronic engineering::Electronic systems::Signal processing
Online Access: http://hdl.handle.net/10356/55410
description Emotions play an important role in human communication and are essential to the understanding of human behavior. A new “emotional dimension” could be added to human-computer interfaces to make them more immersive and intuitive. Human emotions can be recognized from text, speech, facial expressions or gestures, but the content of a written text, vocal intonation, facial expressions or gestures can be intentionally changed to “hide” emotions. Thus, if there is a need to know the real “inner” emotions of a person, Electroencephalogram (EEG)-based emotion recognition can be used. EEG gives an easy and portable way to monitor brain activity via suitable signal processing techniques. Although EEG-based emotion recognition has recently received more attention from the research community, problems remain to be solved, such as improving the accuracy of recognition algorithms, increasing the number of recognized emotions, achieving real-time computation, and reducing the number of electrodes needed. Thus, in my research, I focused on proposing new methods and algorithms that recognize emotions with better accuracy and fewer electrodes, targeting real-time applications in the entertainment industry, medical applications, neuromarketing, etc. Dimensional emotion models such as the 2-dimensional (2D) Valence-Arousal model and the 3-dimensional (3D) Valence-Arousal-Dominance model were studied and adopted in my work.

Since only a limited number of benchmark EEG databases with labeled emotions are available, a series of experiments to collect affective EEG data was designed and implemented. Audio stimuli such as music and sound clips and visual stimuli such as images were used for emotion induction. Labeled stimulus databases, the International Affective Digitized Sounds (IADS) system for audio and the International Affective Picture System (IAPS) for images, were used in the experiments. Different devices such as Emotiv and Brain Master 24 were used to collect the EEG data. A questionnaire and the Self-Assessment Manikin (SAM) technique were applied to create the labeled affective EEG databases.

Nonlinear analysis can be used to quantify the complexity of EEG signals, and nonlinear fractal dimension (FD) features have proven effective in EEG studies. In my research, the hypothesis that changes in feelings (emotions) can be observed in the EEG as changes in fractal dimension values was validated. The FD model was studied, and FD values were proposed as one of the important features for classifying emotions. The Higuchi and Box Counting fractal dimension algorithms were implemented and studied with mono-fractal signals and EEG signals.
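As a rough illustration of the Higuchi algorithm mentioned above, a minimal Python sketch is given below; the function name, the choice of kmax, and the per-window usage are illustrative assumptions rather than the exact thesis implementation.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Minimal sketch of the Higuchi fractal dimension of a 1-D signal.

    x    : samples of one EEG channel in one analysis window (assumed)
    kmax : maximum time interval; 8 is an illustrative choice
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    ks = np.arange(1, kmax + 1)
    lengths = []                          # mean curve length L(k) for each k
    for k in ks:
        lm = []
        for m in range(k):                # k sub-series with offsets m = 0..k-1
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            # curve length of the sub-series, normalised as in Higuchi's method
            lm.append(np.sum(np.abs(np.diff(x[idx]))) * (n - 1)
                      / ((idx.size - 1) * k) / k)
        lengths.append(np.mean(lm))
    # FD is the slope of log L(k) versus log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope

# Example (shapes assumed): one FD value per channel of an analysis window
# fd_features = [higuchi_fd(window[ch]) for ch in range(window.shape[0])]
```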
Two types of real-time, subject-dependent emotion recognition algorithms were proposed. The first type is a novel real-time SVM-based algorithm that recognizes up to 8 emotions (satisfied, happy, surprised, protected, sad, unconcerned, angry and frightened) in the Valence-Arousal-Dominance emotional model. Using a fractal dimension feature to quantify the nonlinearity of the affective EEG signals in combination with statistical and Higher Order Crossings features gives the best accuracy. Only 4 channels from the frontal lobe are required, and in that case the mean recognition accuracy is 83.73% for 2 emotions, 74.36% for 3, 67.90% for 4, 63.07% for 5, 59.30% for 6, 56.24% for 7, and 53.70% for 8 emotions. Since the proposed algorithm is subject-dependent, a real-time emotion recognition system with a training session was implemented. The proposed algorithm has the following advantages: 1) the number of electrodes needed is reduced; 2) up to 8 emotions can be recognized with adequate accuracy; 3) the algorithm can be used in real time.

The second type of algorithm combines SVM-based Dominance-Arousal or Dominance recognition with threshold-based valence recognition. The hypothesis that the fractal dimension value can be used as an important index of emotional state assessment is confirmed. In the first case, where Dominance-Arousal is recognized by SVM and valence by thresholds, up to 16 emotions are recognized: four valence levels for each of the high arousal/high dominance, low arousal/high dominance, high arousal/low dominance and low arousal/low dominance combinations (namely activated/elated, joyful/happy, contempt/hostile, angry/frustrated, anxious/surprised, fascinated/loved, sinful/displeased, embarrassed/fearful, nonchalant/leisurely, relaxed/secure, mildly annoyed/disdainful, selfish/dissatisfied, solemn/quiet, humble/protected, fatigued/sad and bored/depressed). The proposed algorithm could be used in applications such as entertainment and e-learning, where different levels of valence or more types of emotions need to be identified. To achieve good recognition accuracy, the algorithm uses all electrodes of the EEG device. The best accuracy of 80.50% was obtained for the recognition of the four arousal-dominance combinations, and the best accuracy of 65.32% was obtained for 4-level valence recognition using the FD-based features. In the second case, where dominance is recognized by SVM and valence by thresholds, up to 9 levels of valence with controlled (high or low) dominance can be differentiated. This algorithm targets neuromarketing, where the aim is to find out customers' true feelings towards products; analysis of the self-assessment questionnaires showed that like/dislike is related to the dominance and valence dimensions of an emotion. To achieve good accuracy, the algorithm uses all electrodes of the EEG device. For dominance level recognition, the best accuracy obtained with the proposed algorithm is 89.72%. For valence level recognition, the best accuracy is 24.26% for 9 levels, 28.16% for 8 levels, 31.51% for 7 levels, 41.95% for 6 levels, 52.22% for 5 levels, 72.58% for 4 levels, 86.76% for 3 levels and 100% for 2 levels. Since the algorithms are subject-dependent, a real-time system with a training session was implemented.
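To make the combination concrete, here is a minimal sketch, assuming scikit-learn, of how an SVM-based dominance classifier might be paired with threshold-based valence level assignment; the function names, thresholds, and feature shapes are hypothetical and only illustrate the structure, not the thesis implementation.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_dominance_svm(train_features, train_dominance_labels):
    """Fit a subject-dependent SVM for high/low dominance (illustrative)."""
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    model.fit(train_features, train_dominance_labels)
    return model

def valence_level(fd_score, thresholds):
    """Map an FD-based score to a discrete valence level.

    thresholds : sorted, per-subject cut points calibrated from the
                 labeled training trials (hypothetical values)
    """
    return int(np.searchsorted(np.asarray(thresholds), fd_score))

# Hypothetical usage on one analysis window:
# dom_model = train_dominance_svm(X_train, y_dominance_train)
# dominance = dom_model.predict(window_features.reshape(1, -1))[0]
# valence = valence_level(window_fd, subject_thresholds)
# emotion = (dominance, valence)   # combined label, e.g. one of up to 16 emotions
```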
The second type of proposed algorithms has the following advantages: 1) the number of training sessions needed is reduced (for example, for 16-emotion recognition only 8 sessions are needed); 2) up to 16 emotions are recognized with adequate accuracy; 3) up to 9 levels of valence are recognized, which can be used in neuromarketing; 4) the algorithms can be used in real time. The performance of all proposed algorithms was tested with both the developed affective EEG databases and the benchmark affective EEG database DEAP. Comparisons with other state-of-the-art EEG-based emotion recognition methods are also given. Several real-time applications were designed and implemented with the proposed emotion recognition algorithms, such as music therapy, adaptive advertisement, adaptive games, an emotional companion, and an emotion-enabled music player.
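As a final illustration of the real-time setting, below is a minimal sketch of a sliding-window loop that could feed a trained, subject-dependent classifier in such applications; the sampling rate, window length, device callback, and the feature/classifier objects are assumptions made for the example only.

```python
import numpy as np
from collections import deque

SAMPLING_RATE_HZ = 128        # assumed EEG sampling rate
WINDOW_SECONDS = 4            # assumed analysis window length
window = deque(maxlen=SAMPLING_RATE_HZ * WINDOW_SECONDS)

def on_new_samples(samples, extract_features, classifier):
    """Handle a new chunk of single-channel samples from the EEG device.

    extract_features : callable turning a full window into a feature
                       vector (e.g. FD, statistical, HOC) -- placeholder
    classifier       : previously trained subject-dependent model -- placeholder
    """
    window.extend(samples)
    if len(window) < window.maxlen:
        return None                       # not enough data for a full window yet
    features = extract_features(np.asarray(window))
    emotion = classifier.predict(features.reshape(1, -1))[0]
    return emotion                        # e.g. adapt a game or the music player
```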
spelling ntu-10356/55410 2023-07-04T16:07:10Z EEG-based emotion recognition for real-time applications Liu, Yisi Olga Sourina School of Electrical and Electronic Engineering DRNTU::Engineering::Computer science and engineering::Computing methodologies::Pattern recognition DRNTU::Engineering::Electrical and electronic engineering::Electronic systems::Signal processing Doctor of Philosophy (EEE) 2014-03-03T08:36:17Z 2014-03-03T08:36:17Z 2014 2014 Thesis Liu, Y. (2014). EEG-based emotion recognition for real-time applications. Doctoral thesis, Nanyang Technological University, Singapore. http://hdl.handle.net/10356/55410 en 181 p. application/pdf