Wearable EEG-based Activity Recognition in PHM-related Service Environment via Deep Learning
It is of paramount importance to track the cognitive activity or cognitive attention of service personnel in a Prognostics and Health Monitoring (PHM) service-related training or operation environment. Electroencephalography (EEG) data is a good candidate for cognitive activity recognition...
Main Authors: | Soumalya Sarkar, Kishore Reddy, Alex Dorgan, Cali Fidopiastis, Michael Giering
---|---|
Format: | Article
Language: | English
Published: | The Prognostics and Health Management Society, 2016-12-01
Series: | International Journal of Prognostics and Health Management
Subjects: | deep learning; wearables; phm training; activity recognition; eeg; sensor elimination; multi-modal sensor fusion
Online Access: | https://papers.phmsociety.org/index.php/ijphm/article/view/2459
_version_ | 1818677259513364480 |
---|---|
author | Soumalya Sarkar; Kishore Reddy; Alex Dorgan; Cali Fidopiastis; Michael Giering
author_facet | Soumalya Sarkar; Kishore Reddy; Alex Dorgan; Cali Fidopiastis; Michael Giering
author_sort | Soumalya Sarkar |
collection | DOAJ |
description | It is of paramount importance to track the cognitive activity or cognitive attention of service personnel in a Prognostics and Health Monitoring (PHM) service-related training or operation environment. Electroencephalography (EEG) data is a good candidate for recognizing the user's cognitive activity. Analyzing EEG data in an unconstrained (natural) environment to understand cognitive state and classify human activity is challenging for multiple reasons, such as low signal-to-noise ratio, transient nature, lack of a baseline, and uncontrolled mixing of various tasks. This paper proposes a framework based on deep learning, an emerging tool, that monitors human activity by fusing multiple EEG sensors and also selects a smaller sensor suite for a lean data collection system. Real-time classification of human activity from spatially non-collocated multi-probe EEG is performed with deep learning techniques, without significant data preprocessing or manual feature engineering. Two types of deep neural networks, a deep belief network (DBN) and a deep convolutional neural network (DCNN), are used at the core of the proposed framework; they automatically learn the features needed from EEG for a given classification task. Validation on an extensive amount of data, collected from several subjects performing multiple tasks (listening and watching) during a PHM service training session, is presented, and significant parallels are drawn from existing domain knowledge on EEG data interpretation. Comparison with benchmark machine learning techniques shows that deep-learning-based tools are better at understanding EEG data for task classification. Sensor selection shows that a significantly smaller EEG sensor suite can perform with accuracy comparable to that of the original sensor suite. |
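As a rough illustration of the kind of end-to-end pipeline the abstract describes (raw multi-probe EEG windows fed directly to a deep network, with no manual feature engineering), the sketch below builds a small 1-D convolutional classifier in PyTorch. This is not the authors' DCNN; the channel count, window length, class labels, and layer sizes are assumptions chosen only for illustration.

```python
# Hypothetical sketch (not the paper's implementation): a small 1-D CNN that
# classifies fixed-length multi-channel EEG windows into task labels,
# assuming 14 EEG channels, 128-sample windows, and 2 classes
# (e.g., listening vs. watching).
import torch
import torch.nn as nn

class EEGConvNet(nn.Module):
    def __init__(self, n_channels=14, n_samples=128, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),  # temporal filters over raw samples
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Two pooling stages shrink the time axis by a factor of 4.
        self.classifier = nn.Linear(64 * (n_samples // 4), n_classes)

    def forward(self, x):  # x: (batch, channels, samples)
        z = self.features(x)
        return self.classifier(z.flatten(1))

# Example forward pass on a random batch of 8 EEG windows.
model = EEGConvNet()
logits = model(torch.randn(8, 14, 128))
print(logits.shape)  # torch.Size([8, 2])
```

In this kind of setup, sensor selection could be explored by zeroing out or dropping input channels and comparing classification accuracy, which mirrors the lean-sensor-suite idea in the abstract without claiming to reproduce the paper's procedure.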
first_indexed | 2024-12-17T08:56:32Z |
format | Article |
id | doaj.art-082bfd4fcae74ab3adfe446e8e6a4092 |
institution | Directory Open Access Journal |
issn | 2153-2648
language | English |
last_indexed | 2024-12-17T08:56:32Z |
publishDate | 2016-12-01 |
publisher | The Prognostics and Health Management Society |
record_format | Article |
series | International Journal of Prognostics and Health Management |
spelling | doaj.art-082bfd4fcae74ab3adfe446e8e6a4092 | 2022-12-21T21:55:55Z | eng | The Prognostics and Health Management Society | International Journal of Prognostics and Health Management | 2153-2648 | 2016-12-01 | Vol. 7, Iss. 4 | doi:10.36001/ijphm.2016.v7i4.2459 | Wearable EEG-based Activity Recognition in PHM-related Service Environment via Deep Learning | Soumalya Sarkar, Kishore Reddy, Alex Dorgan, Cali Fidopiastis, Michael Giering (United Technologies Research Center, East Hartford, CT 06118, USA) | https://papers.phmsociety.org/index.php/ijphm/article/view/2459 | deep learning; wearables; phm training; activity recognition; eeg; sensor elimination; multi-modal sensor fusion |
spellingShingle | Soumalya Sarkar; Kishore Reddy; Alex Dorgan; Cali Fidopiastis; Michael Giering | Wearable EEG-based Activity Recognition in PHM-related Service Environment via Deep Learning | International Journal of Prognostics and Health Management | deep learning; wearables; phm training; activity recognition; eeg; sensor elimination; multi-modal sensor fusion |
title | Wearable EEG-based Activity Recognition in PHM-related Service Environment via Deep Learning |
title_full | Wearable EEG-based Activity Recognition in PHM-related Service Environment via Deep Learning |
title_fullStr | Wearable EEG-based Activity Recognition in PHM-related Service Environment via Deep Learning |
title_full_unstemmed | Wearable EEG-based Activity Recognition in PHM-related Service Environment via Deep Learning |
title_short | Wearable EEG-based Activity Recognition in PHM-related Service Environment via Deep Learning |
title_sort | wearable eeg based activity recognition in phm related service environment via deep learning |
topic | deep learning; wearables; phm training; activity recognition; eeg; sensor elimination; multi-modal sensor fusion |
url | https://papers.phmsociety.org/index.php/ijphm/article/view/2459 |
work_keys_str_mv | AT soumalyasarkar wearableeegbasedactivityrecognitioninphmrelatedserviceenvironmentviadeeplearning AT kishorereddy wearableeegbasedactivityrecognitioninphmrelatedserviceenvironmentviadeeplearning AT alexdorgan wearableeegbasedactivityrecognitioninphmrelatedserviceenvironmentviadeeplearning AT califidopiastis wearableeegbasedactivityrecognitioninphmrelatedserviceenvironmentviadeeplearning AT michaelgiering wearableeegbasedactivityrecognitioninphmrelatedserviceenvironmentviadeeplearning |