Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors

Bibliographic Details
Main Authors: Narit Hnoohom, Sakorn Mekruksavanich, Anuchit Jitpattanakul
Format: Article
Language: English
Published: MDPI AG 2023-01-01
Series: Electronics
Subjects: photoplethysmography; biosignal; activity recognition; deep residual network; wearable inertial sensor
Online Access: https://www.mdpi.com/2079-9292/12/3/693
_version_ 1797624744337473536
author Narit Hnoohom
Sakorn Mekruksavanich
Anuchit Jitpattanakul
author_facet Narit Hnoohom
Sakorn Mekruksavanich
Anuchit Jitpattanakul
author_sort Narit Hnoohom
collection DOAJ
description Human activity recognition (HAR) relies extensively on wearable inertial sensors, since these sensors provide the most informative non-visual time-series data for the task. HAR research has advanced significantly in recent years due to the proliferation of wearable devices with sensors. To improve recognition performance, HAR researchers have extensively investigated other sources of biosignals, such as photoplethysmography (PPG), for this task. PPG sensors measure the rate at which blood flows through the body, a rate regulated by the heart’s continuous pumping action. Although PPG signals were not originally intended for detecting body movement and gestures, we propose an innovative method for extracting relevant features from the PPG signal and use deep learning (DL) to predict physical activities. To accomplish the purpose of our study, we developed a deep residual network, referred to as PPG-NeXt, designed around convolutional operations, shortcut connections, and aggregated multi-branch transformations to efficiently identify different types of daily life activities from the raw PPG signal. In experiments on three benchmark datasets, the proposed model achieved an F1-score of more than 90% using only PPG data. Moreover, our results indicate that combining PPG and acceleration signals can further enhance activity recognition. Both biosignals, electrocardiography (ECG) and PPG, can differentiate between stationary activities (such as sitting) and non-stationary activities (such as cycling and walking) with a sufficient level of success. Overall, our results suggest that combining features from the ECG signal can be helpful in situations where pure tri-axial acceleration (3D-ACC) models have trouble differentiating between activities with similar relative motion (e.g., walking, stair climbing) but significant differences in their heart rate signatures.
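
The abstract describes PPG-NeXt as a deep residual network built from convolutional operations, shortcut connections, and an aggregated multi-branch transformation (the ResNeXt idea applied to 1D PPG windows). The sketch below is a minimal, hypothetical PyTorch illustration of one such block; the channel sizes, the cardinality of 32, and the 256-sample window length are assumptions for illustration, not the authors' published configuration.

# Hypothetical sketch: an aggregated residual block of the kind the abstract
# describes -- 1D convolutions, a shortcut connection, and a grouped
# ("multi-branch") transformation over a raw PPG window.
import torch
import torch.nn as nn


class AggregatedResidualBlock1D(nn.Module):
    """ResNeXt-style bottleneck block for a single-channel PPG time series."""

    def __init__(self, in_ch: int, mid_ch: int, out_ch: int, cardinality: int = 32):
        super().__init__()
        self.transform = nn.Sequential(
            nn.Conv1d(in_ch, mid_ch, kernel_size=1, bias=False),
            nn.BatchNorm1d(mid_ch),
            nn.ReLU(inplace=True),
            # groups=cardinality implements the aggregated multi-branch transformation
            nn.Conv1d(mid_ch, mid_ch, kernel_size=3, padding=1,
                      groups=cardinality, bias=False),
            nn.BatchNorm1d(mid_ch),
            nn.ReLU(inplace=True),
            nn.Conv1d(mid_ch, out_ch, kernel_size=1, bias=False),
            nn.BatchNorm1d(out_ch),
        )
        # Shortcut connection; a 1x1 conv is used only when channel counts differ.
        self.shortcut = (nn.Identity() if in_ch == out_ch
                         else nn.Conv1d(in_ch, out_ch, kernel_size=1, bias=False))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.relu(self.transform(x) + self.shortcut(x))


if __name__ == "__main__":
    # A batch of 8 raw PPG windows, 1 channel, 256 samples each (assumed length).
    ppg = torch.randn(8, 1, 256)
    block = AggregatedResidualBlock1D(in_ch=1, mid_ch=64, out_ch=128)
    print(block(ppg).shape)  # torch.Size([8, 128, 256])

Stacking several such blocks and ending with global pooling plus a softmax classifier would yield a classifier over the activity labels; the exact depth and fusion with acceleration channels would follow the paper itself.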
first_indexed 2024-03-11T09:46:51Z
format Article
id doaj.art-513a319e728e4aae804f3e83ca053237
institution Directory Open Access Journal
issn 2079-9292
language English
last_indexed 2024-03-11T09:46:51Z
publishDate 2023-01-01
publisher MDPI AG
record_format Article
series Electronics
spelling doaj.art-513a319e728e4aae804f3e83ca053237
2023-11-16T16:30:08Z
eng
MDPI AG
Electronics, ISSN 2079-9292, 2023-01-01, Vol. 12, Iss. 3, Art. 693
DOI: 10.3390/electronics12030693
Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors
Narit Hnoohom (Image Information and Intelligence Laboratory, Department of Computer Engineering, Faculty of Engineering, Mahidol University, Nakhon Pathom 73170, Thailand)
Sakorn Mekruksavanich (Department of Computer Engineering, School of Information and Communication Technology, University of Phayao, Phayao 56000, Thailand)
Anuchit Jitpattanakul (Intelligent and Nonlinear Dynamic Innovations Research Center, Science and Technology Research Institute, King Mongkut’s University of Technology North Bangkok, Bangkok 10800, Thailand)
https://www.mdpi.com/2079-9292/12/3/693
spellingShingle Narit Hnoohom
Sakorn Mekruksavanich
Anuchit Jitpattanakul
Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors
Electronics
photoplethysmography
biosignal
activity recognition
deep residual network
wearable inertial sensor
title Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors
title_full Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors
title_fullStr Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors
title_full_unstemmed Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors
title_short Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors
title_sort physical activity recognition based on deep learning using photoplethysmography and wearable inertial sensors
topic photoplethysmography
biosignal
activity recognition
deep residual network
wearable inertial sensor
url https://www.mdpi.com/2079-9292/12/3/693
work_keys_str_mv AT narithnoohom physicalactivityrecognitionbasedondeeplearningusingphotoplethysmographyandwearableinertialsensors
AT sakornmekruksavanich physicalactivityrecognitionbasedondeeplearningusingphotoplethysmographyandwearableinertialsensors
AT anuchitjitpattanakul physicalactivityrecognitionbasedondeeplearningusingphotoplethysmographyandwearableinertialsensors