Device Position-Independent Human Activity Recognition with Wearable Sensors Using Deep Neural Networks

Human activity recognition (HAR) identifies people’s motions and actions in daily life. HAR research has grown with the popularity of internet-connected wearable sensors that capture human movement data to detect activities. Recent deep learning advances have enabled more HAR research and applications using data from wearable devices. However, prior HAR research has often focused on a few fixed sensor locations on the body, and recognizing real-world activities is challenging when device positioning is uncontrolled or initial user training data are unavailable. This research analyzes the feasibility of deep learning models for both position-dependent and position-independent HAR. We introduce an advanced residual deep learning model called Att-ResBiGRU, which excels at accurate position-dependent HAR and also delivers excellent performance for position-independent HAR. We evaluate this model on three public HAR datasets: Opportunity, PAMAP2, and REALWORLD16, comparing it with previously published deep learning architectures for these HAR challenges. Assessed with k-fold cross-validation, the proposed Att-ResBiGRU model outperforms existing techniques in accuracy, cross-entropy loss, and F1-score on all three datasets, achieving F1-scores of 86.69%, 96.23%, and 96.44% on PAMAP2, REALWORLD16, and Opportunity, respectively. Our experiments and analysis demonstrate the exceptional performance of the Att-ResBiGRU model for HAR applications.
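
The abstract names the proposed Att-ResBiGRU architecture but does not define it. As a purely illustrative aid, the following is a minimal PyTorch sketch of what an attention-plus-residual bidirectional-GRU classifier for windowed wearable-sensor data could look like; the class names, layer widths, and additive temporal-attention head are assumptions made for illustration, not the authors' published design.

import torch
import torch.nn as nn

# Hypothetical sketch inferred only from the name "Att-ResBiGRU";
# the record does not specify the actual architecture.

class ResBiGRUBlock(nn.Module):
    """Bidirectional GRU with a residual (skip) connection."""
    def __init__(self, dim):
        super().__init__()
        # hidden_size = dim // 2 so the bidirectional output matches the input width
        self.bigru = nn.GRU(dim, dim // 2, batch_first=True, bidirectional=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):               # x: (batch, time, dim)
        out, _ = self.bigru(x)
        return self.norm(x + out)       # residual addition, then normalization

class AttResBiGRU(nn.Module):
    """Stacked residual BiGRU blocks followed by temporal attention pooling."""
    def __init__(self, n_channels, n_classes, dim=128, n_blocks=2):
        super().__init__()
        self.proj = nn.Linear(n_channels, dim)   # lift raw sensor channels to dim
        self.blocks = nn.Sequential(*[ResBiGRUBlock(dim) for _ in range(n_blocks)])
        self.att = nn.Linear(dim, 1)             # additive attention scores per step
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x):               # x: (batch, time, n_channels)
        h = self.blocks(self.proj(x))
        w = torch.softmax(self.att(h), dim=1)    # attention weights over time
        ctx = (w * h).sum(dim=1)                 # weighted temporal pooling
        return self.head(ctx)                    # class logits

# Example: 3-axis accelerometer windows of 128 samples, 8 activity classes
model = AttResBiGRU(n_channels=3, n_classes=8)
logits = model(torch.randn(16, 128, 3))          # shape: (16, 8)

Setting the GRU hidden size to half the embedding width keeps the bidirectional output the same width as its input, which is what makes the residual addition shape-compatible.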

Bibliographic Details
Main Authors: Sakorn Mekruksavanich (Department of Computer Engineering, School of Information and Communication Technology, University of Phayao, Phayao 56000, Thailand); Anuchit Jitpattanakul (Department of Mathematics, Faculty of Applied Science, King Mongkut’s University of Technology North Bangkok, Bangkok 10800, Thailand)
Format: Article
Language: English
Published: MDPI AG, 2024-03-01
Series: Applied Sciences, Vol. 14, Iss. 5, Art. 2107
ISSN: 2076-3417
DOI: 10.3390/app14052107
Subjects: human activity recognition; position-independent sensing; wearable sensors; deep learning; residual neural network
Online Access: https://www.mdpi.com/2076-3417/14/5/2107