ResNet-SE: Channel Attention-Based Deep Residual Network for Complex Activity Recognition Using Wrist-Worn Wearable Sensors

Bibliographic Details
Main Authors: Sakorn Mekruksavanich, Anuchit Jitpattanakul, Kanokwan Sitthithakerngkiet, Phichai Youplao, Preecha Yupapin
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9771436/
Description
Summary: Smart mobile devices are widely used to identify and track human behavior in simple and complex daily activities. The evolution of wearable sensing technologies for wellness, daily-living monitoring, and fitness tracking relies on accurate analysis of people's behavior from data acquired through the sensors embedded in smart devices, especially wrist-worn wearables such as smartwatches. Many deep learning techniques have been developed for human activity recognition (HAR), focusing mainly on simple daily activities. However, several challenges remain in complex HAR research, which involves specific human behaviors in different contexts. To address complex HAR, this work developed a deep neural network composed of convolutional layers and residual blocks, with channel attention incorporated through a squeeze-and-excitation mechanism. Model effectiveness was evaluated on three publicly available datasets (WISDM-HARB, UT-Smoke, and UT-Complex). The proposed network achieved overall accuracies of 94.91%, 98.75%, and 97.73% on WISDM-HARB, UT-Smoke, and UT-Complex, respectively. The results showed that the proposed deep residual network is more robust and achieves better activity recognition than existing models.
ISSN:2169-3536
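
For context, the sketch below illustrates how a squeeze-and-excitation (SE) channel-attention block can be combined with a 1D residual convolutional block for wrist-sensor time series, as described in the summary. It is a minimal illustration in PyTorch, not the authors' exact ResNet-SE: the layer widths, kernel size, reduction ratio, and input window shape are assumptions.

# Illustrative sketch (assumed hyperparameters, not the published architecture):
# a 1D residual block with squeeze-and-excitation channel attention for
# wearable-sensor HAR.
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-excitation: reweight channels by globally pooled statistics."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        w = x.mean(dim=-1)            # squeeze: global average pool over time
        w = self.fc(w).unsqueeze(-1)  # excite: per-channel weights in (0, 1)
        return x * w                  # rescale each channel


class ResidualSEBlock(nn.Module):
    """Two 1D convolutions + SE attention, with an identity (or projected) skip."""
    def __init__(self, in_ch: int, out_ch: int, kernel_size: int = 5):
        super().__init__()
        pad = kernel_size // 2
        self.body = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size, padding=pad),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv1d(out_ch, out_ch, kernel_size, padding=pad),
            nn.BatchNorm1d(out_ch),
        )
        self.se = SEBlock(out_ch)
        self.skip = (nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch
                     else nn.Identity())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.se(self.body(x)) + self.skip(x))


if __name__ == "__main__":
    # Assumed input: windows of 6 sensor channels (e.g. 3-axis accelerometer
    # and gyroscope) with 128 samples per window.
    x = torch.randn(8, 6, 128)
    block = ResidualSEBlock(in_ch=6, out_ch=64)
    print(block(x).shape)  # torch.Size([8, 64, 128])

In a full model, several such blocks would typically be stacked and followed by global pooling and a classification layer sized to the number of activity classes in the target dataset.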