Convolutional Neural Networks for Human Activity Recognition Using Body-Worn Sensors
Human activity recognition (HAR) is a classification task for recognizing human movements. Methods of HAR are of great interest as they have become tools for measuring occurrences and durations of human actions, which are the basis of smart assistive technologies and the analysis of manual processes. Recently, deep neural networks have been deployed for HAR in the context of activities of daily living using multichannel time-series. These time-series are acquired from body-worn devices, which are composed of different types of sensors. The deep architectures process these measurements for finding basic and complex features in human corporal movements, and for classifying them into a set of human actions. As the devices are worn at different parts of the human body, we propose a novel deep neural network for HAR. This network handles sequence measurements from different body-worn devices separately. An evaluation of the architecture is performed on three datasets, the Opportunity, PAMAP2, and an industrial dataset, outperforming the state of the art. In addition, different network configurations are also evaluated. We find that applying convolutions per sensor channel and per body-worn device improves the capabilities of convolutional neural networks (CNNs).
Main Authors: | Fernando Moya Rueda; René Grzeszick; Gernot A. Fink; Sascha Feldhorst; Michael ten Hompel |
Format: | Article |
Language: | English |
Published: | MDPI AG, 2018-05-01 |
Series: | Informatics |
Subjects: | human activity recognition; order picking; convolutional neural networks; multichannel time-series |
Online Access: | http://www.mdpi.com/2227-9709/5/2/26 |
author | Fernando Moya Rueda; René Grzeszick; Gernot A. Fink; Sascha Feldhorst; Michael ten Hompel |
collection | DOAJ |
description | Human activity recognition (HAR) is a classification task for recognizing human movements. Methods of HAR are of great interest as they have become tools for measuring occurrences and durations of human actions, which are the basis of smart assistive technologies and the analysis of manual processes. Recently, deep neural networks have been deployed for HAR in the context of activities of daily living using multichannel time-series. These time-series are acquired from body-worn devices, which are composed of different types of sensors. The deep architectures process these measurements for finding basic and complex features in human corporal movements, and for classifying them into a set of human actions. As the devices are worn at different parts of the human body, we propose a novel deep neural network for HAR. This network handles sequence measurements from different body-worn devices separately. An evaluation of the architecture is performed on three datasets, the Opportunity, PAMAP2, and an industrial dataset, outperforming the state of the art. In addition, different network configurations are also evaluated. We find that applying convolutions per sensor channel and per body-worn device improves the capabilities of convolutional neural networks (CNNs). |
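The description above outlines the core architectural idea: temporal convolutions applied per sensor channel, with one convolutional branch per body-worn device and late fusion of the branch features. Below is a minimal PyTorch sketch of that idea, not the authors' published implementation: the class names, layer counts, kernel sizes, window length, and per-device channel counts are hypothetical placeholders.

```python
# Hypothetical sketch of a per-device CNN for HAR (not the paper's exact
# architecture: layer counts, kernel sizes and channel counts are placeholders).
import torch
import torch.nn as nn

class DeviceBranch(nn.Module):
    """Temporal convolutions applied independently per sensor channel of one device.

    Input shape: [batch, 1, time, channels]; kernels of size (k, 1) convolve
    along the time axis only, so sensor channels are not mixed inside a branch.
    """
    def __init__(self, num_filters=64, kernel=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, num_filters, kernel_size=(kernel, 1)),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=(2, 1)),
            nn.Conv2d(num_filters, num_filters, kernel_size=(kernel, 1)),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=(2, 1)),
        )

    def forward(self, x):
        return torch.flatten(self.net(x), start_dim=1)

class MultiDeviceCNN(nn.Module):
    """One convolutional branch per body-worn device; branch features are
    concatenated and classified by a small fully connected head (late fusion)."""
    def __init__(self, window_len, channels_per_device, num_classes):
        super().__init__()
        self.branches = nn.ModuleList(DeviceBranch() for _ in channels_per_device)
        # Infer the fused feature size with a dummy forward pass.
        with torch.no_grad():
            feat = sum(
                b(torch.zeros(1, 1, window_len, c)).shape[1]
                for b, c in zip(self.branches, channels_per_device)
            )
        self.classifier = nn.Sequential(
            nn.Linear(feat, 256), nn.ReLU(), nn.Linear(256, num_classes)
        )

    def forward(self, device_windows):
        # device_windows: list of tensors, one per device, each [batch, 1, time, channels]
        fused = torch.cat([b(x) for b, x in zip(self.branches, device_windows)], dim=1)
        return self.classifier(fused)

# Example: two hypothetical devices with 9 and 6 sensor channels, 100-sample windows.
model = MultiDeviceCNN(window_len=100, channels_per_device=[9, 6], num_classes=12)
logits = model([torch.randn(4, 1, 100, 9), torch.randn(4, 1, 100, 6)])
```

Keeping the kernels one sensor channel wide is what makes the convolutions "per sensor channel", and giving each device its own branch keeps the devices separate until the fully connected classifier fuses them, which is the property the abstract reports as beneficial.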
first_indexed | 2024-12-12T13:37:56Z |
format | Article |
id | doaj.art-87208843905d4934bf5997492aaadac5 |
institution | Directory Open Access Journal |
issn | 2227-9709 |
language | English |
last_indexed | 2024-12-12T13:37:56Z |
publishDate | 2018-05-01 |
publisher | MDPI AG |
record_format | Article |
series | Informatics |
spelling | doaj.art-87208843905d4934bf5997492aaadac5 (2022-12-22T00:22:53Z). Convolutional Neural Networks for Human Activity Recognition Using Body-Worn Sensors. Informatics 5(2):26, 2018-05-01, MDPI AG, ISSN 2227-9709, DOI 10.3390/informatics5020026. Fernando Moya Rueda, René Grzeszick, Gernot A. Fink (Department of Computer Science, TU Dortmund University, 44227 Dortmund, Germany); Sascha Feldhorst, Michael ten Hompel (Fraunhofer IML, 44227 Dortmund, Germany). Keywords: human activity recognition; order picking; convolutional neural networks; multichannel time-series. http://www.mdpi.com/2227-9709/5/2/26 |
title | Convolutional Neural Networks for Human Activity Recognition Using Body-Worn Sensors |
topic | human activity recognition; order picking; convolutional neural networks; multichannel time-series |
url | http://www.mdpi.com/2227-9709/5/2/26 |