A Hierarchical Deep Fusion Framework for Egocentric Activity Recognition using a Wearable Hybrid Sensor System
Main Authors: | Haibin Yu, Guoxiong Pan, Mian Pan, Chong Li, Wenyan Jia, Li Zhang, Mingui Sun |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2019-01-01 |
Series: | Sensors |
Subjects: | deep learning; egocentric activity recognition; hierarchical fusion framework; wearable sensor system |
Online Access: | https://www.mdpi.com/1424-8220/19/3/546 |
author | Haibin Yu; Guoxiong Pan; Mian Pan; Chong Li; Wenyan Jia; Li Zhang; Mingui Sun
collection | DOAJ |
description | Recently, egocentric activity recognition has attracted considerable attention in the pattern recognition and artificial intelligence communities because of its wide applicability in medical care, smart homes, and security monitoring. In this study, we developed and implemented a deep-learning-based hierarchical fusion framework for the recognition of egocentric activities of daily living (ADLs) in a wearable hybrid sensor system comprising motion sensors and cameras. A long short-term memory (LSTM) network and a convolutional neural network (CNN) operate in different layers of the framework, performing egocentric ADL recognition from motion sensor data and from the photo stream, respectively. The motion sensor data are used solely to classify activities by motion state, while the photo stream is used for further specific activity recognition within each motion-state group. Thus, both the motion sensor data and the photo stream work in their most suitable classification mode, which significantly reduces the negative influence of sensor differences on the fusion results. Experimental results show that the proposed method is not only more accurate than the existing direct fusion method (by up to 6%) but also avoids the time-consuming computation of optical flow required by that method, making the proposed algorithm less complex and more suitable for practical application.
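The two-stage pipeline the abstract describes (motion sensors select a coarse motion state, then a CNN resolves the specific activity within that state's group) can be illustrated in code. The following is a minimal sketch in PyTorch; the class names, layer sizes, and the motion-state/activity grouping are hypothetical placeholders, not the authors' published architecture.

```python
# Minimal sketch of a hierarchical LSTM -> CNN fusion pipeline.
# All names, sizes, and the GROUPS mapping are illustrative assumptions.
import torch
import torch.nn as nn

class MotionStateLSTM(nn.Module):
    """Stage 1: classify a coarse motion state from a motion-sensor window."""
    def __init__(self, n_channels=6, hidden=64, n_states=3):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_states)

    def forward(self, x):            # x: (batch, time, channels)
        _, (h, _) = self.lstm(x)     # h: (1, batch, hidden)
        return self.head(h[-1])      # logits over coarse motion states

class ActivityCNN(nn.Module):
    """Stage 2: recognize a specific activity from an egocentric photo,
    restricted to the activities belonging to one motion-state group."""
    def __init__(self, n_activities):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_activities)

    def forward(self, img):          # img: (batch, 3, H, W)
        return self.head(self.features(img).flatten(1))

# Hypothetical grouping: each coarse motion state owns its own label set.
GROUPS = {0: ["sitting:reading", "sitting:eating"],
          1: ["standing:cooking", "standing:washing"],
          2: ["walking:outdoors", "walking:shopping"]}

def recognize(motion_window, photo, lstm, cnns):
    """Hierarchical inference: motion sensors pick the group, the photo
    stream resolves the specific activity within that group."""
    state = lstm(motion_window).argmax(dim=1).item()
    activity = cnns[state](photo).argmax(dim=1).item()
    return GROUPS[state][activity]

# Example wiring (hypothetical shapes):
# lstm = MotionStateLSTM(n_channels=6, n_states=len(GROUPS))
# cnns = {s: ActivityCNN(len(a)) for s, a in GROUPS.items()}
# label = recognize(torch.randn(1, 100, 6), torch.randn(1, 3, 224, 224), lstm, cnns)
```

Because the second stage classifies still photos within an already-selected group, no optical-flow computation over the image sequence is required, which is the source of the complexity advantage claimed in the abstract.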
format | Article |
id | doaj.art-ac58bc96baca4275bb0b51d970135bae |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
publishDate | 2019-01-01 |
publisher | MDPI AG |
series | Sensors |
doi | 10.3390/s19030546
citation | Sensors, Vol. 19, Iss. 3, Art. 546 (2019-01-01)
affiliations | Haibin Yu, Guoxiong Pan, Mian Pan, Chong Li: College of Electronics and Information, Hangzhou Dianzi University, Hangzhou 310018, China. Wenyan Jia: Department of Electrical and Computer Engineering, University of Pittsburgh, PA 15261, USA. Li Zhang: School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, China. Mingui Sun: Department of Neurological Surgery, University of Pittsburgh, PA 15213, USA.
title | A Hierarchical Deep Fusion Framework for Egocentric Activity Recognition using a Wearable Hybrid Sensor System |
topic | deep learning; egocentric activity recognition; hierarchical fusion framework; wearable sensor system
url | https://www.mdpi.com/1424-8220/19/3/546 |