Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications
Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its widespread use in real-world applications, a high level of generalization needs to be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training...
Main Authors: | Alejandro Cartas, Petia Radeva, Mariella Dimiccoli |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
Subjects: | Daily activity recognition; visual lifelogs; domain adaptation; wearable cameras |
Online Access: | https://ieeexplore.ieee.org/document/9078767/ |
_version_ | 1818875466941988864 |
---|---|
author | Alejandro Cartas; Petia Radeva; Mariella Dimiccoli |
author_sort | Alejandro Cartas |
collection | DOAJ |
description | Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its widespread use in real-world applications, a high level of generalization needs to be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants of state-of-the-art deep learning architectures, and we show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarce, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset can easily be transferred to other domains with a very small amount of labeled data. Taken together, these results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications. |
first_indexed | 2024-12-19T13:26:57Z |
format | Article |
id | doaj.art-624f0ebeb0c1485484a6293e80ff1d85 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-19T13:26:57Z |
publishDate | 2020-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-624f0ebeb0c1485484a6293e80ff1d85; 2022-12-21T20:19:32Z; eng; IEEE; IEEE Access; ISSN 2169-3536; published 2020-01-01; vol. 8, pp. 77344-77363; DOI 10.1109/ACCESS.2020.2990333; IEEE document 9078767; Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications; Alejandro Cartas (https://orcid.org/0000-0002-4440-9954), Mathematics and Computer Science Department, University of Barcelona, Barcelona, Spain; Petia Radeva, Mathematics and Computer Science Department, University of Barcelona, Barcelona, Spain; Mariella Dimiccoli, Institut de Robòtica i Informàtica Industrial, CSIC-UPC, Barcelona, Spain; https://ieeexplore.ieee.org/document/9078767/; Daily activity recognition; visual lifelogs; domain adaptation; wearable cameras |
title | Activities of Daily Living Monitoring via a Wearable Camera: Toward Real-World Applications |
topic | Daily activity recognition; visual lifelogs; domain adaptation; wearable cameras |
url | https://ieeexplore.ieee.org/document/9078767/ |
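The description field above notes that a model trained on the authors' dataset "can easily be transferred to other domains with a very small amount of labeled data." The record does not include any code, so the following is only a minimal, hypothetical sketch of that kind of few-shot transfer step in PyTorch, assuming an ImageNet-pretrained ResNet-18 backbone; the class count NUM_ACTIVITIES, the adapt() helper, and the small_labelled_loader are illustrative names, not the authors' actual architectures or adaptation strategy (those are described in the linked paper).

```python
# Hypothetical few-shot transfer sketch (not the authors' code): adapt an
# ImageNet-pretrained CNN to a new wearable-camera domain using only a small
# labelled subset, in the spirit of the abstract's domain-adaptation claim.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models

NUM_ACTIVITIES = 21  # assumption: number of activity classes in the target dataset
DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Start from an ImageNet-pretrained backbone and replace the classifier head.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, NUM_ACTIVITIES)

# Freeze the backbone so the few labelled target images only train the new head.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith("fc.")
model.to(DEVICE)

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
criterion = nn.CrossEntropyLoss()


def adapt(model: nn.Module, small_labelled_loader: DataLoader, epochs: int = 5) -> nn.Module:
    """Fine-tune the classifier head on a small labelled target-domain loader."""
    model.train()
    for _ in range(epochs):
        for images, labels in small_labelled_loader:
            images, labels = images.to(DEVICE), labels.to(DEVICE)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```

In practice one would wrap the handful of labelled target-domain images in a standard DataLoader and call adapt(model, loader); freezing the backbone is just one simple stand-in for the domain-adaptation strategy evaluated in the paper.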