Enhanced Self-Perception in Mixed Reality: Egocentric Arm Segmentation and Database With Automatic Labeling

In this study, we focus on the egocentric segmentation of arms to improve self-perception in Augmented Virtuality (AV). The main contributions of this work are: i) a comprehensive survey of segmentation algorithms for AV; ii) an Egocentric Arm Segmentation Dataset (EgoArm), composed of more than 10,000 images, demographically inclusive (variations of skin color and gender), and open for research purposes. We also provide all details required for the automated generation of ground truth and semi-synthetic images; iii) the proposal of a deep learning network to segment arms in AV; iv) a detailed quantitative and qualitative evaluation to showcase the usefulness of the deep network and EgoArm dataset, reporting results on different real egocentric hand datasets, including GTEA Gaze+, EDSH, EgoHands, Ego Youtube Hands, THU-Read, TEgO, FPAB, and Ego Gesture, which allow for direct comparisons with existing approaches using color or depth. Results confirm the suitability of the EgoArm dataset for this task, achieving improvements of up to 40% with respect to the baseline network, depending on the particular dataset. Results also suggest that, while approaches based on color or depth can work under controlled conditions (lack of occlusion, uniform lighting, only objects of interest in the near range, controlled background, etc.), deep learning is more robust in real AV applications.
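
Illustrative note: the abstract describes compositing the user's segmented real arms into the virtual scene for self-perception in AV. The minimal Python sketch below shows only that general compositing idea, not the paper's network or the EgoArm dataset; predict_arm_mask is a hypothetical placeholder for any per-pixel arm segmentation model, and the frames are dummy arrays standing in for the egocentric camera feed and the rendered VR frame.

import numpy as np


def predict_arm_mask(rgb_frame: np.ndarray) -> np.ndarray:
    """Placeholder for a segmentation model (hypothetical).

    Returns a float mask in [0, 1] with the same height/width as the input,
    where 1 means 'arm pixel'. A real system would run a trained CNN here.
    """
    h, w, _ = rgb_frame.shape
    mask = np.zeros((h, w), dtype=np.float32)
    # Dummy stand-in: mark the lower-central region as 'arm'.
    mask[h // 2:, w // 4: 3 * w // 4] = 1.0
    return mask


def composite_arms(camera_frame: np.ndarray,
                   virtual_frame: np.ndarray,
                   threshold: float = 0.5) -> np.ndarray:
    """Overlay real arm pixels from the egocentric camera onto the virtual scene."""
    mask = predict_arm_mask(camera_frame) > threshold   # binary arm mask
    mask3 = np.repeat(mask[:, :, None], 3, axis=2)      # broadcast to RGB channels
    return np.where(mask3, camera_frame, virtual_frame)


if __name__ == "__main__":
    # Dummy frames in place of a real HMD camera feed and a rendered VR frame.
    cam = np.full((480, 640, 3), 128, dtype=np.uint8)
    vr = np.zeros((480, 640, 3), dtype=np.uint8)
    out = composite_arms(cam, vr)
    print(out.shape, out.dtype)

In a full AV pipeline the virtual frame would come from the VR renderer and the mask from the trained segmentation network described in the article.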


Bibliographic Details
Main Authors: Ester Gonzalez-Sosa, Pablo Perez, Ruben Tolosana, Redouane Kachach, Alvaro Villegas
Format: Article
Language: English
Published: IEEE, 2020-01-01
Series: IEEE Access
Subjects: Egocentric arm segmentation; Mixed reality; Augmented virtuality; Self-perception; Arm segmentation; Automatic labeling
Online Access: https://ieeexplore.ieee.org/document/9152989/
author Ester Gonzalez-Sosa
Pablo Perez
Ruben Tolosana
Redouane Kachach
Alvaro Villegas
collection DOAJ
description In this study, we focus on the egocentric segmentation of arms to improve self-perception in Augmented Virtuality (AV). The main contributions of this work are: i) a comprehensive survey of segmentation algorithms for AV; ii) an Egocentric Arm Segmentation Dataset (EgoArm), composed of more than 10,000 images, demographically inclusive (variations of skin color and gender), and open for research purposes. We also provide all details required for the automated generation of ground truth and semi-synthetic images; iii) the proposal of a deep learning network to segment arms in AV; iv) a detailed quantitative and qualitative evaluation to showcase the usefulness of the deep network and EgoArm dataset, reporting results on different real egocentric hand datasets, including GTEA Gaze+, EDSH, EgoHands, Ego Youtube Hands, THU-Read, TEgO, FPAB, and Ego Gesture, which allow for direct comparisons with existing approaches using color or depth. Results confirm the suitability of the EgoArm dataset for this task, achieving improvements of up to 40% with respect to the baseline network, depending on the particular dataset. Results also suggest that, while approaches based on color or depth can work under controlled conditions (lack of occlusion, uniform lighting, only objects of interest in the near range, controlled background, etc.), deep learning is more robust in real AV applications.
format Article
id doaj.art-ad91acc684b944029512d66c18c6b40b
institution Directory Open Access Journal
issn 2169-3536
language English
publishDate 2020-01-01
publisher IEEE
series IEEE Access
spelling Gonzalez-Sosa, E.; Perez, P.; Tolosana, R.; Kachach, R.; Villegas, A.: "Enhanced Self-Perception in Mixed Reality: Egocentric Arm Segmentation and Database With Automatic Labeling," IEEE Access, vol. 8, pp. 146887-146900, 2020. DOI: 10.1109/ACCESS.2020.3013016 (IEEE Xplore article 9152989). ISSN: 2169-3536. Record doaj.art-ad91acc684b944029512d66c18c6b40b, last updated 2022-12-21T22:40:11Z.
Author affiliations and ORCIDs: Ester Gonzalez-Sosa (Nokia Bell Labs, Madrid, Spain; https://orcid.org/0000-0002-2428-3792); Pablo Perez (Nokia Bell Labs, Madrid, Spain; https://orcid.org/0000-0002-3502-6791); Ruben Tolosana (Departamento de Tecnología Electrónica y de las Comunicaciones, Universidad Autónoma de Madrid, Madrid, Spain; https://orcid.org/0000-0002-9393-3066); Redouane Kachach (Nokia Bell Labs, Madrid, Spain); Alvaro Villegas (Nokia Bell Labs, Madrid, Spain).
title Enhanced Self-Perception in Mixed Reality: Egocentric Arm Segmentation and Database With Automatic Labeling
topic Egocentric arm segmentation
mixed reality
augmented virtuality
self-perception
arm segmentation
automatic labeling
url https://ieeexplore.ieee.org/document/9152989/