A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots
To meet the increasing demands of mobile service robot applications, a dedicated perception module is an essential requirement for interaction with users in real-world scenarios. In particular, multi-sensor fusion and human re-identification are recognized as active research fronts. In this paper, we…
Main Authors: | Steffen Müller, Tim Wengefeld, Thanh Quang Trinh, Dustin Aganian, Markus Eisenbach, Horst-Michael Gross |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-01-01 |
Series: | Sensors |
Subjects: | multi modal person tracking, sensor fusion, user centered robot navigation |
Online Access: | https://www.mdpi.com/1424-8220/20/3/722 |
_version_ | 1798005150498947072 |
---|---|
author | Steffen Müller, Tim Wengefeld, Thanh Quang Trinh, Dustin Aganian, Markus Eisenbach, Horst-Michael Gross |
author_facet | Steffen Müller, Tim Wengefeld, Thanh Quang Trinh, Dustin Aganian, Markus Eisenbach, Horst-Michael Gross |
author_sort | Steffen Müller |
collection | DOAJ |
description | To meet the increasing demands of mobile service robot applications, a dedicated perception module is an essential requirement for interaction with users in real-world scenarios. In particular, multi-sensor fusion and human re-identification are recognized as active research fronts. In this paper, we contribute to these topics and present a modular detection and tracking system that models the position and additional properties of persons in the surroundings of a mobile robot. The proposed system introduces a probability-based data association method that, besides position, can incorporate face and color-based appearance features in order to re-identify persons when tracking is interrupted. The system combines the results of several state-of-the-art image-based detectors for person recognition, person identification, and attribute estimation. This allows a stable estimate of the mobile robot’s user, even in complex, cluttered environments with long-lasting occlusions. In our benchmark, we introduce a new measure of tracking consistency and show the improvement achieved when face-based and appearance-based re-identification are combined. The tracking system was deployed in a real-world application with a mobile rehabilitation assistant robot in a public hospital. The estimated person states are used for user-centered navigation behaviors, e.g., guiding or approaching a person, as well as for socially acceptable navigation in public environments. (A minimal illustrative sketch of this multi-cue data association idea follows the record below.) |
first_indexed | 2024-04-11T12:34:29Z |
format | Article |
id | doaj.art-72ca88e90c224df8be284f6e30c344f9 |
institution | Directory Open Access Journal |
issn | 1424-8220 |
language | English |
last_indexed | 2024-04-11T12:34:29Z |
publishDate | 2020-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Sensors |
spelling | doaj.art-72ca88e90c224df8be284f6e30c344f9 (2022-12-22T04:23:40Z); eng; MDPI AG; Sensors; 1424-8220; 2020-01-01; vol. 20, issue 3, article 722; doi:10.3390/s20030722 (s20030722); A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots; Steffen Müller, Tim Wengefeld, Thanh Quang Trinh, Dustin Aganian, Markus Eisenbach, Horst-Michael Gross (all: Neuroinformatics and Cognitive Robotics Lab of Technische Universität Ilmenau, 98684 Ilmenau, Germany); abstract as given in the description field above; https://www.mdpi.com/1424-8220/20/3/722; keywords: multi modal person tracking, sensor fusion, user centered robot navigation |
spellingShingle | Steffen Müller; Tim Wengefeld; Thanh Quang Trinh; Dustin Aganian; Markus Eisenbach; Horst-Michael Gross; A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots; Sensors; multi modal person tracking; sensor fusion; user centered robot navigation |
title | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_full | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_fullStr | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_full_unstemmed | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_short | A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots |
title_sort | multi modal person perception framework for socially interactive mobile service robots |
topic | multi modal person tracking, sensor fusion, user centered robot navigation |
url | https://www.mdpi.com/1424-8220/20/3/722 |
work_keys_str_mv | AT steffenmuller amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT timwengefeld amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT thanhquangtrinh amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT dustinaganian amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT markuseisenbach amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT horstmichaelgross amultimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT steffenmuller multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT timwengefeld multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT thanhquangtrinh multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT dustinaganian multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT markuseisenbach multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots AT horstmichaelgross multimodalpersonperceptionframeworkforsociallyinteractivemobileservicerobots |
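The abstract describes a probability-based data association step that fuses a person's position with face and color-based appearance cues so that a track can be re-identified after an interruption. The paper is not accompanied by reference code in this record, so the following is only a minimal Python sketch of that general idea; the cue likelihoods, the product-based fusion, the `new_track_score` threshold, and all class and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from dataclasses import dataclass

# Hypothetical containers; field names and dimensionalities are illustrative.
@dataclass
class Detection:
    position: np.ndarray                        # 2-D position in world coordinates (metres)
    color_hist: np.ndarray                      # normalised colour histogram of the person's clothing
    face_embedding: np.ndarray | None = None    # e.g. a face descriptor vector, if a face was visible

@dataclass
class Track:
    track_id: int
    position: np.ndarray                        # last predicted position of the tracked person
    color_hist: np.ndarray | None = None        # appearance model accumulated for this track
    face_embedding: np.ndarray | None = None

def position_likelihood(track: Track, det: Detection, sigma: float = 0.5) -> float:
    """Gaussian likelihood of the detection given the track's predicted position."""
    d2 = float(np.sum((track.position - det.position) ** 2))
    return float(np.exp(-d2 / (2.0 * sigma ** 2)))

def face_likelihood(track: Track, det: Detection) -> float:
    """Cosine-similarity face cue mapped to [0, 1]; neutral (0.5) if no face is available."""
    if track.face_embedding is None or det.face_embedding is None:
        return 0.5
    a, b = track.face_embedding, det.face_embedding
    cos = float(a @ b) / (float(np.linalg.norm(a) * np.linalg.norm(b)) + 1e-9)
    return 0.5 * (cos + 1.0)

def color_likelihood(track: Track, det: Detection) -> float:
    """Histogram-intersection appearance cue; neutral if the track has no appearance model yet."""
    if track.color_hist is None:
        return 0.5
    return float(np.minimum(track.color_hist, det.color_hist).sum())

def associate(tracks: list[Track], det: Detection, new_track_score: float = 0.05):
    """Return the best-matching track and its combined score, or (None, score)
    when every combined likelihood falls below `new_track_score`, i.e. start a new track."""
    best_track, best_score = None, new_track_score
    for track in tracks:
        score = (position_likelihood(track, det)
                 * face_likelihood(track, det)
                 * color_likelihood(track, det))
        if score > best_score:
            best_track, best_score = track, score
    return best_track, best_score
```

In a scheme of this kind, the position cue dominates while tracking is continuous, whereas the face and colour cues carry the association after a long occlusion, when the predicted position has become uninformative.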