Virtual lifeline: Multimodal sensor data fusion for robust navigation in unknown environments
We present a novel, multimodal indoor navigation technique that combines pedestrian dead reckoning (PDR) with relative position information from wireless sensor nodes. It is motivated by emergency response scenarios where no fixed or pre-deployed global positioning infrastructure is available and where typical motion patterns defeat standard PDR systems.
Main Author: | Widyawan, Widyawan |
---|---|
Format: | Article |
Language: | English |
Published: | Elsevier, 2012 |
Subjects: | Makalah lain-lain (Other papers) |
Online Access: | https://repository.ugm.ac.id/33073/1/Virtual_Lifeline_Multimodal_Sensor_Data_Fusion.pdf |
author | Widyawan, Widyawan |
collection | UGM |
description | We present a novel, multimodal indoor navigation technique that combines pedestrian dead reckoning (PDR) with relative position information from wireless sensor nodes. It is motivated by emergency response scenarios where no fixed or pre-deployed global positioning infrastructure is available and where typical motion patterns defeat standard PDR systems. We use RF and ultrasound beacons to periodically re-align the PDR system and reduce the impact of incremental error accumulation. Unlike previous work on multimodal positioning, we allow the beacons to be dynamically deployed (dropped by the user) at previously unknown locations. A key contribution of this paper is to show that, despite the fact that the beacon locations are not known (in terms of absolute coordinates), they significantly improve the performance of the system. This effect is especially relevant when a user re-traces (parts of) the path he or she had previously travelled, or lingers and moves around in an irregular pattern at single locations for extended periods of time. Both situations are common and relevant for emergency response scenarios. We describe the system architecture, the fusion algorithms, and provide an in-depth evaluation in a large-scale, realistic experiment. |
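The re-alignment idea in the abstract can be illustrated with a toy sketch. This is not the authors' algorithm (the paper fuses RF and ultrasound range measurements probabilistically); it is a hypothetical `DeadReckoner` class showing only the core notion that a dropped beacon's *estimated* position, recorded without any absolute coordinates, can later cancel drift accumulated since the drop:

```python
import math

class DeadReckoner:
    """Toy pedestrian dead reckoning: integrates noisy step vectors
    and re-aligns against beacons dropped at estimated positions."""

    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.beacons = {}  # beacon_id -> (x, y) position estimate at drop time

    def step(self, length, heading):
        # Accumulate one stride; heading/length errors build up over time.
        self.x += length * math.cos(heading)
        self.y += length * math.sin(heading)

    def drop_beacon(self, beacon_id):
        # Record the *estimated* position only; the beacon's absolute
        # location is never known, matching the paper's setting.
        self.beacons[beacon_id] = (self.x, self.y)

    def realign(self, beacon_id):
        # On re-encountering a beacon, snap back to its stored estimate,
        # discarding the drift accumulated since the drop.
        self.x, self.y = self.beacons[beacon_id]
```

In a real system the re-encounter would be detected via RF/ultrasound ranging, and the correction would be a weighted (e.g. particle-filter) update rather than a hard reset; the sketch keeps only the structure of the idea.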
first_indexed | 2024-03-05T23:20:06Z |
format | Article |
id | oai:generic.eprints.org:33073 |
institution | Universitas Gadjah Mada |
language | English |
last_indexed | 2024-03-13T19:12:40Z |
publishDate | 2012 |
publisher | Elsevier |
record_format | dspace |
spelling | Widyawan, Widyawan (2012) Virtual lifeline: Multimodal sensor data fusion for robust navigation in unknown environments. Pervasive and Mobile Computing, 8 (3), pp. 388-401. ISSN 1574-1192. Elsevier, June 2012. Peer reviewed; application/pdf; English. https://repository.ugm.ac.id/33073/1/Virtual_Lifeline_Multimodal_Sensor_Data_Fusion.pdf |
title | Virtual lifeline: Multimodal sensor data fusion for robust navigation in unknown environments |
topic | Makalah lain-lain (Other papers) |