Real-Time Sonar Fusion for Layered Navigation Controller


Bibliographic Details
Main Authors: Wouter Jansen, Dennis Laurijssen, Jan Steckel
Format: Article
Language: English
Published: MDPI AG 2022-04-01
Series: Sensors
Subjects: sonar; vehicle control; biologically-inspired; robotics; sensor fusion; indoor navigation
Online Access: https://www.mdpi.com/1424-8220/22/9/3109
author Wouter Jansen
Dennis Laurijssen
Jan Steckel
collection DOAJ
description Navigation in varied and dynamic indoor environments remains a complex task for autonomous mobile platforms. When conditions worsen, typical sensor modalities may fail to operate optimally and subsequently provide unsuitable input for safe navigation control. In this study, we present an approach for navigating a dynamic indoor environment with a mobile platform carrying one or several sonar sensors, using a layered control system. These sensors can operate in conditions such as rain, fog, dust, or dirt. The different control layers, such as collision avoidance and corridor-following behavior, are activated based on acoustic flow cues in the fusion of the sonar images. The novelty of this work lies in allowing these sensors to be freely positioned on the mobile platform and in providing a framework for designing the desired navigational outcome based on a zoning system around the mobile platform. This paper presents the acoustic flow model used as well as the design of the layered controller. In addition to validation in simulation, an implementation is presented and validated in real time in a real office environment, using a real mobile platform with one, two, or three sonar sensors performing 2D navigation. Multiple sensor layouts were validated in both the simulation and the real experiments to demonstrate that the modular approach to the controller and sensor fusion works as intended. The results show stable and safe navigation of indoor environments with dynamic objects.
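To illustrate the kind of layered control the abstract describes, the following is a minimal sketch (not the authors' implementation; the zone names, thresholds, and gains are illustrative assumptions): sonar-derived ranges are grouped into zones around the platform, and layers are checked in priority order, with the highest-priority layer whose activation condition is met producing the velocity command.

```python
# Minimal subsumption-style layered controller (illustrative sketch only).
# Zones are hypothetical sonar range summaries around the platform, in meters.
from dataclasses import dataclass

@dataclass
class Command:
    v: float      # forward velocity (m/s)
    omega: float  # angular velocity (rad/s), positive = left turn

def collision_avoidance(zones):
    # Highest priority: activates when a frontal zone reports an obstacle
    # closer than 0.5 m; stop and turn away from the nearer side.
    front = min(zones["front_left"], zones["front_right"])
    if front < 0.5:
        turn = 1.0 if zones["front_left"] < zones["front_right"] else -1.0
        return Command(v=0.0, omega=turn)
    return None  # layer not active

def corridor_following(zones):
    # Activates when walls are sensed on both sides; steer toward the
    # centerline proportionally to the left/right range difference.
    left, right = zones["left"], zones["right"]
    if left < 2.0 and right < 2.0:
        return Command(v=0.3, omega=0.5 * (left - right))
    return None

def cruise(zones):
    # Default layer: drive straight ahead at cruise speed.
    return Command(v=0.3, omega=0.0)

LAYERS = [collision_avoidance, corridor_following, cruise]  # priority order

def control(zones):
    # First active layer wins; cruise always activates, so a command
    # is guaranteed.
    for layer in LAYERS:
        cmd = layer(zones)
        if cmd is not None:
            return cmd

# Example: an open corridor slightly nearer the left wall -> the
# corridor-following layer steers gently toward the centerline.
cmd = control({"front_left": 3.0, "front_right": 3.0, "left": 1.0, "right": 1.4})
```

In the paper's approach the activation conditions come from acoustic flow cues in the fused sonar images rather than from raw range thresholds, but the priority-ordered arbitration between behaviors follows the same pattern.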
format Article
id doaj.art-dac13a0f5c5b41d2ab6328016b83a436
institution Directory Open Access Journal
issn 1424-8220
language English
publishDate 2022-04-01
publisher MDPI AG
series Sensors
doi 10.3390/s22093109
citation Sensors, vol. 22, no. 9, art. 3109 (published 2022-04-01)
affiliation Cosys-Lab, Faculty of Applied Engineering, University of Antwerp, 2020 Antwerpen, Belgium (all three authors)
title Real-Time Sonar Fusion for Layered Navigation Controller
topic sonar
vehicle control
biologically-inspired
robotics
sensor fusion
indoor navigation
url https://www.mdpi.com/1424-8220/22/9/3109