Automatically Annotated Dataset of a Ground Mobile Robot in Natural Environments via Gazebo Simulations

This paper presents a new synthetic dataset obtained from Gazebo simulations of an Unmanned Ground Vehicle (UGV) moving through different natural environments. To this end, a Husky mobile robot equipped with a three-dimensional (3D) Light Detection and Ranging (LiDAR) sensor, a stereo camera, a Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU) and wheel tachometers has followed several paths using the Robot Operating System (ROS). Both the points from LiDAR scans and the pixels from camera images have been automatically labeled with their corresponding object class. For this purpose, unique reflectivity values and flat colors have been assigned to each object present in the modeled environments. As a result, a public dataset, which also includes 3D pose ground truth, is provided as ROS bag files and as human-readable data. Potential applications include supervised learning and benchmarking for UGV navigation in natural environments. Moreover, to allow researchers to easily modify the dataset or to directly use the simulations, the required code has also been released.
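
Since the dataset is distributed as ROS bag files in which per-point class labels are encoded as unique reflectivity values, the sketch below illustrates how such labels could be recovered when consuming the data. This is only an illustrative example, not code from the released repository: the bag file name, the point-cloud topic name, the use of the `intensity` field, and the reflectivity-to-class values are all assumptions made for the sketch.

```python
# Hypothetical sketch (not from the paper): read labeled LiDAR points from a ROS bag
# and map each point's reflectivity value back to an object class.
import rosbag
import sensor_msgs.point_cloud2 as pc2

# Assumed mapping from reflectivity value to object class; the actual values are
# defined by the dataset and are not reproduced here.
REFLECTIVITY_TO_CLASS = {10.0: "ground", 20.0: "tree", 30.0: "rock"}  # placeholder values

with rosbag.Bag("simulation_run.bag") as bag:  # placeholder file name
    # Assumed PointCloud2 topic name; check the bag's topic list for the real one.
    for _, msg, _stamp in bag.read_messages(topics=["/points"]):
        for x, y, z, intensity in pc2.read_points(
            msg, field_names=("x", "y", "z", "intensity"), skip_nans=True
        ):
            # Each (x, y, z) point is assigned the class implied by its reflectivity value.
            label = REFLECTIVITY_TO_CLASS.get(intensity, "unknown")
```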

Bibliographic Details
Main Authors: Manuel Sánchez, Jesús Morales, Jorge L. Martínez, J. J. Fernández-Lozano, Alfonso García-Cerezo
Format: Article
Language: English
Published: MDPI AG, 2022-07-01
Series: Sensors
ISSN: 1424-8220
DOI: 10.3390/s22155599
Author Affiliation: Robotics and Mechatronics Lab, Andalucía Tech, Universidad de Málaga, 29071 Málaga, Spain (all authors)
Subjects: synthetic dataset; Gazebo simulator; UGV navigation; natural environments; automatic data labeling; 3D LiDAR
Online Access: https://www.mdpi.com/1424-8220/22/15/5599