The TinyV3RSE Hardware-in-the-Loop Vision-Based Navigation Facility


Bibliographic Details
Main Authors: Paolo Panicucci, Francesco Topputo
Format: Article
Language: English
Published: MDPI AG, 2022-11-01
Series: Sensors
Subjects: optical camera testing; vision-based navigation; hardware-in-the-loop simulations; optical test bench; verification and validation
Online Access: https://www.mdpi.com/1424-8220/22/23/9333
Description: The increase in the number of interplanetary probes has emphasized the need for spacecraft autonomy to reduce overall mission costs and to enable riskier operations without ground support. Perception of the external environment is a critical task for autonomous probes, being fundamental for both motion planning and actuation. Perception is often achieved using navigation sensors that provide measurements of the external environment. For space exploration purposes, cameras are among the sensors that provide navigation information with few constraints at the spacecraft system level. Image-processing and vision-based navigation algorithms are exploited to extract information about the external environment, and the probe's position within it, from images. It is thus crucial to be able to generate realistic image datasets to design, validate, and test autonomous algorithms. This goal is achieved with high-fidelity rendering engines and with hardware-in-the-loop simulations. This work focuses on the latter by presenting a facility developed and used at the Deep-space Astrodynamics Research and Technology (DART) Laboratory at Politecnico di Milano. First, the facility design relationships are established to select hardware components: the critical design parameters of the camera, lens system, and screen are identified, and analytical relationships are developed among them. Second, the performance achievable with the chosen components is studied analytically and numerically in terms of geometrical accuracy and optical distortions. Third, calibration procedures compensating for hardware misalignment and errors are defined, and their performance is evaluated in a laboratory experiment to assess the calibration quality. Finally, the facility's applicability is demonstrated by testing image-processing algorithms for space exploration scenarios.
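The first design step in the abstract concerns analytical relationships among camera, lens-system, and screen parameters. As an illustration only, a minimal pinhole-model sketch compares the camera's per-pixel angular resolution with the angle a screen pixel subtends through a collimating lens; all component values below are hypothetical, and the sizing rule shown is a common one for screen-stimulation benches, not necessarily the one used in the article:

```python
import math


def angular_pitch_rad(pixel_pitch_m: float, focal_length_m: float) -> float:
    """Small-angle approximation of the angle one pixel subtends
    when seen through a lens of the given focal length."""
    return pixel_pitch_m / focal_length_m


# Hypothetical component values, for illustration only.
cam_pixel_pitch = 5.5e-6    # camera detector pixel pitch [m]
cam_focal_length = 16e-3    # camera lens focal length [m]
screen_pixel_pitch = 55e-6  # display screen pixel pitch [m]
collimator_focal = 500e-3   # collimating lens focal length [m]

# Angle one camera pixel sees (instantaneous field of view).
cam_ifov = angular_pitch_rad(cam_pixel_pitch, cam_focal_length)
# Angle one screen pixel subtends after collimation.
screen_ang = angular_pitch_rad(screen_pixel_pitch, collimator_focal)

# A common sizing rule: each camera pixel should span at least one
# screen pixel, so the screen's angular sampling does not limit the
# stimulated camera.
ratio = cam_ifov / screen_ang

print(f"camera IFOV:  {math.degrees(cam_ifov) * 3600:.1f} arcsec/pixel")
print(f"screen pixel: {math.degrees(screen_ang) * 3600:.1f} arcsec/pixel")
print(f"screen pixels per camera pixel: {ratio:.2f}")
```

With these example numbers the camera pixel covers about three screen pixels, so the screen would not be the limiting angular resolution; the actual trade-off in the facility also involves the distortions and misalignments the article's calibration step compensates for.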
Institution: Directory of Open Access Journals (DOAJ)
ISSN: 1424-8220
Citation: Sensors, vol. 22, no. 23, art. 9333, published 2022-11-01
DOI: 10.3390/s22239333
Affiliation: Both authors are with the Department of Aerospace Science and Technology, Politecnico di Milano, Via Giuseppe La Masa, 34, 20156 Milano, Italy.