Deep Learning Crater Detection for Lunar Terrain Relative Navigation

Bibliographic Details
Format: Article
Language: English
Published: American Institute of Aeronautics and Astronautics (AIAA) 2021
Online Access: https://hdl.handle.net/1721.1/137175
collection MIT
description © 2020, American Institute of Aeronautics and Astronautics Inc, AIAA. All rights reserved. Terrain relative navigation can improve the precision of a spacecraft’s position estimate by providing supplementary measurements to correct for drift in the inertial navigation system. This paper presents a system, LunaNet, that uses a convolutional neural network to detect craters in imagery taken by an onboard camera. These detections are matched with known lunar craters, and the matches can be used as landmarks for localization. The motivation for generating such landmarks is to provide relative location measurements to a navigation filter; however, the details of such a filter are not explored in this work. Our results show that, on average, LunaNet detects approximately twice as many craters in an intensity image as two other intensity-image-based crater detectors. One challenge of cameras is that they can produce imagery with vastly different appearances depending on image quality and noise level. Differences in image quality and noise level can arise from changes in irradiance of the lunar surface, heating of the camera's electronic components, or the inherent fluctuation of discrete photons. These noise effects are difficult to compensate for, making it important for a crater detection system to be robust to them. Convolutional neural networks have been shown to be robust to these kinds of image variation. LunaNet is shown to be robust to four types of image manipulation that alter the quality and noise level of the input imagery.
format Article
id mit-1721.1/137175
institution Massachusetts Institute of Technology
language English
publishDate 2021
publisher American Institute of Aeronautics and Astronautics (AIAA)
record_format dspace
spelling mit-1721.1/137175 2021-11-04T03:23:28Z Deep Learning Crater Detection for Lunar Terrain Relative Navigation 2021-11-03T13:59:16Z 2021-11-03T13:59:16Z 2020-01 2021-04-30T14:13:31Z Article http://purl.org/eprint/type/ConferencePaper https://hdl.handle.net/1721.1/137175 2020. "Deep Learning Crater Detection for Lunar Terrain Relative Navigation."
AIAA Scitech 2020 Forum, 1 PartF. en 10.2514/6.2020-1838 AIAA Scitech 2020 Forum Creative Commons Attribution-Noncommercial-Share Alike http://creativecommons.org/licenses/by-nc-sa/4.0/ application/pdf American Institute of Aeronautics and Astronautics (AIAA) Other repository
title Deep Learning Crater Detection for Lunar Terrain Relative Navigation
url https://hdl.handle.net/1721.1/137175