Lunar Terrain Relative Navigation Using a Convolutional Neural Network for Visual Crater Detection
© 2020 AACC. Terrain relative navigation can improve the precision of a spacecraft's position estimate by detecting global features that act as supplementary measurements to correct for drift in the inertial navigation system. This paper presents a system that uses a convolutional neural network (CNN) and image processing methods to track the location of a simulated spacecraft with an extended Kalman filter (EKF). The CNN, called LunaNet, visually detects craters in the simulated camera frame and those detections are matched to known lunar craters in the region of the current estimated spacecraft position. These matched craters are treated as features that are tracked using the EKF. LunaNet enables more reliable position tracking over a simulated trajectory due to its greater robustness to changes in image brightness and more repeatable crater detections from frame to frame throughout a trajectory. LunaNet combined with an EKF produces a decrease of 60% in the average final position estimation error and a decrease of 25% in average final velocity estimation error compared to an EKF using an image processing-based crater detection method when tested on trajectories using images of standard brightness.
Main Authors: | Downes, Lena (Lena Marie); Steiner, Ted J; How, Jonathan P |
---|---|
Other Authors: | Massachusetts Institute of Technology. Department of Aeronautics and Astronautics |
Format: | Article |
Language: | English |
Published: | IEEE, 2021 |
Online Access: | https://hdl.handle.net/1721.1/137154.2 |
author | Downes, Lena(Lena Marie) Steiner, Ted J How, Jonathan P |
author2 | Massachusetts Institute of Technology. Department of Aeronautics and Astronautics |
collection | MIT |
description | © 2020 AACC. Terrain relative navigation can improve the precision of a spacecraft's position estimate by detecting global features that act as supplementary measurements to correct for drift in the inertial navigation system. This paper presents a system that uses a convolutional neural network (CNN) and image processing methods to track the location of a simulated spacecraft with an extended Kalman filter (EKF). The CNN, called LunaNet, visually detects craters in the simulated camera frame and those detections are matched to known lunar craters in the region of the current estimated spacecraft position. These matched craters are treated as features that are tracked using the EKF. LunaNet enables more reliable position tracking over a simulated trajectory due to its greater robustness to changes in image brightness and more repeatable crater detections from frame to frame throughout a trajectory. LunaNet combined with an EKF produces a decrease of 60% in the average final position estimation error and a decrease of 25% in average final velocity estimation error compared to an EKF using an image processing-based crater detection method when tested on trajectories using images of standard brightness. |
format | Article |
id | mit-1721.1/137154.2 |
institution | Massachusetts Institute of Technology |
language | English |
publishDate | 2021 |
publisher | IEEE |
record_format | dspace |
funding | United States. Defense Advanced Research Projects Agency
type | Article (http://purl.org/eprint/type/JournalArticle)
issn | 0743-1619
citation | Downes, Lena M., Steiner, Ted J. and How, Jonathan P. 2020. "Lunar Terrain Relative Navigation Using a Convolutional Neural Network for Visual Crater Detection." Proceedings of the American Control Conference, 2020-July.
doi | 10.23919/acc45564.2020.9147595
journal | Proceedings of the American Control Conference
rights | Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)
title | Lunar Terrain Relative Navigation Using a Convolutional Neural Network for Visual Crater Detection |
url | https://hdl.handle.net/1721.1/137154.2 |
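The abstract describes matched crater detections being used as EKF features that correct inertial drift. The following is a minimal sketch of a single EKF measurement update of that general kind, not the paper's LunaNet pipeline: the planar state layout, the position-only measurement model, and all noise values are illustrative assumptions.

```python
import numpy as np

def ekf_update(x, P, z, z_pred, H, R):
    """One EKF measurement update step.

    x: state estimate, P: state covariance, z: measurement,
    z_pred: predicted measurement h(x), H: measurement Jacobian,
    R: measurement noise covariance.
    """
    y = z - z_pred                      # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y                   # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Hypothetical drifted estimate with planar state [x, y, vx, vy].
x = np.array([10.5, 20.2, 1.0, 0.0])
P = np.eye(4)
# A crater matched against the known-crater database implies a position
# fix; H selects the position components of the state.
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])
R = np.eye(2) * 0.01                    # crater-match measurement noise
z = np.array([10.0, 20.0])              # position implied by the match
x_new, P_new = ekf_update(x, P, z, H @ x, H, R)
```

With a low measurement noise relative to the prior covariance, the update pulls the drifted position estimate almost all the way onto the crater-derived fix and sharply reduces the position covariance, which is the drift-correction role the abstract attributes to the matched craters.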