FSMI: Fast computation of Shannon mutual information for information-theoretic mapping
© The Author(s) 2020. Exploration tasks are embedded in many robotics applications, such as search and rescue and space exploration. Information-based exploration algorithms aim to find the most informative trajectories by maximizing an information-theoretic metric, such as the mutual information between the map and potential future measurements.
Main Authors: | Zhang, Zhengdong; Henderson, Theia; Karaman, Sertac; Sze, Vivienne |
---|---|
Other Authors: | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
Format: | Article |
Language: | English |
Published: | SAGE Publications, 2021 |
Online Access: | https://hdl.handle.net/1721.1/133010 |
author | Zhang, Zhengdong Henderson, Theia Karaman, Sertac Sze, Vivienne |
author2 | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
collection | MIT |
description | © The Author(s) 2020. Exploration tasks are embedded in many robotics applications, such as search and rescue and space exploration. Information-based exploration algorithms aim to find the most informative trajectories by maximizing an information-theoretic metric, such as the mutual information between the map and potential future measurements. Unfortunately, most existing information-based exploration algorithms are plagued by the computational difficulty of evaluating the Shannon mutual information metric. In this article, we consider the fundamental problem of evaluating Shannon mutual information between the map and a range measurement. First, we consider 2D environments. We propose a novel algorithm, called the fast Shannon mutual information (FSMI). The key insight behind the algorithm is that a certain integral can be computed analytically, leading to substantial computational savings. Second, we consider 3D environments, represented by efficient data structures, e.g., an OctoMap, such that the measurements are compressed by run-length encoding (RLE). We propose a novel algorithm, called FSMI-RLE, that efficiently evaluates the Shannon mutual information when the measurements are compressed using RLE. For both the FSMI and the FSMI-RLE, we also propose variants that make different assumptions on the sensor noise distribution for the purpose of further computational savings. We evaluate the proposed algorithms in extensive experiments. In particular, we show that the proposed algorithms outperform existing algorithms that compute Shannon mutual information as well as other algorithms that compute the Cauchy–Schwarz quadratic mutual information (CSQMI). In addition, we demonstrate the computation of Shannon mutual information on a 3D map for the first time. |
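The metric described above can be made concrete with a toy example. The sketch below is not the paper's FSMI algorithm; it is a minimal illustration (with hypothetical function names) of the Shannon mutual information between a single Bernoulli occupancy cell and one noisy binary detection, the quantity that information-based exploration seeks to maximize along a candidate trajectory.

```python
import math

def entropy(p):
    """Binary entropy in bits: H(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def cell_measurement_mi(p_occ, eps):
    """Shannon mutual information I(O; Z) between one occupancy cell O
    (P(occupied) = p_occ) and a binary detection Z, where the sensor
    reports a hit with probability 1 - eps if the cell is occupied and
    with probability eps if it is free: I(O; Z) = H(Z) - H(Z | O)."""
    p_hit = p_occ * (1 - eps) + (1 - p_occ) * eps          # marginal P(Z = hit)
    h_z = entropy(p_hit)                                    # H(Z)
    h_z_given_o = p_occ * entropy(1 - eps) + (1 - p_occ) * entropy(eps)
    return h_z - h_z_given_o

# An unknown cell (p = 0.5) observed by a fairly reliable sensor is most
# informative; a near-certain cell offers little left to learn.
print(cell_measurement_mi(0.5, 0.1))
print(cell_measurement_mi(0.95, 0.1))
```

The contribution of FSMI is evaluating this kind of quantity efficiently over an entire range measurement, by computing a key integral analytically rather than numerically.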
format | Article |
id | mit-1721.1/133010 |
institution | Massachusetts Institute of Technology |
language | English |
publishDate | 2021 |
publisher | SAGE Publications |
record_format | dspace |
citation | Zhang, Zhengdong, Henderson, Theia, Karaman, Sertac and Sze, Vivienne. 2020. "FSMI: Fast computation of Shannon mutual information for information-theoretic mapping." International Journal of Robotics Research, 39 (9). |
departments | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Department of Aeronautics and Astronautics; Microsystems Technology Laboratories |
doi | 10.1177/0278364920921941 |
type | http://purl.org/eprint/type/JournalArticle |
rights | Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/) |
provenance | arXiv |
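The abstract notes that in 3D maps the cells crossed by a measurement beam are compressed with run-length encoding before FSMI-RLE evaluates mutual information over the runs. The sketch below shows only the compression step under assumed inputs (a list of per-cell occupancy probabilities along a beam, with a hypothetical function name); the paper's algorithm for evaluating mutual information over the runs is more involved.

```python
def run_length_encode(cells):
    """Compress a sequence of per-cell occupancy values along a beam into
    (value, run_length) pairs. A ray through a coarse voxel map crosses
    many consecutive cells with identical values, so runs are short."""
    runs = []
    for v in cells:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([v, 1])       # start a new run
    return [(v, n) for v, n in runs]

beam = [0.5, 0.5, 0.5, 0.1, 0.1, 0.9]
print(run_length_encode(beam))  # → [(0.5, 3), (0.1, 2), (0.9, 1)]
```

Working over runs rather than individual cells is what lets FSMI-RLE scale the per-beam cost with the number of runs instead of the number of cells.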
title | FSMI: Fast computation of Shannon mutual information for information-theoretic mapping |
url | https://hdl.handle.net/1721.1/133010 |