Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion
Event cameras are bio‐inspired sensors that offer advantages over traditional cameras. They operate asynchronously, sampling the scene at microsecond resolution and producing a stream of brightness changes. This unconventional output has sparked novel computer vision methods to unlock the camera's potential.
Main Authors: | Suman Ghosh, Guillermo Gallego |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2022-12-01 |
Series: | Advanced Intelligent Systems |
Subjects: | event cameras; neuromorphic processing; robotics; spatial AI; stereo depth estimation |
Online Access: | https://doi.org/10.1002/aisy.202200221 |
_version_ | 1797978846031511552 |
---|---|
author | Suman Ghosh Guillermo Gallego |
author_facet | Suman Ghosh Guillermo Gallego |
author_sort | Suman Ghosh |
collection | DOAJ |
description | Event cameras are bio‐inspired sensors that offer advantages over traditional cameras. They operate asynchronously, sampling the scene at microsecond resolution and producing a stream of brightness changes. This unconventional output has sparked novel computer vision methods to unlock the camera's potential. Here, the problem of event‐based stereo 3D reconstruction for SLAM is considered. Most event‐based stereo methods attempt to exploit the high temporal resolution of the camera and the simultaneity of events across cameras to establish matches and estimate depth. By contrast, this work investigates how to estimate depth without explicit data association by fusing disparity space images (DSIs) originated in efficient monocular methods. Fusion theory is developed and applied to design multi‐camera 3D reconstruction algorithms that produce state‐of‐the‐art results, as confirmed by comparisons with four baseline methods and tests on a variety of available datasets. |
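The fusion idea in the description can be illustrated with a minimal sketch: each camera yields a disparity space image (a per-pixel ray-count volume over candidate depth planes), the volumes are fused without event-to-event matching, and depth is read off as the best-scoring plane per pixel, with low-confidence pixels rejected as outliers. The array shapes, the harmonic-mean fusion rule, and the confidence threshold below are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def fuse_dsis(dsi_a, dsi_b, eps=1e-6):
    """Fuse two ray-count volumes (H x W x D) with a harmonic mean.
    Assumed fusion rule: rewards voxels supported by BOTH cameras,
    suppressing voxels only one camera voted for."""
    return 2.0 * dsi_a * dsi_b / (dsi_a + dsi_b + eps)

def depth_from_dsi(dsi, depths, conf_thresh=5.0):
    """Per-pixel depth = depth plane with the highest fused score;
    pixels whose best score falls below conf_thresh are rejected
    as outliers (marked NaN). conf_thresh is a hypothetical knob."""
    best_score = dsi.max(axis=2)          # (H, W) peak evidence
    best_plane = dsi.argmax(axis=2)       # (H, W) winning plane index
    depth = depths[best_plane]            # look up metric depth
    depth[best_score < conf_thresh] = np.nan
    return depth

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dsi_left = rng.random((4, 4, 8)) * 10   # toy per-camera DSIs
    dsi_right = rng.random((4, 4, 8)) * 10
    depths = np.linspace(0.5, 5.0, 8)       # candidate depth planes (m)
    fused = fuse_dsis(dsi_left, dsi_right)
    print(depth_from_dsi(fused, depths).shape)  # (4, 4)
```

The harmonic mean is one plausible conjunctive fusion choice: it stays high only where both volumes agree, which is what makes explicit cross-camera data association unnecessary.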
first_indexed | 2024-04-11T05:29:24Z |
format | Article |
id | doaj.art-871b501aa3bd432593b95dcdf0e639e0 |
institution | Directory Open Access Journal |
issn | 2640-4567 |
language | English |
last_indexed | 2024-04-11T05:29:24Z |
publishDate | 2022-12-01 |
publisher | Wiley |
record_format | Article |
series | Advanced Intelligent Systems |
spelling | doaj.art-871b501aa3bd432593b95dcdf0e639e02022-12-23T04:16:31ZengWileyAdvanced Intelligent Systems2640-45672022-12-01412n/an/a10.1002/aisy.202200221Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events FusionSuman Ghosh0Guillermo Gallego1Department of Electrical Engineering and Computer Science Technische Universität Berlin 10623 Berlin GermanyDepartment of Electrical Engineering and Computer Science Technische Universität Berlin 10623 Berlin GermanyEvent cameras are bio‐inspired sensors that offer advantages over traditional cameras. They operate asynchronously, sampling the scene at microsecond resolution and producing a stream of brightness changes. This unconventional output has sparked novel computer vision methods to unlock the camera's potential. Here, the problem of event‐based stereo 3D reconstruction for SLAM is considered. Most event‐based stereo methods attempt to exploit the high temporal resolution of the camera and the simultaneity of events across cameras to establish matches and estimate depth. By contrast, this work investigates how to estimate depth without explicit data association by fusing disparity space images (DSIs) originated in efficient monocular methods. Fusion theory is developed and applied to design multi‐camera 3D reconstruction algorithms that produce state‐of‐the‐art results, as confirmed by comparisons with four baseline methods and tests on a variety of available datasets.https://doi.org/10.1002/aisy.202200221event camerasneuromorphic processingroboticsspatial AIstereo depth estimation |
spellingShingle | Suman Ghosh Guillermo Gallego Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion Advanced Intelligent Systems event cameras neuromorphic processing robotics spatial AI stereo depth estimation |
title | Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion |
title_full | Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion |
title_fullStr | Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion |
title_full_unstemmed | Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion |
title_short | Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion |
title_sort | multi event camera depth estimation and outlier rejection by refocused events fusion |
topic | event cameras neuromorphic processing robotics spatial AI stereo depth estimation |
url | https://doi.org/10.1002/aisy.202200221 |
work_keys_str_mv | AT sumanghosh multieventcameradepthestimationandoutlierrejectionbyrefocusedeventsfusion AT guillermogallego multieventcameradepthestimationandoutlierrejectionbyrefocusedeventsfusion |