PRGFlow: Unified SWAP‐aware deep global optical flow for aerial robot navigation
Abstract Global optical flow estimation is the foundation for obtaining the odometry that enables aerial robot navigation. However, such a method has to be of low latency and high robustness whilst also respecting the size, weight, area and power (SWAP) constraints of the robot. A camera coupled with an inertial measurement unit (IMU) has proven to be the best sensor combination for obtaining such low-latency odometry on resource-constrained aerial robots. Recently, deep learning approaches to visual-inertial fusion have gained momentum due to their high accuracy and robustness. Equally noteworthy benefits of these techniques for robotics are their inherent scalability (adaptation to different-sized aerial robots) and unification (the same method works on different-sized aerial robots). To this end, we present a deep learning approach called PRGFlow for obtaining global optical flow, which is then loosely fused with an IMU for full 6-DoF (degrees of freedom) relative pose estimation and integrated to obtain odometry. The network is evaluated on the MSCOCO dataset and the dead-reckoned odometry on multiple real-flight trajectories without any fine-tuning or re-training. A detailed benchmark comparing different network architectures and loss functions to enable scalability is also presented. The method is shown to outperform classical feature matching methods by 2× under noisy data. The supplementary material and code can be found at http://prg.cs.umd.edu/PRGFlow.
Main Authors: | Nitin J. Sanket, Chahat Deep Singh, Cornelia Fermüller, Yiannis Aloimonos |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2021-08-01 |
Series: | Electronics Letters |
Subjects: | Optical, image and video signal processing; Image recognition; Optimisation techniques; Spatial variables control; Transducers and sensing devices; Aerospace control |
Online Access: | https://doi.org/10.1049/ell2.12274 |
_version_ | 1811261109055782912 |
---|---|
author | Nitin J. Sanket, Chahat Deep Singh, Cornelia Fermüller, Yiannis Aloimonos |
author_facet | Nitin J. Sanket, Chahat Deep Singh, Cornelia Fermüller, Yiannis Aloimonos |
author_sort | Nitin J. Sanket |
collection | DOAJ |
description | Abstract Global optical flow estimation is the foundation for obtaining the odometry that enables aerial robot navigation. However, such a method has to be of low latency and high robustness whilst also respecting the size, weight, area and power (SWAP) constraints of the robot. A camera coupled with an inertial measurement unit (IMU) has proven to be the best sensor combination for obtaining such low-latency odometry on resource-constrained aerial robots. Recently, deep learning approaches to visual-inertial fusion have gained momentum due to their high accuracy and robustness. Equally noteworthy benefits of these techniques for robotics are their inherent scalability (adaptation to different-sized aerial robots) and unification (the same method works on different-sized aerial robots). To this end, we present a deep learning approach called PRGFlow for obtaining global optical flow, which is then loosely fused with an IMU for full 6-DoF (degrees of freedom) relative pose estimation and integrated to obtain odometry. The network is evaluated on the MSCOCO dataset and the dead-reckoned odometry on multiple real-flight trajectories without any fine-tuning or re-training. A detailed benchmark comparing different network architectures and loss functions to enable scalability is also presented. The method is shown to outperform classical feature matching methods by 2× under noisy data. The supplementary material and code can be found at http://prg.cs.umd.edu/PRGFlow. |
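The abstract describes loosely fusing network-predicted global flow with IMU data to get per-frame relative poses, which are then integrated (dead-reckoned) into odometry. A minimal sketch of that integration step, reduced to planar motion for brevity (the function names `compose` and `dead_reckon` are illustrative, not from the paper; in PRGFlow the rotation would come from the IMU and the translation from the predicted global flow):

```python
import math

def compose(pose, rel):
    """Chain a relative planar pose (dx, dy, dtheta) onto an absolute one.

    Hypothetical simplification of the 6-DoF pipeline to SE(2): the
    body-frame translation is rotated into the world frame and accumulated.
    """
    x, y, th = pose
    dx, dy, dth = rel
    xw = x + dx * math.cos(th) - dy * math.sin(th)
    yw = y + dx * math.sin(th) + dy * math.cos(th)
    return (xw, yw, th + dth)

def dead_reckon(rel_poses, start=(0.0, 0.0, 0.0)):
    """Integrate a sequence of relative poses into a trajectory."""
    traj = [start]
    for rel in rel_poses:
        traj.append(compose(traj[-1], rel))
    return traj

# Four unit steps, each followed by a 90-degree turn, trace a closed square.
square = dead_reckon([(1.0, 0.0, math.pi / 2)] * 4)
```

Because each relative pose is simply chained onto the last, any per-frame estimation error accumulates over the trajectory, which is why the paper evaluates dead-reckoned odometry drift on real flight data.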
first_indexed | 2024-04-12T18:57:32Z |
format | Article |
id | doaj.art-51bc84a3de91420f92c3f1ac8725a706 |
institution | Directory Open Access Journal |
issn | 0013-5194 1350-911X |
language | English |
last_indexed | 2024-04-12T18:57:32Z |
publishDate | 2021-08-01 |
publisher | Wiley |
record_format | Article |
series | Electronics Letters |
spelling | doaj.art-51bc84a3de91420f92c3f1ac8725a706 2022-12-22T03:20:16Z eng Wiley Electronics Letters 0013-5194 1350-911X 2021-08-01 Vol. 57, Iss. 16, pp. 614–617 10.1049/ell2.12274 PRGFlow: Unified SWAP‐aware deep global optical flow for aerial robot navigation Nitin J. Sanket, Chahat Deep Singh, Cornelia Fermüller, Yiannis Aloimonos (Perception and Robotics Group, University of Maryland, College Park) https://doi.org/10.1049/ell2.12274 Optical, image and video signal processing; Image recognition; Optimisation techniques; Spatial variables control; Transducers and sensing devices; Aerospace control |
spellingShingle | Nitin J. Sanket Chahat Deep Singh Cornelia Fermüller Yiannis Aloimonos PRGFlow: Unified SWAP‐aware deep global optical flow for aerial robot navigation Electronics Letters Optical, image and video signal processing Image recognition Optimisation techniques Spatial variables control Transducers and sensing devices Aerospace control |
title | PRGFlow: Unified SWAP‐aware deep global optical flow for aerial robot navigation |
title_full | PRGFlow: Unified SWAP‐aware deep global optical flow for aerial robot navigation |
title_fullStr | PRGFlow: Unified SWAP‐aware deep global optical flow for aerial robot navigation |
title_full_unstemmed | PRGFlow: Unified SWAP‐aware deep global optical flow for aerial robot navigation |
title_short | PRGFlow: Unified SWAP‐aware deep global optical flow for aerial robot navigation |
title_sort | prgflow unified swap aware deep global optical flow for aerial robot navigation |
topic | Optical, image and video signal processing Image recognition Optimisation techniques Spatial variables control Transducers and sensing devices Aerospace control |
url | https://doi.org/10.1049/ell2.12274 |
work_keys_str_mv | AT nitinjsanket prgflowunifiedswapawaredeepglobalopticalflowforaerialrobotnavigation AT chahatdeepsingh prgflowunifiedswapawaredeepglobalopticalflowforaerialrobotnavigation AT corneliafermuller prgflowunifiedswapawaredeepglobalopticalflowforaerialrobotnavigation AT yiannisaloimonos prgflowunifiedswapawaredeepglobalopticalflowforaerialrobotnavigation |