Systems-driven improvements to radar-only ego-motion estimation

Ego-motion estimation in robotics is typically performed with cameras or laser sensors, using established techniques that have proven successful in a variety of applications. While these advances have expanded the capabilities of autonomous systems, they are limited by sensor hardware that is adversely affected by poor weather and lighting conditions. Sensing with radar offers a robust means of perceiving the environment in even the harshest conditions, where these optical sensors fail.

This thesis presents strategic improvements to a radar-only ego-motion estimation pipeline to extend its utility as a core component of an autonomous system. By leveraging information about the context in which the sensor is used, we offer a means to reduce processing time by filtering out spurious returns, to predict failure using an introspective component, and to estimate ego-motion with much greater accuracy by considering motion constraints.

The first contribution uses a weakly supervised approach to train a neural network that filters out radar returns unlikely to correspond to static objects in the scene. By passing on only those returns that are useful for ego-motion estimation, processing times are reduced by a factor of 2.36. Secondly, we propose a system that interprets the information encoded within the data association algorithm so that the overall confidence in a match between two radar landmark sets can be determined. A classifier is trained to distinguish between strong and weak correspondences and to flag poor performance before action is taken, reducing failure frequency by 24.7%. Finally, we present a method that leverages the physical constraints of the robot platform to refine landmark correspondences, discarding those that suggest extreme ego-motion under the assumption that the vehicle traverses paths of constant curvature. This includes a way to estimate relative pose from a single landmark observed twice (a sketch of this idea follows the abstract), and reduces translational error in odometry by 45.94%.

In each instance, validation is performed on real data captured from a vehicle-mounted scanning radar to measure the performance improvement of the proposed subsystem against a baseline Radar Odometry (RO) system. Through these data-driven advances, we seek to further the use of radar as a navigation sensor, enabling our robots to roam with greater efficiency, safety, and accuracy than before.
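
As a point of orientation for the final contribution, the following is a minimal worked sketch of why a constant-curvature motion model makes the relative pose between two radar scans recoverable from a single landmark observed twice. The notation here (curvature \kappa, arc length s, range-bearing pairs (r_i, \phi_i)) is an illustrative assumption for this summary, not the formulation used in the thesis itself.

% Illustrative sketch (assumed notation): under a planar constant-curvature model,
% the relative pose between two scans has only two degrees of freedom,
% the curvature \kappa and the arc length s travelled between them:
\begin{align*}
  \Delta\theta = \kappa s, \qquad
  \Delta x = \frac{\sin(\kappa s)}{\kappa}, \qquad
  \Delta y = \frac{1 - \cos(\kappa s)}{\kappa}.
\end{align*}
% A static landmark seen at range-bearing (r_1, \phi_1) in the first scan and
% (r_2, \phi_2) in the second must correspond to the same world point, giving
% two scalar constraints in the two unknowns (\kappa, s):
\begin{equation*}
  \begin{pmatrix} r_1 \cos\phi_1 \\ r_1 \sin\phi_1 \end{pmatrix}
  =
  \begin{pmatrix} \Delta x \\ \Delta y \end{pmatrix}
  +
  \begin{pmatrix} \cos\Delta\theta & -\sin\Delta\theta \\ \sin\Delta\theta & \cos\Delta\theta \end{pmatrix}
  \begin{pmatrix} r_2 \cos\phi_2 \\ r_2 \sin\phi_2 \end{pmatrix}.
\end{equation*}
% Each candidate correspondence thus yields a pose hypothesis (\kappa, s);
% hypotheses implying implausibly sharp curvature or long arcs can be discarded
% before the full landmark-matching stage.

Read this way, the motion constraint serves two roles: it prunes correspondences whose implied ego-motion is physically unreasonable, and it reduces the number of correspondences needed for a pose estimate to one.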

Bibliographic Details
Main Author: Aldera, R
Other Authors: Newman, P
Format: Thesis
Language: English
Published: 2021
Subjects: Ego-motion, Radar, Odometry, Robotics, Localisation, Mobile robotics
Institution: University of Oxford