Vision-only localisation under extreme appearance change

<p>Robust localisation is a key requirement for autonomous vehicles. However, for this technology to achieve widespread adoption, localisation must also be performed using low-cost hardware. Cameras are appealing due to their information-rich image content and low cost; however, camera-based localisation is difficult because of the problem of appearance change. In outdoor environments, for example, the appearance of the world can change dramatically and unpredictably with variations in lighting, weather, season and scene structure. We require autonomous vehicles to be robust under these challenging environmental conditions.</p>

<p>This thesis presents Dub4, a vision-only localisation system for autonomous vehicles. The system is founded on the concept of experiences, where an "experience" is a visual memory which models the world under particular conditions. By allowing the system to build up and curate a map of these experiences, we are able to handle cyclic appearance change (lighting, weather and season) as well as adapt to slow structural change. We present a probabilistic framework for predicting which experiences are most likely to match successfully with the live image at run-time, conditioned on the robot's prior use of the map. In addition, we describe an unsupervised algorithm for detecting and modelling higher-level visual features in the environment for localisation. These features are trained on a per-experience basis and are robust to extreme changes in appearance, for example between rain and sun, or day and night.</p>

<p>The system is tested on over 1500 km of data from urban and off-road environments, through sun, rain and snow, under harsh lighting, at different times of day and night, and through all seasons. In addition to this extensive offline testing, Dub4 has served as the primary localisation source on a number of autonomous vehicles, including the University of Oxford's RobotCar, the 2016 Shell Eco-Marathon, the LUTZ PathFinder Project in Milton Keynes, and the GATEway Project in Greenwich, London.</p>

Bibliographic Details
Main Author: Linegar, C
Other Authors: Newman, P
Format: Thesis
Language:English
Published: 2016
Subjects: robotics; vision; localisation
Institution: University of Oxford