Visual articulated tracking in the presence of occlusions


Bibliographic Details
Main Authors: Rauch, C.; Hospedales, T.; Shotton, J.; Fallon, M.
Format: Conference item
Published: IEEE, 2018
Description: This paper focuses on visual tracking of a robotic manipulator during manipulation. In this situation, tracking is prone to failure when visual distractions are created by the object being manipulated and by clutter in the environment. Current state-of-the-art approaches, which typically rely on model fitting using Iterative Closest Point (ICP), fail in the presence of distracting data points and are unable to recover. Meanwhile, discriminative methods, which are trained only to distinguish parts of the tracked object, can also fail in these scenarios, as data points from the occlusions are incorrectly classified as belonging to the manipulator. We instead propose to use the per-pixel data-to-model associations provided by a random forest to avoid local minima during model fitting. By training the random forest with artificial occlusions we achieve increased robustness to occlusion and clutter in the scene. We do this without specific knowledge about the type or location of the manipulated object. Our approach is demonstrated by using dense depth data from an RGB-D camera to track a robotic manipulator during manipulation and in the presence of occlusions.
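The core idea in the abstract can be illustrated in miniature. The sketch below is not the authors' implementation: it assumes a hypothetical classifier that assigns each depth point a probability of belonging to the manipulator (in the paper this role is played by a random forest), and uses those per-point associations to gate the correspondences fed into a single translation-only, ICP-style update, so an occluding point cannot drag the fit into a local minimum.

```python
# Sketch only, assuming a hypothetical per-point classifier output
# (`manipulator_prob`); the paper uses a random forest over depth pixels.
import numpy as np

def filter_associations(points, manipulator_prob, threshold=0.5):
    """Keep only the data points the classifier attributes to the manipulator."""
    return points[manipulator_prob >= threshold]

def icp_translation_step(data, model):
    """One ICP-like step: associate each data point with its nearest
    model point, then estimate the translation aligning the pairs."""
    # Brute-force nearest-neighbour association (fine for a sketch).
    d2 = ((data[:, None, :] - model[None, :, :]) ** 2).sum(axis=2)
    nearest = model[d2.argmin(axis=1)]
    return (nearest - data).mean(axis=0)

# Model: two points on the "manipulator". Data: the same points shifted
# by (1, 0), plus one occluding point that would bias an unfiltered fit.
model = np.array([[0.0, 0.0], [10.0, 0.0]])
data = np.array([[1.0, 0.0], [11.0, 0.0], [50.0, 50.0]])
prob = np.array([0.9, 0.8, 0.1])  # classifier: last point is an occluder

inliers = filter_associations(data, prob)
t = icp_translation_step(inliers, model)
print(t)  # translation estimated from manipulator points only: (-1, 0)
```

Without the classification step, the occluding point at (50, 50) would contribute a large spurious correspondence and corrupt the estimated translation; with it, the update is computed from manipulator points alone.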
ID: oxford-uuid:aa3d1a53-e834-4808-9833-0bba0789022a
Institution: University of Oxford