Tracking objects with point clouds from vision and touch
We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based articulated object tracker based on signed-distance functions. Our implementation runs at 12 Hz using an online depth reconstruction algorithm for GelSight and a modified second-order update for the tracking algorithm. We present data from hardware experiments demonstrating that the addition of contact-based geometric information significantly improves the pose accuracy during contact, and provides robustness to occlusions of small objects by the robot's end effector.
| Main Authors: | Izatt, Gregory R.; Mirano, Geronimo J.; Adelson, Edward H.; Tedrake, Russell L. |
|---|---|
| Other Authors: | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
| Format: | Article |
| Published: | Institute of Electrical and Electronics Engineers (IEEE), 2017 |
| Online Access: | http://hdl.handle.net/1721.1/111974 https://orcid.org/0000-0001-8916-1932 https://orcid.org/0000-0003-2222-6775 https://orcid.org/0000-0002-8712-7092 |
author | Izatt, Gregory R. Mirano, Geronimo J. Adelson, Edward H Tedrake, Russell L |
author2 | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
collection | MIT |
description | We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based articulated object tracker based on signed-distance functions. Our implementation runs at 12 Hz using an online depth reconstruction algorithm for GelSight and a modified second-order update for the tracking algorithm. We present data from hardware experiments demonstrating that the addition of contact-based geometric information significantly improves the pose accuracy during contact, and provides robustness to occlusions of small objects by the robot's end effector. |
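The tracker described above treats both camera and tactile points as samples on the object surface and drives each point's signed distance to zero. The following is a minimal sketch of that core idea only, not the authors' implementation: it uses a sphere as a stand-in object model, estimates translation only (the paper tracks full articulated pose with a second-order update), and all function names are hypothetical.

```python
import numpy as np

def sphere_sdf(points, radius=0.05):
    """Signed distance to a sphere at the origin (stand-in object model)."""
    return np.linalg.norm(points, axis=1) - radius

def sdf_gradient(points, sdf, eps=1e-5):
    """Central-difference gradient of the SDF at each point."""
    grad = np.zeros_like(points)
    for k in range(3):
        d = np.zeros(3); d[k] = eps
        grad[:, k] = (sdf(points + d) - sdf(points - d)) / (2 * eps)
    return grad

def track_translation(points, sdf, iters=100, step=0.5):
    """Estimate the object translation explaining a fused point cloud by
    gradient descent on the sum of squared signed distances."""
    t = np.zeros(3)
    for _ in range(iters):
        local = points - t             # express points in the object frame
        phi = sdf(local)               # residual: signed distance per point
        g = sdf_gradient(local, sdf)   # outward surface normals (SDF gradient)
        # d(phi)/dt = -g, so descending 0.5*phi^2 gives t += step * mean(phi * g)
        t += step * (phi[:, None] * g).mean(axis=0)
    return t
```

Because GelSight contact points enter the residual exactly like camera points, occluded surfaces touched by the end effector still constrain the pose, which is the fusion benefit the abstract reports.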
format | Article |
id | mit-1721.1/111974 |
institution | Massachusetts Institute of Technology |
publishDate | 2017 |
publisher | Institute of Electrical and Electronics Engineers (IEEE) |
citation | Izatt, Gregory et al. “Tracking Objects with Point Clouds from Vision and Touch.” 2017 IEEE International Conference on Robotics and Automation (ICRA), May 29 to June 3, 2017, Singapore. Institute of Electrical and Electronics Engineers (IEEE), July 2017. © 2017 Institute of Electrical and Electronics Engineers (IEEE) |
type | Conference paper (http://purl.org/eprint/type/ConferencePaper) |
isbn | 978-1-5090-4633-1 |
doi | http://dx.doi.org/10.1109/ICRA.2017.7989460 |
departments | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
rights | Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/) |
source | MIT Web Domain |
title | Tracking objects with point clouds from vision and touch |
url | http://hdl.handle.net/1721.1/111974 https://orcid.org/0000-0001-8916-1932 https://orcid.org/0000-0003-2222-6775 https://orcid.org/0000-0002-8712-7092 |