Tracking objects with point clouds from vision and touch
We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based artic...
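As a rough illustration of the fusion idea described in the abstract (not the authors' implementation), the sketch below converts a GelSight height map into a small local point cloud in the world frame and concatenates it with the RGB-D cloud, so a conventional point-cloud pose tracker can consume both. All function names, frames, and calibration values here are hypothetical placeholders.

```python
import numpy as np

def gelsight_heightmap_to_points(height_map, pixel_size_m, T_world_sensor):
    """Convert a GelSight height map (meters) into world-frame 3D points.

    height_map:     (H, W) array of surface heights relative to the gel plane.
    pixel_size_m:   metric size of one GelSight pixel (hypothetical calibration).
    T_world_sensor: 4x4 homogeneous pose of the sensor plane in the world frame.
    """
    h, w = height_map.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Points in the sensor frame: x/y span the gel plane, z is the indentation depth.
    pts_sensor = np.stack(
        [u * pixel_size_m, v * pixel_size_m, height_map], axis=-1
    ).reshape(-1, 3)
    pts_h = np.hstack([pts_sensor, np.ones((pts_sensor.shape[0], 1))])
    return (T_world_sensor @ pts_h.T).T[:, :3]

def fuse_clouds(rgbd_points_world, gelsight_points_world):
    """Stack camera and tactile points into one cloud for a point-cloud tracker."""
    return np.vstack([rgbd_points_world, gelsight_points_world])

# Usage sketch: in practice both inputs would come from calibrated sensors.
height_map = np.zeros((240, 320))        # placeholder GelSight reading
T_world_sensor = np.eye(4)               # placeholder hand/eye calibration
rgbd_points = np.random.rand(1000, 3)    # placeholder RGB-D point cloud
fused = fuse_clouds(
    rgbd_points,
    gelsight_heightmap_to_points(height_map, 5e-5, T_world_sensor),
)
# `fused` could then be fed to any ICP- or SDF-based articulated object tracker.
```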
Main Authors: | Izatt, Gregory R.; Mirano, Geronimo J.; Adelson, Edward H.; Tedrake, Russell L. |
---|---|
Other Authors: | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
Format: | Article |
Published: | Institute of Electrical and Electronics Engineers (IEEE), 2017 |
Online Access: | http://hdl.handle.net/1721.1/111974 https://orcid.org/0000-0001-8916-1932 https://orcid.org/0000-0003-2222-6775 https://orcid.org/0000-0002-8712-7092 |
Similar Items
- Robust object pose estimation with point clouds from vision and touch
  by: Izatt, Gregory (Gregory Russell)
  Published: (2017)
- Globally Optimal Object Pose Estimation in Point Clouds with Mixed-Integer Programming
  by: Izatt, Gregory R., et al.
  Published: (2021)
- Localization and tracking of parameterized objects in point clouds
  by: Truax, Robert D. (Robert Denison)
  Published: (2011)
- Vision assisted object detection in LIDAR point cloud
  by: Yuen, Wei Chee
  Published: (2022)
- Estimating object hardness with a GelSight touch sensor
  by: Yuan, Wenzhen, et al.
  Published: (2017)