Touching a NeRF: leveraging neural radiance fields for tactile sensory data generation
Tactile perception is key for robotics applications such as manipulation. However, tactile data collection is time-consuming, especially when compared to vision. This limits the use of the tactile modality in machine learning solutions in robotics. In this paper, we propose a generative model to simulate realistic tactile sensory data for use in downstream tasks. Starting with easily-obtained camera images, we train Neural Radiance Fields (NeRF) for objects of interest. We then use NeRF-rendered RGB-D images as inputs to a conditional Generative Adversarial Network model (cGAN) to generate tactile images from desired orientations. We evaluate the generated data quantitatively using the Structural Similarity Index and Mean Squared Error metrics, and also using a tactile classification task both in simulation and in the real world. Results show that by augmenting a manually collected dataset, the generated data is able to increase classification accuracy by around 10%. In addition, we demonstrate that our model is able to transfer from one tactile sensor to another with a small fine-tuning dataset.
Main Authors: | Zhong, S; Albini, A; Parker Jones, O; Maiolino, P; Posner, H
---|---
Format: | Conference item
Language: | English
Published: | Journal of Machine Learning Research, 2023
author | Zhong, S Albini, A Parker Jones, O Maiolino, P Posner, H |
collection | OXFORD |
description | Tactile perception is key for robotics applications such as manipulation. However, tactile data collection is time-consuming, especially when compared to vision. This limits the use of the tactile modality in machine learning solutions in robotics. In this paper, we propose a generative model to simulate realistic tactile sensory data for use in downstream tasks. Starting with easily-obtained camera images, we train Neural Radiance Fields (NeRF) for objects of interest. We then use NeRF-rendered RGB-D images as inputs to a conditional Generative Adversarial Network model (cGAN) to generate tactile images from desired orientations. We evaluate the generated data quantitatively using the Structural Similarity Index and Mean Squared Error metrics, and also using a tactile classification task both in simulation and in the real world. Results show that by augmenting a manually collected dataset, the generated data is able to increase classification accuracy by around 10%. In addition, we demonstrate that our model is able to transfer from one tactile sensor to another with a small fine-tuning dataset. |
format | Conference item |
id | oxford-uuid:637cfe2d-7ee3-4c96-bc62-1a71752f97b9 |
institution | University of Oxford |
language | English |
publishDate | 2023 |
publisher | Journal of Machine Learning Research |
record_format | dspace |
title | Touching a NeRF: leveraging neural radiance fields for tactile sensory data generation |
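The abstract above reports quantitative evaluation of the generated tactile images using the Structural Similarity Index (SSIM) and Mean Squared Error (MSE). The sketch below is a minimal illustration of such a comparison using scikit-image, not the authors' code; the function name, image shapes, and random placeholder data are hypothetical assumptions.

```python
# Minimal sketch: compare a generated tactile image with a reference one
# using SSIM and MSE, the two metrics named in the abstract.
# Images are assumed to be uint8 arrays of identical shape, either
# grayscale (H, W) or RGB (H, W, 3). Placeholder data only.
import numpy as np
from skimage.metrics import structural_similarity, mean_squared_error


def evaluate_tactile_pair(generated: np.ndarray, reference: np.ndarray):
    """Return (SSIM, MSE) for a generated/reference tactile image pair."""
    # Treat the last axis as colour channels if the images are 3-D.
    channel_axis = -1 if generated.ndim == 3 else None
    ssim_score = structural_similarity(
        generated, reference, data_range=255, channel_axis=channel_axis
    )
    mse_score = mean_squared_error(generated, reference)
    return ssim_score, mse_score


if __name__ == "__main__":
    # Hypothetical example with random data standing in for sensor output.
    rng = np.random.default_rng(0)
    fake = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
    real = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
    print(evaluate_tactile_pair(fake, real))
```

In the paper's setting, `generated` would be a cGAN output conditioned on a NeRF-rendered RGB-D view and `reference` a real tactile image captured at the same pose; the metric calls themselves are unchanged.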