Visual Transfer Learning for Robotic Manipulation

Humans are remarkable at manipulating unfamiliar objects. Over the past decades of robotics research, tremendous effort has been dedicated to endowing robot manipulation systems with such capabilities. Because classic solutions typically require prior knowledge of the objects (e.g., 3D CAD models) that is not available in unstructured environments, data-driven solutions that learn from robot-environment interactions (e.g., trial and error) have emerged as a promising approach for autonomously acquiring complex manipulation skills. For data-driven methods, the ability to do more with less data is critically important, since collecting data through physical interaction between the robot and its environment can be both time consuming and expensive. In this thesis, we develop transfer learning algorithms for robotic manipulation that reduce the amount of robot-environment interaction needed to adapt to different environments. On real robot hardware, we show that our algorithms enable robots to learn to pick and grasp arbitrary objects with 10 minutes of trial and error, and help robots learn to push unfamiliar objects with only 5 interactions.
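
The abstract does not spell out the method, but the general idea of visual transfer learning for manipulation can be illustrated with a minimal sketch: reuse a vision backbone pretrained on a large image dataset, freeze it, and fit only a small task head (here, a hypothetical grasp-success predictor) on a handful of trial-and-error samples. This is not the thesis' actual algorithm; it assumes PyTorch with torchvision >= 0.13, and random tensors stand in for real robot data.

    # Illustrative sketch only -- not the method from this thesis.
    import torch
    import torch.nn as nn
    from torchvision import models

    # Pretrained backbone: its visual features transfer, so keep it frozen.
    # (Downloads ImageNet weights on first use.)
    backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    backbone.fc = nn.Identity()          # expose the 512-d feature vector
    for p in backbone.parameters():
        p.requires_grad = False

    # Small task head trained from the robot's own interaction data:
    # predicts the probability that an attempted grasp succeeds.
    head = nn.Sequential(nn.Linear(512, 128), nn.ReLU(), nn.Linear(128, 1))

    optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    # Hypothetical few-shot batch: image crops and success labels collected
    # from physical grasp attempts (random tensors as stand-ins here).
    images = torch.randn(16, 3, 224, 224)
    labels = torch.randint(0, 2, (16, 1)).float()

    for _ in range(50):                  # a few passes suffice for a tiny head
        with torch.no_grad():
            feats = backbone(images)     # frozen, pretrained features
        logits = head(feats)
        loss = loss_fn(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

Because only the small head is updated, the number of robot-environment interactions needed to adapt can stay small, which is the spirit of the data-efficiency claims in the abstract.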

Bibliographic Details
Main Author: Lin, Yen-Chen
Other Authors: Isola, Phillip J.
Format: Thesis (S.M.)
Department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Published: Massachusetts Institute of Technology, 2022
Online Access: https://hdl.handle.net/1721.1/139048
Rights: In Copyright - Educational Use Permitted (http://rightsstatements.org/page/InC-EDU/1.0/)