Imitation learning from demonstration videos
Imitation learning is a challenging and meaningful task: encoding prior knowledge into a motion control policy that autonomously guides a robot's movement and trajectory to complete a specified assignment from a given current state. However, effective translation from prior knowledge to control...
Main Author: | Zeng, Jingbo |
---|---|
Other Authors: | Tan Yap Peng |
Format: | Thesis-Master by Coursework |
Language: | English |
Published: | Nanyang Technological University, 2024 |
Subjects: | Engineering; Imitation learning; Hand pose estimation; Dexterous manipulation |
Online Access: | https://hdl.handle.net/10356/177091 |
_version_ | 1826121882811760640 |
---|---|
author | Zeng, Jingbo |
author2 | Tan Yap Peng |
author_facet | Tan Yap Peng Zeng, Jingbo |
author_sort | Zeng, Jingbo |
collection | NTU |
description | Imitation learning is a challenging and meaningful task: encoding prior knowledge into a motion control policy that autonomously guides a robot's movement and trajectory to complete a specified assignment from a given current state. However, effective translation from prior knowledge to control rules remains relatively unexplored. In this dissertation, we introduce an imitation learning method for encoding prior knowledge from dexterous manipulation demonstration videos. Instead of adopting behavior cloning or pure RL algorithms, our model considers two online RL algorithms: 1) Demo Augmented Policy Gradients (DAPG) and 2) Generative Adversarial Imitation Learning (GAIL). Given the requirement to encode finger actions in the demonstrations, we selected MANO as the baseline for hand pose estimation and designed a SuperPoint-based module to refine the detection results. Quantitative experimental results show that our framework exploits hand pose estimation effectively on different datasets and uses imitation learning to achieve strong overall performance on three defined tasks. Moreover, it generalizes well when deployed on unseen objects. Visual results show that the proposed framework can be applied in combination with prior knowledge from demonstration videos, providing a possible solution for robots imitating human behaviors. |
first_indexed | 2024-10-01T05:39:46Z |
format | Thesis-Master by Coursework |
id | ntu-10356/177091 |
institution | Nanyang Technological University |
language | English |
last_indexed | 2024-10-01T05:39:46Z |
publishDate | 2024 |
publisher | Nanyang Technological University |
record_format | dspace |
spelling | ntu-10356/1770912024-05-24T15:56:09Z Imitation learning from demonstration videos Zeng, Jingbo Tan Yap Peng School of Electrical and Electronic Engineering EYPTan@ntu.edu.sg Engineering Imitation learning Hand pose estimation Dexterous manipulation Imitation learning is a challenging and meaningful task: encoding prior knowledge into a motion control policy that autonomously guides a robot's movement and trajectory to complete a specified assignment from a given current state. However, effective translation from prior knowledge to control rules remains relatively unexplored. In this dissertation, we introduce an imitation learning method for encoding prior knowledge from dexterous manipulation demonstration videos. Instead of adopting behavior cloning or pure RL algorithms, our model considers two online RL algorithms: 1) Demo Augmented Policy Gradients (DAPG) and 2) Generative Adversarial Imitation Learning (GAIL). Given the requirement to encode finger actions in the demonstrations, we selected MANO as the baseline for hand pose estimation and designed a SuperPoint-based module to refine the detection results. Quantitative experimental results show that our framework exploits hand pose estimation effectively on different datasets and uses imitation learning to achieve strong overall performance on three defined tasks. Moreover, it generalizes well when deployed on unseen objects. Visual results show that the proposed framework can be applied in combination with prior knowledge from demonstration videos, providing a possible solution for robots imitating human behaviors. Master's degree 2024-05-22T23:48:05Z 2024-05-22T23:48:05Z 2024 Thesis-Master by Coursework Zeng, J. (2024). Imitation learning from demonstration videos. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/177091 https://hdl.handle.net/10356/177091 en application/pdf Nanyang Technological University |
spellingShingle | Engineering Imitation learning Hand pose estimation Dexterous manipulation Zeng, Jingbo Imitation learning from demonstration videos |
title | Imitation learning from demonstration videos |
title_full | Imitation learning from demonstration videos |
title_fullStr | Imitation learning from demonstration videos |
title_full_unstemmed | Imitation learning from demonstration videos |
title_short | Imitation learning from demonstration videos |
title_sort | imitation learning from demonstration videos |
topic | Engineering Imitation learning Hand pose estimation Dexterous manipulation |
url | https://hdl.handle.net/10356/177091 |
work_keys_str_mv | AT zengjingbo imitationlearningfromdemonstrationvideos |