A methodology to model and simulate customized realistic anthropomorphic robotic hands
When building robotic hands, researchers are always faced with two main issues: how to make robotic hands look human-like and how to make them function like real hands. Most existing solutions address these issues by manually modelling the robotic hand [10-18]. However, the design processes...
Main Authors: | Tian, Li; Magnenat-Thalmann, Nadia; Thalmann, Daniel; Zheng, Jianmin |
---|---|
Other Authors: | School of Computer Science and Engineering |
Format: | Conference Paper |
Language: | English |
Published: | 2020 |
Subjects: | Engineering::Computer science and engineering; Robotics; Embedded Systems |
Online Access: | https://hdl.handle.net/10356/138936 |
_version_ | 1826124917543796736 |
---|---|
author | Tian, Li Magnenat-Thalmann, Nadia Thalmann, Daniel Zheng, Jianmin |
author2 | School of Computer Science and Engineering |
author_facet | School of Computer Science and Engineering Tian, Li Magnenat-Thalmann, Nadia Thalmann, Daniel Zheng, Jianmin |
author_sort | Tian, Li |
collection | NTU |
description | When building robotic hands, researchers are always faced with two main issues: how to make robotic hands look human-like and how to make them function like real hands. Most existing solutions address these issues by manually modelling the robotic hand [10-18]. However, the design processes are long, and it is difficult to duplicate the geometric shape of a human hand. To solve these two issues, this paper presents a simple and effective method that combines 3D printing and digitization techniques to create a 3D printable cable-driven robotic hand from a scan of a physical hand. The method involves segmenting the 3D scanned hand model, adding joints, and converting it into a 3D printable model. Compared to other robotic solutions, our solution retains more than 90% of the geometric information of a human hand, which is attained from 3D scanning. Our modelling process takes around 15 minutes, which includes 10 minutes of 3D scanning and five minutes for converting the scanned model into an articulated model by running our algorithm. Compared to other articulated modelling solutions [19, 20], our solution is compatible with an actuation system, which gives our robotic hand the ability to mimic different gestures. We have also developed a way of representing hand skeletons based on hand anthropometrics. As a proof of concept, we demonstrate our robotic hand's performance in grasping experiments. |
first_indexed | 2024-10-01T06:28:12Z |
format | Conference Paper |
id | ntu-10356/138936 |
institution | Nanyang Technological University |
language | English |
last_indexed | 2024-10-01T06:28:12Z |
publishDate | 2020 |
record_format | dspace |
spelling | ntu-10356/138936 2020-09-26T21:53:04Z A methodology to model and simulate customized realistic anthropomorphic robotic hands Tian, Li Magnenat-Thalmann, Nadia Thalmann, Daniel Zheng, Jianmin School of Computer Science and Engineering CGI 2018: Proceedings of Computer Graphics International 2018 Institute for Media Innovation (IMI) Engineering::Computer science and engineering Robotics Embedded Systems When building robotic hands, researchers are always faced with two main issues: how to make robotic hands look human-like and how to make them function like real hands. Most existing solutions address these issues by manually modelling the robotic hand [10-18]. However, the design processes are long, and it is difficult to duplicate the geometric shape of a human hand. To solve these two issues, this paper presents a simple and effective method that combines 3D printing and digitization techniques to create a 3D printable cable-driven robotic hand from a scan of a physical hand. The method involves segmenting the 3D scanned hand model, adding joints, and converting it into a 3D printable model. Compared to other robotic solutions, our solution retains more than 90% of the geometric information of a human hand, which is attained from 3D scanning. Our modelling process takes around 15 minutes, which includes 10 minutes of 3D scanning and five minutes for converting the scanned model into an articulated model by running our algorithm. Compared to other articulated modelling solutions [19, 20], our solution is compatible with an actuation system, which gives our robotic hand the ability to mimic different gestures. We have also developed a way of representing hand skeletons based on hand anthropometrics. As a proof of concept, we demonstrate our robotic hand's performance in grasping experiments. NRF (Natl Research Foundation, S’pore) Accepted version 2020-05-14T04:09:51Z 2020-05-14T04:09:51Z 2018 Conference Paper Tian, L., Magnenat-Thalmann, N., Thalmann, D., & Zheng, J. (2018). A methodology to model and simulate customized realistic anthropomorphic robotic hands. Proceedings of Computer Graphics International 2018, 153-162. doi:10.1145/3208159.3208182 https://hdl.handle.net/10356/138936 10.1145/3208159.3208182 153 162 en © 2018 Association for Computing Machinery. All rights reserved. This paper was published in CGI 2018: Proceedings of Computer Graphics International 2018 and is made available with permission of Association for Computing Machinery. application/pdf |
spellingShingle | Engineering::Computer science and engineering Robotics Embedded Systems Tian, Li Magnenat-Thalmann, Nadia Thalmann, Daniel Zheng, Jianmin A methodology to model and simulate customized realistic anthropomorphic robotic hands |
title | A methodology to model and simulate customized realistic anthropomorphic robotic hands |
title_full | A methodology to model and simulate customized realistic anthropomorphic robotic hands |
title_fullStr | A methodology to model and simulate customized realistic anthropomorphic robotic hands |
title_full_unstemmed | A methodology to model and simulate customized realistic anthropomorphic robotic hands |
title_short | A methodology to model and simulate customized realistic anthropomorphic robotic hands |
title_sort | methodology to model and simulate customized realistic anthropomorphic robotic hands |
topic | Engineering::Computer science and engineering Robotics Embedded Systems |
url | https://hdl.handle.net/10356/138936 |
work_keys_str_mv | AT tianli amethodologytomodelandsimulatecustomizedrealisticanthropomorphicrobotichands AT magnenatthalmannnadia amethodologytomodelandsimulatecustomizedrealisticanthropomorphicrobotichands AT thalmanndaniel amethodologytomodelandsimulatecustomizedrealisticanthropomorphicrobotichands AT zhengjianmin amethodologytomodelandsimulatecustomizedrealisticanthropomorphicrobotichands AT tianli methodologytomodelandsimulatecustomizedrealisticanthropomorphicrobotichands AT magnenatthalmannnadia methodologytomodelandsimulatecustomizedrealisticanthropomorphicrobotichands AT thalmanndaniel methodologytomodelandsimulatecustomizedrealisticanthropomorphicrobotichands AT zhengjianmin methodologytomodelandsimulatecustomizedrealisticanthropomorphicrobotichands |
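The description above mentions a hand-skeleton representation based on hand anthropometrics but gives no implementation details. The following is a minimal, hypothetical sketch of how such a representation might be parameterized from a single hand-length measurement; the ratio values, the `PHALANX_RATIOS` table, and the `build_finger_chain` helper are illustrative assumptions, not the authors' code or data.

```python
# Minimal sketch (not the paper's implementation) of deriving per-finger joint
# chains from one measured hand length using anthropometric segment-length ratios.
from dataclasses import dataclass
from typing import List

# Hypothetical phalanx-length ratios (proximal, middle, distal) relative to
# total hand length -- placeholder values for illustration only.
PHALANX_RATIOS = {
    "index":  (0.245, 0.143, 0.097),
    "middle": (0.266, 0.170, 0.108),
    "ring":   (0.244, 0.165, 0.107),
    "little": (0.204, 0.117, 0.093),
}

@dataclass
class Joint:
    name: str
    length_mm: float  # length of the bone segment that starts at this joint

def build_finger_chain(finger: str, hand_length_mm: float) -> List[Joint]:
    """Return an MCP -> PIP -> DIP joint chain scaled to the given hand length."""
    names = ("MCP", "PIP", "DIP")
    ratios = PHALANX_RATIOS[finger]
    return [Joint(f"{finger}_{n}", hand_length_mm * r) for n, r in zip(names, ratios)]

if __name__ == "__main__":
    for joint in build_finger_chain("index", hand_length_mm=185.0):
        print(f"{joint.name}: {joint.length_mm:.1f} mm")
```

In the pipeline described in the abstract, the segment geometry comes directly from the 3D-scanned hand, so a ratio table like this would at most serve as a fallback or a sanity check against published anthropometric data rather than as the primary source of the skeleton.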