Teaching semantics and skills for human-robot collaboration
Recent advances in robotics allow humans and machines to collaborate on tasks at home or in industrial settings without endangering the user. While humans can easily adapt to each other and work in a team, this is not as trivial for robots. In their case, interaction skills...
Main Authors: | Angleraud, Alexandre; Houbre, Quentin; Pieters, Roel |
---|---|
Format: | Article |
Language: | English |
Published: | De Gruyter, 2019-09-01 |
Series: | Paladyn |
Subjects: | human-robot interaction; cognitive architecture; knowledge representation and reasoning; symbol grounding; semiotics |
Online Access: | https://doi.org/10.1515/pjbr-2019-0025 |
_version_ | 1797426471132725248 |
author | Angleraud, Alexandre; Houbre, Quentin; Pieters, Roel |
author_facet | Angleraud, Alexandre; Houbre, Quentin; Pieters, Roel |
author_sort | Angleraud Alexandre |
collection | DOAJ |
description | Recent advances in robotics allow humans and machines to collaborate on tasks at home or in industrial settings without endangering the user. While humans can easily adapt to each other and work in a team, this is not as trivial for robots. In their case, interaction skills typically come at the cost of extensive programming and teaching. Moreover, understanding the semantics of a task is necessary to work efficiently and to react to changes during task execution. Seamless collaboration therefore requires appropriate reasoning, learning skills and interaction capabilities. For us humans, a cornerstone of communication is language, which we use to teach, coordinate and communicate. In this paper we therefore propose a system that allows (i) teaching new action semantics based on already available knowledge and (ii) using natural language communication to resolve ambiguities that may arise while giving commands to the robot. Reasoning then allows new skills to be performed either autonomously or in collaboration with a human. Teaching occurs through a web application, and motions are learned by physical demonstration with the robotic arm. We demonstrate the utility of our system in two scenarios and reflect on the challenges it introduces. |
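The mechanism the abstract describes, teaching new action semantics on top of existing knowledge and using natural language to resolve ambiguous commands, can be sketched roughly as follows. This is purely an illustrative sketch, not the authors' implementation; the `ActionKB` class, its verb/target slot structure and the example actions are all hypothetical.

```python
# Illustrative sketch only (not the authors' implementation): a tiny action
# knowledge base in which new action semantics are taught on top of existing
# entries, and ambiguous commands trigger a clarifying question.

class ActionKB:
    def __init__(self):
        # action name -> semantic slots (here just a verb and a target object)
        self.actions = {}

    def teach(self, name, verb, target):
        """Register the semantics of a new action."""
        self.actions[name] = {"verb": verb, "target": target}

    def interpret(self, command):
        """Return candidate actions; several candidates signal ambiguity."""
        words = set(command.lower().split())
        matches = [n for n, s in self.actions.items() if s["verb"] in words]
        # Narrow by target object if the command names one explicitly.
        narrowed = [n for n in matches if self.actions[n]["target"] in words]
        return narrowed or matches

    def resolve(self, command):
        """Either execute a unique match or ask a clarifying question."""
        matches = self.interpret(command)
        if len(matches) == 1:
            return ("execute", matches[0])
        if not matches:
            return ("clarify", "I do not know that action yet.")
        options = " or the ".join(self.actions[m]["target"] for m in matches)
        return ("clarify", f"Do you mean the {options}?")

kb = ActionKB()
kb.teach("pick_bolt", "pick", "bolt")
kb.teach("pick_nut", "pick", "nut")
print(kb.resolve("pick the bolt"))      # unique match -> ("execute", "pick_bolt")
print(kb.resolve("please pick it up"))  # ambiguous -> clarifying question
```

Under these assumptions the clarifying question is generated from the conflicting target objects, mirroring the paper's idea of letting dialogue, rather than reprogramming, resolve ambiguity in commands.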
first_indexed | 2024-03-09T08:31:42Z |
format | Article |
id | doaj.art-52b5eac5b39a44309f10f0c22de0c3b1 |
institution | Directory Open Access Journal |
issn | 2081-4836 |
language | English |
last_indexed | 2024-03-09T08:31:42Z |
publishDate | 2019-09-01 |
publisher | De Gruyter |
record_format | Article |
series | Paladyn |
spelling | doaj.art-52b5eac5b39a44309f10f0c22de0c3b1 (2023-12-02T19:47:24Z); eng; De Gruyter; Paladyn; 2081-4836; 2019-09-01; vol. 10, iss. 1, pp. 318-329; 10.1515/pjbr-2019-0025; pjbr-2019-0025; Teaching semantics and skills for human-robot collaboration; Angleraud, Alexandre; Houbre, Quentin; Pieters, Roel; affiliation (all authors): Automation Technology and Mechanical Engineering, Cognitive Robotics, Tampere University, 33720 Tampere, Finland; abstract as in the description field above; https://doi.org/10.1515/pjbr-2019-0025; human-robot interaction; cognitive architecture; knowledge representation and reasoning; symbol grounding; semiotics |
spellingShingle | Angleraud, Alexandre; Houbre, Quentin; Pieters, Roel; Teaching semantics and skills for human-robot collaboration; Paladyn; human-robot interaction; cognitive architecture; knowledge representation and reasoning; symbol grounding; semiotics |
title | Teaching semantics and skills for human-robot collaboration |
title_full | Teaching semantics and skills for human-robot collaboration |
title_fullStr | Teaching semantics and skills for human-robot collaboration |
title_full_unstemmed | Teaching semantics and skills for human-robot collaboration |
title_short | Teaching semantics and skills for human-robot collaboration |
title_sort | teaching semantics and skills for human robot collaboration |
topic | human-robot interaction; cognitive architecture; knowledge representation and reasoning; symbol grounding; semiotics |
url | https://doi.org/10.1515/pjbr-2019-0025 |
work_keys_str_mv | AT angleraudalexandre teachingsemanticsandskillsforhumanrobotcollaboration AT houbrequentin teachingsemanticsandskillsforhumanrobotcollaboration AT pietersroel teachingsemanticsandskillsforhumanrobotcollaboration |