Crowdsourced Evaluation of Robot Programming Environments: Methodology and Application


Bibliographic Details
Main Authors: Daria Piacun, Tudor B. Ionescu, Sebastian Schlund
Format: Article
Language: English
Published: MDPI AG, 2021-11-01
Series: Applied Sciences
Subjects: robot programming; user interface evaluation; crowdsourcing
Online Access: https://www.mdpi.com/2076-3417/11/22/10903
author Daria Piacun
Tudor B. Ionescu
Sebastian Schlund
collection DOAJ
description Industrial robot programming tools increasingly rely on graphical interfaces, which aim to make the programming task accessible to a wide variety of users. The usability of such tools is typically evaluated in controlled environments, such as laboratories or companies, in which a group of participants is asked to carry out several tasks using the tool and then fill out a standardized questionnaire. In this context, this paper proposes and evaluates an alternative evaluation methodology, which leverages online crowdsourcing platforms to produce the same results as face-to-face evaluations. We applied the proposed framework in the evaluation of a web-based industrial robot programming tool called <i>Assembly</i>. Our results suggest that crowdsourcing facilitates a cost-effective, result-oriented, and reusable methodology for performing user studies anonymously and online.
format Article
id doaj.art-440542a8f55d41d6844bf133fc6d40d2
institution Directory Open Access Journal
issn 2076-3417
language English
publishDate 2021-11-01
publisher MDPI AG
record_format Article
series Applied Sciences
doi 10.3390/app112210903
affiliation Human-Machine Interaction Group, Vienna University of Technology, 1040 Vienna, Austria (Daria Piacun; Tudor B. Ionescu; Sebastian Schlund)
title Crowdsourced Evaluation of Robot Programming Environments: Methodology and Application
topic robot programming
user interface evaluation
crowdsourcing
url https://www.mdpi.com/2076-3417/11/22/10903