Gesture commands for controlling high-level UAV behavior
Abstract: Directing groups of unmanned air vehicles (UAVs) is a task that typically requires the full attention of several operators. This can be prohibitive in situations where an operator must pay attention to their surroundings. In this paper we present a gesture device that assists operators in commanding UAVs in focus-constrained environments. The operator influences the UAVs’ behavior by using intuitive hand gesture movements. Gestures are captured using an accelerometer and gyroscope and then classified using a logistic regression model. Ten gestures were chosen to provide behaviors for a group of fixed-wing UAVs. These behaviors specified various searching, following, and tracking patterns that could be used in a dynamic environment. A novel variant of the Monte Carlo Tree Search algorithm was developed to autonomously plan the paths of the cooperating UAVs. These autonomy algorithms were executed when their corresponding gesture was recognized by the gesture device. The gesture device was trained to classify the ten gestures and accurately identified them 95% of the time. Each of the behaviors associated with the gestures was tested in hardware-in-the-loop simulations and the ability to dynamically switch between them was demonstrated. The results show that the system can be used as a natural interface to assist an operator in directing a fleet of UAVs.
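The abstract states that accelerometer and gyroscope signals are classified into one of ten gestures with a logistic regression model. As a rough sketch of such a pipeline (not the paper's implementation; the window length, per-axis summary-statistic features, and synthetic training data below are assumptions made for illustration), each recorded gesture window can be reduced to a fixed-length feature vector and fed to scikit-learn's LogisticRegression:

```python
# Sketch only: the article classifies IMU gestures with logistic regression,
# but the 120-sample windows, per-axis summary statistics, and synthetic
# data below are illustrative assumptions, not details from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def extract_features(window):
    """Reduce one gesture window to a fixed-length feature vector.

    window: (T, 6) array of accelerometer (ax, ay, az) and gyroscope
    (gx, gy, gz) samples; simple per-axis statistics stand in for
    whatever features the authors actually used.
    """
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

# Hypothetical dataset: 500 recorded gestures, 120 IMU samples each,
# labeled with one of the ten gesture classes.
rng = np.random.default_rng(0)
windows = rng.normal(size=(500, 120, 6))
labels = rng.integers(0, 10, size=500)
X = np.array([extract_features(w) for w in windows])

X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

With real labeled recordings in place of the synthetic arrays, the held-out accuracy printed at the end would be the analogue of the 95% classification rate the article reports.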
Main Authors: | John Akagi, T. Devon Morris, Brady Moon, Xingguang Chen, Cameron K. Peterson |
---|---|
Format: | Article |
Language: | English |
Published: | Springer, 2021-05-01 |
Series: | SN Applied Sciences |
Subjects: | Autonomous vehicles, Cooperative control, Human–robot interaction, Gesture interface device |
Online Access: | https://doi.org/10.1007/s42452-021-04583-8 |
author | John Akagi, T. Devon Morris, Brady Moon, Xingguang Chen, Cameron K. Peterson
collection | DOAJ |
description | Abstract: Directing groups of unmanned air vehicles (UAVs) is a task that typically requires the full attention of several operators. This can be prohibitive in situations where an operator must pay attention to their surroundings. In this paper we present a gesture device that assists operators in commanding UAVs in focus-constrained environments. The operator influences the UAVs’ behavior by using intuitive hand gesture movements. Gestures are captured using an accelerometer and gyroscope and then classified using a logistic regression model. Ten gestures were chosen to provide behaviors for a group of fixed-wing UAVs. These behaviors specified various searching, following, and tracking patterns that could be used in a dynamic environment. A novel variant of the Monte Carlo Tree Search algorithm was developed to autonomously plan the paths of the cooperating UAVs. These autonomy algorithms were executed when their corresponding gesture was recognized by the gesture device. The gesture device was trained to classify the ten gestures and accurately identified them 95% of the time. Each of the behaviors associated with the gestures was tested in hardware-in-the-loop simulations and the ability to dynamically switch between them was demonstrated. The results show that the system can be used as a natural interface to assist an operator in directing a fleet of UAVs.
Article highlights:
- A gesture device was created that enables operators to command a group of UAVs in focus-constrained environments.
- Each gesture triggers high-level commands that direct a UAV group to execute complex behaviors.
- Software simulations and hardware-in-the-loop testing show the device is effective in directing UAV groups.
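The description credits path planning to a novel variant of Monte Carlo Tree Search, but the record does not spell the variant out. For orientation only, here is a minimal generic UCT loop over a hypothetical 2D waypoint grid, showing the standard select / expand / simulate / backpropagate skeleton that MCTS variants typically build on:

```python
# Generic UCT sketch over a hypothetical 2D waypoint grid. The article's
# novel multi-UAV MCTS variant is not described in this record; only the
# standard select / expand / simulate / backpropagate loop is shown.
import math
import random

ACTIONS = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # hypothetical grid moves

class Node:
    def __init__(self, state, parent=None):
        self.state, self.parent = state, parent
        self.children = {}   # action -> Node
        self.visits, self.value = 0, 0.0

    def ucb_child(self, c=1.4):
        # UCB1: trade off mean rollout value against an exploration bonus.
        return max(self.children.values(),
                   key=lambda n: n.value / n.visits
                   + c * math.sqrt(math.log(self.visits) / n.visits))

def step(state, action):
    return (state[0] + action[0], state[1] + action[1])

def rollout(state, goal, horizon=20):
    # Random playout; reward favors ending near a hypothetical target cell.
    for _ in range(horizon):
        state = step(state, random.choice(ACTIONS))
    return 1.0 / (1.0 + abs(state[0] - goal[0]) + abs(state[1] - goal[1]))

def mcts(root_state, goal, iters=2000):
    root = Node(root_state)
    for _ in range(iters):
        node = root
        # 1. Selection: descend through fully expanded nodes via UCB1.
        while node.children and len(node.children) == len(ACTIONS):
            node = node.ucb_child()
        # 2. Expansion: try one action not yet attempted from this node.
        action = random.choice([a for a in ACTIONS if a not in node.children])
        child = Node(step(node.state, action), parent=node)
        node.children[action] = child
        # 3. Simulation: estimate the new node's value with a random playout.
        reward = rollout(child.state, goal)
        # 4. Backpropagation: push the result up to the root.
        while child is not None:
            child.visits += 1
            child.value += reward
            child = child.parent
    # Commit to the most-visited first move, a common MCTS action rule.
    return max(root.children, key=lambda a: root.children[a].visits)

if __name__ == "__main__":
    print(mcts((0, 0), goal=(3, 4)))
```

In the cooperative fixed-wing setting the state, actions, and rollout reward would be replaced by vehicle positions, feasible maneuvers, and mission-specific objectives for searching, following, or tracking; those substitutions are precisely the part this record does not describe.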
format | Article |
id | doaj.art-fc4cc1ccc905476ea8c891168869684a |
institution | Directory Open Access Journal |
issn | 2523-3963 2523-3971 |
language | English |
publishDate | 2021-05-01 |
publisher | Springer |
record_format | Article |
series | SN Applied Sciences |
affiliations | John Akagi: Department of Mechanical Engineering, Brigham Young University; T. Devon Morris: Department of Computer and Electrical Engineering, Brigham Young University; Brady Moon: Department of Computer and Electrical Engineering, Brigham Young University; Xingguang Chen: School of Electronics and Information Technology, Sun Yat-sen University; Cameron K. Peterson: Department of Computer and Electrical Engineering, Brigham Young University
title | Gesture commands for controlling high-level UAV behavior |
topic | Autonomous vehicles; Cooperative control; Human–robot interaction; Gesture interface device
url | https://doi.org/10.1007/s42452-021-04583-8 |