Integrating Eye- and Mouse-Tracking with Assistant Based Speech Recognition for Interaction at Controller Working Positions


Bibliographic Details
Main Authors: Oliver Ohneiser, Jyothsna Adamala, Ioan-Teodor Salomea
Format: Article
Language: English
Published: MDPI AG, 2021-09-01
Series: Aerospace
Subjects: air traffic controller; human machine interaction; multimodality; eye-tracking; mouse-tracking; automatic speech recognition
Online Access: https://www.mdpi.com/2226-4310/8/9/245
Collection: DOAJ
Description: Assistant based speech recognition (ABSR) prototypes for air traffic controllers have been demonstrated to reduce controller workload and, as a result, aircraft flight times. However, two aspects of ABSR could be improved to enhance these benefits: (1) the predicted controller commands that speech recognition engines use could be more accurate, and (2) the controller's confirmation of ABSR recognition output, such as callsigns, command types, and values, could be less intrusive. Both tasks can be supported unobtrusively by eye- and mouse-tracking, using operators' gaze and interaction data. First, probabilities for predicted commands should consider controllers' visual focus on the situation data display. Controllers are more likely to give commands to aircraft that they focus on or have interacted with via mouse on the display. Furthermore, they are more likely to give certain command types depending on the characteristics of the multiple aircraft being scanned. Second, eye-tracking can determine, without additional mouse clicks, whether the displayed ABSR output has been checked by the controller. If the output remains uncorrected for a certain amount of time, it is assumed to be correct and becomes usable by other air traffic control systems, e.g., short-term conflict alert. If the ABSR output remains unchecked, an attention guidance function triggers different escalation levels of visual cues. In a one-shot experimental case study with two controllers testing the two implemented techniques, (1) command prediction probabilities improved by a factor of four, (2) prediction error rates, based on an accuracy metric for the three most probable aircraft, decreased by a factor of 25 when combining eye- and mouse-tracking data, and (3) visual confirmation of ABSR output promises to be an alternative to manual confirmation.
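The first technique in the abstract, weighting predicted-command probabilities by the controller's visual and mouse attention, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the boost factors, and the example callsigns and commands are all assumptions.

```python
# Hypothetical sketch: re-weight predicted-command probabilities by the
# attention each aircraft received (gaze fixations, mouse interactions),
# then renormalize so the probabilities again sum to 1.

def reweight_predictions(command_probs, gazed_callsigns, clicked_callsigns,
                         gaze_boost=2.0, click_boost=3.0):
    """command_probs maps (callsign, command) pairs to prior probabilities."""
    weighted = {}
    for (callsign, command), p in command_probs.items():
        w = 1.0
        if callsign in gazed_callsigns:
            w *= gaze_boost        # aircraft recently fixated on the display
        if callsign in clicked_callsigns:
            w *= click_boost       # aircraft recently clicked on the display
        weighted[(callsign, command)] = p * w
    total = sum(weighted.values())
    return {k: v / total for k, v in weighted.items()}

# Example: the controller has been looking at DLH123, so commands to it
# become more probable than the prior suggested.
probs = {("DLH123", "DESCEND"): 0.2,
         ("BAW45", "TURN_LEFT"): 0.5,
         ("DLH123", "REDUCE_SPEED"): 0.3}
adjusted = reweight_predictions(probs, gazed_callsigns={"DLH123"},
                                clicked_callsigns=set())
```

In this sketch the DLH123 commands roughly trade places with the BAW45 command in the ranking once gaze evidence is added, which mirrors the abstract's claim that attention data sharpens the prediction.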
ISSN: 2226-4310
Author affiliations:
Oliver Ohneiser: German Aerospace Center (DLR), Institute of Flight Guidance, Lilienthalplatz 7, 38108 Braunschweig, Germany
Jyothsna Adamala: Faculty of Informatics, Automotive Software Engineering, Technische Universität Chemnitz, Straße der Nationen 62, 09111 Chemnitz, Germany
Ioan-Teodor Salomea: Faculty of Aerospace Engineering, "Politehnica" University of Bucharest, Str. Gh. Polizu No. 1, 1st District, 010737 Bucharest, Romania
DOI: 10.3390/aerospace8090245 (Aerospace, Vol. 8, No. 9, Art. 245, 2021-09-01)
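The second technique in the abstract, accepting ABSR output once the controller's gaze has dwelt on it and no correction follows, and otherwise escalating visual cues, could be sketched like this. The thresholds, number of escalation levels, and all names are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch of gaze-based confirmation of displayed ABSR output.
# If the output was visually checked (enough gaze dwell) and stays
# uncorrected past a timeout, it is accepted for downstream ATC systems
# (e.g., short-term conflict alert). If it was never checked, attention
# guidance escalates visual cues over time.

CONFIRM_DWELL_S = 1.0                   # gaze dwell needed to count as "checked"
ACCEPT_TIMEOUT_S = 4.0                  # uncorrected time after which output is accepted
ESCALATION_STEPS_S = (6.0, 10.0, 15.0)  # times at which cues intensify

def confirmation_state(dwell_s, since_display_s, corrected):
    """Return the state of one displayed ABSR output item."""
    if corrected:
        return "corrected"
    if dwell_s >= CONFIRM_DWELL_S and since_display_s >= ACCEPT_TIMEOUT_S:
        return "accepted"               # checked and left uncorrected long enough
    if dwell_s < CONFIRM_DWELL_S:
        # Output never visually checked: count how many escalation
        # thresholds have already passed.
        level = sum(since_display_s >= t for t in ESCALATION_STEPS_S)
        if level:
            return f"escalation_level_{level}"
    return "pending"
```

The key design point this sketch illustrates is that confirmation becomes implicit: a mouse click is only needed to correct the output, never to accept it.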
Topics: air traffic controller; human machine interaction; multimodality; eye-tracking; mouse-tracking; automatic speech recognition
URL: https://www.mdpi.com/2226-4310/8/9/245