Pilot Behavior Recognition Based on Multi-Modality Fusion Technology Using Physiological Characteristics

With the development of the autopilot system, the main task of a pilot has changed from controlling the aircraft to supervising the autopilot system and making critical decisions. Therefore, the human–machine interaction system needs to be improved accordingly. A key step in improving the human–machine interaction system is to improve its understanding of the pilot's status, including fatigue, stress, and workload. Monitoring pilots' status can effectively prevent human error and achieve optimal human–machine collaboration. As such, there is a need to recognize pilots' status and predict the behaviors responsible for changes of state. For this purpose, in this study, 14 Air Force cadets fly an F-35 Lightning II Joint Strike Fighter simulator through a series of maneuvers involving takeoff, level flight, turn and hover, roll, somersault, and stall. Electrocardiography (ECG), electromyography (EMG), galvanic skin response (GSR), respiration (RESP), and skin temperature (SKT) signals are collected with wearable physiological data acquisition devices. Physiological indicators influenced by the pilot's behavioral status are objectively analyzed. Multi-modality fusion technology (MTF) is adopted to fuse these data at the feature layer, and four classifiers are integrated to identify pilots' behaviors at the strategy layer. The results indicate that MTF helps to recognize pilot behavior in a more comprehensive and precise way.
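To make the two fusion steps in the abstract concrete, the Python sketch below illustrates feature-layer fusion (concatenating per-modality feature vectors into a single vector) followed by strategy-layer integration of four classifiers. It is a minimal, hypothetical example: the feature dimensions, the random placeholder data, and the choice of base classifiers (SVM, k-NN, random forest, logistic regression) are assumptions for illustration only and are not taken from the paper.

import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Hypothetical per-window feature blocks, one per modality.
# Shapes and feature choices are placeholders, not values from the study.
n_windows = 500
ecg_feats  = np.random.rand(n_windows, 8)   # e.g. HRV statistics
emg_feats  = np.random.rand(n_windows, 6)   # e.g. RMS, median frequency
gsr_feats  = np.random.rand(n_windows, 4)   # e.g. tonic/phasic summaries
resp_feats = np.random.rand(n_windows, 4)   # e.g. rate, depth
skt_feats  = np.random.rand(n_windows, 2)   # e.g. mean, slope
labels = np.random.randint(0, 6, n_windows) # six maneuver classes

# Feature-layer fusion: concatenate all modality features into one vector.
X = np.hstack([ecg_feats, emg_feats, gsr_feats, resp_feats, skt_feats])

# Strategy-layer integration: combine four base classifiers by soft voting.
ensemble = VotingClassifier(
    estimators=[
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
        ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier())),
        ("rf",  RandomForestClassifier(n_estimators=200)),
        ("lr",  make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ],
    voting="soft",
)

scores = cross_val_score(ensemble, X, labels, cv=5)
print("mean CV accuracy:", scores.mean())

On random placeholder data the accuracy is near chance; the point of the sketch is only the structure: per-modality features fused at the feature layer, then several classifiers combined at the strategy layer rather than relying on a single model.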

Bibliographic Details
Main Authors: Yuhan Li, Ke Li, Shaofan Wang, Xiaodan Chen, Dongsheng Wen
Affiliation: National Key Laboratory of Human Machine and Environment Engineering, School of Aeronautical Science and Engineering, Beihang University, Beijing 100191, China
Format: Article
Language: English
Published: MDPI AG, 2022-06-01
Series: Biosensors (ISSN 2079-6374)
DOI: 10.3390/bios12060404
Subjects: MTF; physiological; behavior recognition; pilot; machine learning; multi-modal
Online Access: https://www.mdpi.com/2079-6374/12/6/404