Event-Assisted Object Tracking on High-Speed Drones in Harsh Illumination Environment


Bibliographic Details
Main Authors: Yuqi Han, Xiaohang Yu, Heng Luan, Jinli Suo
Format: Article
Language: English
Published: MDPI AG, 2024-01-01
Series: Drones
Subjects: drones; harsh illumination; image enhancement; event-assisted object tracking; multi-sensor fusion
Online Access: https://www.mdpi.com/2504-446X/8/1/22
collection DOAJ
description Drones are used in a variety of scenarios, such as atmospheric monitoring, fire rescue, and agricultural irrigation, in which accurate environmental perception is crucial for both decision making and control. Among drone sensors, the RGB camera is indispensable for capturing rich visual information for vehicle navigation, but it faces a major challenge in high-dynamic-range scenes, which frequently occur in real applications. Specifically, the recorded frames suffer from underexposure and overexposure simultaneously, which degrades subsequent vision tasks. To address this problem, we take object tracking as an example and leverage the superior response of event cameras over a large intensity range to propose an event-assisted object tracking algorithm that achieves reliable tracking under large intensity variations. Specifically, we pursue feature matching from dense event signals and, based on this, (i) design a U-Net-based image enhancement algorithm that balances RGB intensity with the help of temporally neighboring frames and (ii) construct a dual-input tracking model that tracks moving objects from the intensity-balanced RGB video and the event sequences. The proposed approach is comprehensively validated in both simulation and real experiments.
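The abstract above describes feeding dense event signals alongside RGB frames into a dual-input tracker. As a generic illustration only (not the authors' implementation), the sketch below shows one common way an asynchronous event stream of (x, y, timestamp, polarity) tuples is accumulated into a frame-like array that a convolutional model can consume; the function name and toy data are assumptions for the example.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate events into a signed 2D frame (a standard event representation).

    events: array of shape (N, 4) with columns x, y, timestamp, polarity (+1/-1).
    Returns a float32 frame where each pixel sums the polarities of its events.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    xs = events[:, 0].astype(int)
    ys = events[:, 1].astype(int)
    pols = events[:, 3].astype(np.float32)
    # unbuffered scatter-add: pixels hit by multiple events accumulate correctly
    np.add.at(frame, (ys, xs), pols)
    return frame

# toy example: three events on a hypothetical 4x4 sensor
events = np.array([
    [1, 2, 0.001, +1],
    [1, 2, 0.002, +1],
    [3, 0, 0.003, -1],
])
frame = events_to_frame(events, 4, 4)
print(frame[2, 1])  # 2.0 (two positive events at x=1, y=2)
print(frame[0, 3])  # -1.0
```

A tensor like this can be stacked with the enhanced RGB frame along the channel axis as the second input of a dual-input network; finer-grained variants slice the event stream into multiple temporal bins.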
id doaj.art-e43b0103eb844178bcb2f33619002e55
institution Directory Open Access Journal
issn 2504-446X
doi 10.3390/drones8010022
affiliations Yuqi Han (Department of Automation, Tsinghua University, Beijing 100084, China); Xiaohang Yu (Tsinghua-UC Berkeley Shenzhen Institute, Shenzhen 518071, China); Heng Luan (Research and Development Center, TravelSky Technology Ltd., Beijing 101318, China); Jinli Suo (Department of Automation, Tsinghua University, Beijing 100084, China)
topic drones
harsh illumination
image enhancement
event-assisted object tracking
multi-sensor fusion
url https://www.mdpi.com/2504-446X/8/1/22