Summary: | We propose an intelligent human-unmanned aerial vehicle (UAV) interaction system in which, instead of a conventional remote controller, the UAV flight actions are controlled by a deep learning-based action-gesture joint detection system. A ResNet-based scene-understanding algorithm is introduced into the proposed system to enable the UAV to adjust its flight strategy automatically according to the flying conditions. Meanwhile, deep learning-based action detection and multi-feature cascade gesture recognition are combined through a cross-validation process to generate the corresponding flight action. The effectiveness and efficiency of the proposed system are confirmed by its application to controlling the flight actions of a real UAV in flight for more than 3 h.