Two-Branch Feature Interaction Fusion Method Based on Generative Adversarial Network


Bibliographic Details
Main Authors: Rong Chang, Junpeng Dang, Nanchuan Zhang, Shan Zhao, Shijin Hu, Lin Xing, Haicheng Bai, Chengjiang Zhou, Yang Yang
Format: Article
Language: English
Published: MDPI AG 2023-08-01
Series: Electronics
Subjects:
Online Access: https://www.mdpi.com/2079-9292/12/16/3442
Description
Summary: This study proposes a fusion method for infrared and visible images based on feature interaction. Existing fusion methods can be classified into two categories: those based on a single-branch network and those based on a two-branch network. Generative adversarial networks are widely used in single-branch-based fusion methods, which ignore the differences in feature extraction caused by different input images. Most two-branch-based fusion methods use convolutional neural networks, which do not take into account the inverse promotion of fusion results and lack interaction between different input features. To remedy the shortcomings of these fusion methods and better utilize the features from source images, this study proposes a two-branch feature interaction method based on a generative adversarial network for visible and infrared image fusion. In the generator, a two-branch feature interaction approach was designed to extract features from the different inputs and realize feature interaction through network connections between the branches. In the discriminator, a double-classification discriminator was used for visible and infrared images. Extensive comparison experiments with state-of-the-art methods demonstrated the advantages of the proposed generative adversarial network based on two-branch feature interaction: it can enhance the texture details of objects in fusion results and reduce interference from noise in the source inputs. These advantages were also confirmed in generalization experiments on object detection.
ISSN: 2079-9292
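The abstract's central idea, separate branches for the infrared and visible inputs that exchange features at each layer via cross-branch connections, can be sketched in plain Python. This is a minimal conceptual illustration, not the authors' implementation: the `extract`, `interact`, and fusion functions here are hypothetical stand-ins for the paper's convolutional layers, cross-branch connections, and fusion head.

```python
# Minimal sketch of a two-branch feature-interaction pattern.
# Hypothetical simplification: real branches would be convolutional networks
# trained adversarially; here features are just lists of floats.

def extract(features, weight):
    """Stand-in for a branch's feature-extraction layer: scale each value."""
    return [weight * f for f in features]

def interact(own, other, alpha=0.5):
    """Cross-branch connection: blend in the other branch's features."""
    return [a + alpha * b for a, b in zip(own, other)]

def two_branch_fusion(ir, vis, num_layers=3):
    """Run both branches with per-layer feature interaction, then fuse."""
    for _ in range(num_layers):
        ir_feat = extract(ir, 0.9)    # infrared branch layer
        vis_feat = extract(vis, 1.1)  # visible branch layer
        # Interaction step: each branch receives the other's features,
        # so information flows between the two inputs at every layer.
        ir, vis = interact(ir_feat, vis_feat), interact(vis_feat, ir_feat)
    # Final fusion: combine the two branch outputs into one result.
    return [(a + b) / 2 for a, b in zip(ir, vis)]

fused = two_branch_fusion([1.0, 2.0], [2.0, 1.0])
```

In the paper's setting, the fused output would additionally be judged by a double-classification discriminator that distinguishes it from both source modalities, driving the generator to preserve infrared contrast and visible texture simultaneously.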