Two‐way constraint network for RGB‐Infrared person re‐identification


Bibliographic Details
Main Authors: Haitang Zeng, Weipeng Hu, Dihu Chen, Haifeng Hu
Format: Article
Language: English
Published: Wiley 2021-08-01
Series: Electronics Letters
Online Access: https://doi.org/10.1049/ell2.12215
Description
Summary: RGB‐Infrared person re‐identification (RGB‐IR Re‐ID) aims to retrieve and match person images across RGB and infrared (IR) modalities. Since most surveillance cameras capture RGB images during the day and IR images at night, RGB‐IR Re‐ID is useful when checking day and night surveillance footage in criminal investigations. Previous work typically extracts only sharable, identity‐related features for identification; few studies specifically extract and exploit features that cannot distinguish identity, e.g. identity‐unrelated features derived from background and modality. In this Letter, we propose a novel and concise RGB‐IR Re‐ID network named the two‐way constraint network (TWCN). Unlike traditional Re‐ID networks, TWCN not only extracts and utilises identity‐related features but also makes full use of identity‐unrelated features to improve matching accuracy. TWCN uses a reverse‐triplet loss to extract identity‐unrelated features and imposes an orthogonal constraint to remove identity‐unrelated information from identity‐related features, improving the purity of the identity‐related features. In addition, a correlation coefficient synergy and central clustering (CCSCC) loss is introduced into TWCN to extract identity‐related features effectively. Extensive experiments demonstrate the effectiveness of the proposed method.
ISSN: 0013-5194
1350-911X