Colour Neural Descriptors for Instance Retrieval Using CNN Features and Colour Models
Image representations in the form of neural activations derived from intermediate layers of deep neural networks are the state-of-the-art descriptors for instance-based retrieval. However, the persisting problem is how to retrieve identical images as the most relevant ones from a large i...
Main Authors: | Surajit Saikia, Laura Fernandez-Robles, Eduardo Fidalgo Fernandez, Enrique Alegre |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2021-01-01 |
Series: | IEEE Access |
Subjects: | Colour neural descriptors; CNN; image retrieval; image representation |
Online Access: | https://ieeexplore.ieee.org/document/9344701/ |
_version_ | 1818418362246496256 |
---|---|
author | Surajit Saikia, Laura Fernandez-Robles, Eduardo Fidalgo Fernandez, Enrique Alegre |
author_facet | Surajit Saikia, Laura Fernandez-Robles, Eduardo Fidalgo Fernandez, Enrique Alegre |
author_sort | Surajit Saikia |
collection | DOAJ |
description | Image representations in the form of neural activations derived from intermediate layers of deep neural networks are the state-of-the-art descriptors for instance-based retrieval. However, the persisting problem is how to retrieve identical images as the most relevant ones from a large image or video corpus. In this work, we introduce colour neural descriptors, which are built from convolutional neural network (CNN) features obtained by combining different colour spaces and colour channels. In contrast to previous works, which rely on fine-tuning pre-trained networks, we compute the proposed descriptors from the activations generated by a pre-trained VGG-16 network without fine-tuning. Moreover, we take advantage of an object detector to optimize the proposed instance retrieval architecture so that it generates features at both local and global scales. In addition, we introduce a stride-based query expansion technique to retrieve objects from multi-view datasets. Finally, we experimentally show that the proposed colour neural descriptors obtain state-of-the-art results on the Paris 6K, Revisiting-Paris 6K, INSTRE-M and COIL-100 datasets, with mAPs of 81.70, 82.02, 78.8 and 97.9, respectively. (A hedged code sketch of the colour-descriptor extraction follows the record fields below.) |
first_indexed | 2024-12-14T12:21:28Z |
format | Article |
id | doaj.art-07cf6bf26a144f59805ecd85dd0833e8 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-14T12:21:28Z |
publishDate | 2021-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-07cf6bf26a144f59805ecd85dd0833e8; 2022-12-21T23:01:27Z; eng; IEEE; IEEE Access; 2169-3536; 2021-01-01; vol. 9, pp. 23218-23234; DOI 10.1109/ACCESS.2021.3056330; document 9344701; Colour Neural Descriptors for Instance Retrieval Using CNN Features and Colour Models; Surajit Saikia (https://orcid.org/0000-0001-7757-1547), Laura Fernandez-Robles, Eduardo Fidalgo Fernandez (https://orcid.org/0000-0003-1202-5232), Enrique Alegre; Department of Electrical, Systems and Automation, University of León, León, Spain; abstract as in the description field above; https://ieeexplore.ieee.org/document/9344701/; Colour neural descriptors; CNN; image retrieval; image representation |
spellingShingle | Surajit Saikia; Laura Fernandez-Robles; Eduardo Fidalgo Fernandez; Enrique Alegre; Colour Neural Descriptors for Instance Retrieval Using CNN Features and Colour Models; IEEE Access; Colour neural descriptors; CNN; image retrieval; image representation |
title | Colour Neural Descriptors for Instance Retrieval Using CNN Features and Colour Models |
title_full | Colour Neural Descriptors for Instance Retrieval Using CNN Features and Colour Models |
title_fullStr | Colour Neural Descriptors for Instance Retrieval Using CNN Features and Colour Models |
title_full_unstemmed | Colour Neural Descriptors for Instance Retrieval Using CNN Features and Colour Models |
title_short | Colour Neural Descriptors for Instance Retrieval Using CNN Features and Colour Models |
title_sort | colour neural descriptors for instance retrieval using cnn features and colour models |
topic | Colour neural descriptors; CNN; image retrieval; image representation |
url | https://ieeexplore.ieee.org/document/9344701/ |
work_keys_str_mv | AT surajitsaikia colourneuraldescriptorsforinstanceretrievalusingcnnfeaturesandcolourmodels AT laurafernandezrobles colourneuraldescriptorsforinstanceretrievalusingcnnfeaturesandcolourmodels AT eduardofidalgofernandez colourneuraldescriptorsforinstanceretrievalusingcnnfeaturesandcolourmodels AT enriquealegre colourneuraldescriptorsforinstanceretrievalusingcnnfeaturesandcolourmodels |
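The description field above outlines the pipeline only at a high level: activations from a frozen, pre-trained VGG-16 are computed over several colour-space renderings of an image and combined into a single descriptor. The sketch below illustrates one minimal way such a colour neural descriptor could be assembled; the choice of layer (last convolutional block), pooling (global max), colour spaces (RGB, HSV, Lab, YCrCb) and L2 normalisation are assumptions made for illustration, not the paper's exact configuration, which additionally uses object-detector-driven local features and stride-based query expansion.

```python
# Hedged sketch (not the authors' exact pipeline): build a "colour neural
# descriptor" by feeding several colour-space renderings of the same image
# through an off-the-shelf, frozen VGG-16 and concatenating pooled activations.
import cv2
import numpy as np
import torch
import torch.nn.functional as F
from torchvision import models

# Frozen, pre-trained VGG-16; no fine-tuning, as stated in the abstract.
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).eval()
conv_layers = vgg.features[:-1]  # output of the last conv block (512 feature maps)

def to_tensor(img_3ch: np.ndarray) -> torch.Tensor:
    """Resize a 3-channel uint8 image and scale it to [0, 1] for the network.
    (ImageNet mean/std normalisation is omitted here for simplicity.)"""
    img = cv2.resize(img_3ch, (224, 224)).astype(np.float32) / 255.0
    return torch.from_numpy(img).permute(2, 0, 1).unsqueeze(0)  # 1x3x224x224

def colour_neural_descriptor(bgr: np.ndarray) -> np.ndarray:
    """Concatenate max-pooled conv activations over several colour spaces."""
    renderings = [
        cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB),    # plain RGB
        cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV),    # hue / saturation / value
        cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB),    # CIE Lab
        cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb),  # luma / chroma
    ]
    parts = []
    with torch.no_grad():
        for img in renderings:
            feat = conv_layers(to_tensor(img))       # 1x512xHxW activations
            pooled = F.adaptive_max_pool2d(feat, 1)  # global max pooling per map
            parts.append(pooled.flatten())           # 512-D per colour space
    desc = torch.cat(parts)                          # 4 x 512 = 2048-D descriptor
    return (desc / desc.norm()).numpy()              # L2-normalise for retrieval
```

Query and database descriptors built this way would then be ranked by dot product (cosine similarity after L2 normalisation) to produce the retrieval list.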