Multi-Scale Attention and Structural Relation Graph for Local Feature Matching

Bibliographic Details
Main Authors: Xiaohu Nan, Lei Ding
Format: Article
Language: English
Published: IEEE, 2022-01-01
Series: IEEE Access
ISSN: 2169-3536
Subjects: Image matching; attention; graph convolution network; deep learning
Online Access: https://ieeexplore.ieee.org/document/9921215/
Description: Building dense correspondences between two images is a fundamental vision problem. Most existing methods rely on local features, but global features cannot be ignored: local features alone are often insufficient to disambiguate visually similar regions. Computing reliable correspondences between images requires both the structural relationships among local features and a measure of each feature's importance. To this end, we propose a novel multi-scale attention and structural relation graph (MASRG) network for local feature matching. MASRG adopts a coarse-to-fine architecture that first establishes coarse-level matches on a coarse feature map and then refines them on a fine-level feature map. We propose a structural relation graph module and a multi-scale attention module, and introduce global context information into the overall architecture. By using global information to assist in learning the structural relationships between local descriptors, the features of different receptive fields, and the importance of individual local features, a limited number of candidate matches can be obtained with high confidence, from which the final matching relationship is predicted. In this way, the network significantly improves matching reliability and localization accuracy. Our method achieves 5.6%, 6.7%, and 6.3% performance gains over the baseline method (see I) under different conditions on HPatches. Extensive experiments on three large-scale datasets (i.e., HPatches, InLoc, and Aachen Day-Night v1.1) demonstrate that the proposed MASRG method outperforms state-of-the-art local feature matching approaches.
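
The coarse-to-fine pipeline outlined in the description can be made concrete with a minimal sketch (PyTorch). This is not the authors' released MASRG implementation; it only illustrates the general two-stage structure, with coarse descriptors matched by a dual-softmax mutual-nearest-neighbour step and the fine-level refinement left as a stub. All names here (coarse_to_fine_match, feat_c0, conf_thresh, temperature) are hypothetical.

    # Illustrative sketch only; not the MASRG source code. It mimics the
    # coarse-to-fine matching pipeline the description outlines: dual-softmax
    # confidence on coarse descriptors, then a (stubbed) fine-level refinement.
    import torch
    import torch.nn.functional as F

    def coarse_to_fine_match(feat_c0, feat_c1, conf_thresh=0.2, temperature=0.1):
        # feat_c0: [N, C] coarse descriptors of image 0; feat_c1: [M, C] of image 1.
        sim = torch.einsum("nc,mc->nm",
                           F.normalize(feat_c0, dim=-1),
                           F.normalize(feat_c1, dim=-1)) / temperature
        # Dual-softmax confidence, as used by coarse-to-fine matchers such as LoFTR.
        conf = F.softmax(sim, dim=0) * F.softmax(sim, dim=1)
        # Keep mutual nearest neighbours whose confidence exceeds the threshold.
        mutual = (conf == conf.max(dim=1, keepdim=True).values) & \
                 (conf == conf.max(dim=0, keepdim=True).values)
        i_ids, j_ids = (mutual & (conf > conf_thresh)).nonzero(as_tuple=True)
        # A full implementation would now crop fine-level windows around each
        # coarse match and regress sub-pixel offsets; that step is omitted here.
        return [(int(i), int(j), float(conf[i, j])) for i, j in zip(i_ids, j_ids)]

On real features the threshold and temperature would need tuning; the sketch is only meant to make the two-stage (coarse match, then fine refinement) structure explicit.
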
DOI: 10.1109/ACCESS.2022.3215168
Volume: 10, Pages: 110603-110615
Author Affiliations: Xiaohu Nan (ORCID: https://orcid.org/0000-0003-4201-2854) and Lei Ding, Shanghai Institute of Technical Physics, Chinese Academy of Sciences, Shanghai, China