Lattice-Point Mutually Guided Ground-to-Aerial Feature Matching for Urban Scene Images

Ground-to-aerial feature matching bridges information from cross-view images, enabling improved urban applications such as pixel-level geolocation and complete urban 3-D reconstruction. However, urban ground and aerial images typically suffer from drastic changes in viewpoint, scale, and illumination, together with repetitive patterns. Direct matching of local features between ground and aerial images is therefore particularly difficult because of the low similarity of local descriptors and the high ambiguity in discriminating true from false matches. To address this challenging task, this article proposes a novel lattice-point mutually guided matching (LPMG) method. We specifically address two key issues: 1) reducing descriptor variance and 2) enhancing true–false match discriminability. The former is solved by recovering the 3-D geometry and appearance of the underlying image region through automatic view rectification of the ground and aerial images. The latter is circumvented by replacing conventional mismatch removal with an LPMG strategy. In this strategy, the topological structure of repeated façade elements (i.e., the lattice) and highly reliable point-matching seeds are first extracted from the rectified ground and aerial images. The point-matching seeds then guide the precise alignment of self-similar lattice tiles across the two views, so that an accurate transformation model can be estimated from the lattice-tile correspondences. Finally, the estimated model supervises the separation of true and false matches within the entire putative match set. Extensive experiments on several datasets show that our method obtains a considerable number of nearly pure correct matches from urban ground and aerial images, significantly outperforming existing methods.
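
The final step of this strategy, using the estimated transformation model to separate true from false putative matches, can be illustrated with a minimal sketch. The following is not the authors' LPMG implementation; it assumes, for illustration only, that the transformation model between the rectified views is a planar homography estimated with OpenCV's RANSAC from lattice-tile correspondences, and all function names and inputs below are hypothetical placeholders.

```python
# Minimal illustrative sketch (NOT the authors' LPMG implementation).
# Assumption: the transformation model between the rectified views is a planar
# homography H estimated from lattice-tile correspondences; putative point
# matches are then kept or rejected by their reprojection error under H.
import numpy as np
import cv2


def estimate_model_from_tiles(tiles_ground: np.ndarray, tiles_aerial: np.ndarray) -> np.ndarray:
    """Estimate a homography from corresponding lattice-tile points.

    tiles_ground, tiles_aerial: hypothetical (N, 2) float arrays of matched
    tile corner coordinates in the rectified ground and aerial images.
    """
    H, _ = cv2.findHomography(tiles_ground, tiles_aerial, cv2.RANSAC, 3.0)
    return H


def split_true_false_matches(H: np.ndarray,
                             pts_ground: np.ndarray,
                             pts_aerial: np.ndarray,
                             reproj_thresh: float = 5.0) -> np.ndarray:
    """Return a boolean mask over putative matches: True = kept as correct."""
    projected = cv2.perspectiveTransform(
        pts_ground.reshape(-1, 1, 2).astype(np.float32), H).reshape(-1, 2)
    errors = np.linalg.norm(projected - pts_aerial.astype(np.float32), axis=1)
    return errors < reproj_thresh
```

Under these assumptions, `split_true_false_matches(H, putative_ground, putative_aerial)` would return the subset of the putative match set consistent with the estimated model; the actual LPMG pipeline additionally extracts the lattice and the point-matching seeds described in the abstract.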

Bibliographic Details
Main Authors: Xianwei Zheng (ORCID: 0000-0001-9783-3030), Hongjie Li, Hanjiang Xiong, Xiao Xie
Author Affiliations: State Key Lab. LIESMARS, Wuhan University, Wuhan, China (Zheng, Li, Xiong); School of Geodesy and Geomatics, Wuhan University, Wuhan, China (Xie)
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 14, pp. 4737-4752
ISSN: 2151-1535
DOI: 10.1109/JSTARS.2021.3069222
Subjects: Aerial oblique imagery; feature matching; ground imagery; ground-to-aerial image matching; repetitive pattern
Collection: DOAJ (Directory of Open Access Journals)
Online Access: https://ieeexplore.ieee.org/document/9387538/