BRUTE FORCE MATCHING BETWEEN CAMERA SHOTS AND SYNTHETIC IMAGES FROM POINT CLOUDS

3D point clouds acquired by state-of-the-art terrestrial laser scanning (TLS) provide spatial information with accuracies of up to several millimetres. Unfortunately, common TLS data carries no spectral information about the covered scene. However, matching TLS data with images is important for monoplotting purposes and point cloud colouration. Well-established methods solve this issue by matching close-range images with point cloud data, either by mounting optical camera systems on top of laser scanners or by using ground control points.

The approach addressed in this paper aims at matching 2D image data from a freely moving camera with a large 3D point cloud covering the surrounding environment, e.g. a 3D city model. The free movement of the camera is a key advantage for augmented reality applications and real-time measurements. To this end, a so-called real image, captured by a smartphone camera, is matched with a so-called synthetic image, generated by reverse-projecting the 3D point cloud to a synthetic projection centre whose exterior orientation parameters match those of the real image, assuming an ideal, distortion-free camera.
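The synthetic-image generation described above can be sketched as a standard pinhole projection: each 3D point is transformed into the camera frame using the exterior orientation and projected with an ideal, distortion-free intrinsic matrix. The following is a minimal illustration, not the authors' implementation; the intrinsics, orientation, and point values are hypothetical.

```python
import numpy as np

def project_points(points, K, R, t, width, height):
    """Reverse-project 3D world points onto a synthetic image plane.

    points : (N, 3) world coordinates
    K      : (3, 3) intrinsics of an ideal, distortion-free camera
    R, t   : exterior orientation (world -> camera rotation and translation)
    Returns pixel coordinates and depths of points that fall inside the image.
    """
    cam = points @ R.T + t           # transform into the camera frame
    in_front = cam[:, 2] > 0         # cull points behind the projection centre
    cam = cam[in_front]
    uvw = cam @ K.T                  # homogeneous pixel coordinates
    uv = uvw[:, :2] / uvw[:, 2:3]    # perspective division
    inside = ((uv[:, 0] >= 0) & (uv[:, 0] < width) &
              (uv[:, 1] >= 0) & (uv[:, 1] < height))
    return uv[inside], cam[inside, 2]

# Hypothetical intrinsics for a 640x480 smartphone-like camera
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)        # synthetic projection centre aligned with world axes
t = np.zeros(3)
points = np.array([[0.0, 0.0,  5.0],   # on the optical axis -> principal point
                   [1.0, 0.0,  5.0],   # offset to the right
                   [0.0, 0.0, -5.0]])  # behind the camera, culled
uv, depth = project_points(points, K, R, t, 640, 480)
```

In a full pipeline, each projected point would also carry an attribute (e.g. intensity or depth) rasterised into the synthetic image, which is then matched against the real smartphone image.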

Bibliographic Details
Main Authors: R. Boerner, M. Kröhnert
Format: Article
Language: English
Published: Copernicus Publications, 2016-06-01
Series: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Online Access: https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLI-B5/771/2016/isprs-archives-XLI-B5-771-2016.pdf
ISSN: 1682-1750, 2194-9034
DOI: 10.5194/isprs-archives-XLI-B5-771-2016
Volume: XLI-B5, pp. 771-777
Author affiliations: Institute of Photogrammetry and Remote Sensing, Technische Universität Dresden, Germany (both authors)