Visual Information Fusion through Bayesian Inference for Adaptive Probability-Oriented Feature Matching
This work presents a visual information fusion approach for robust probability-oriented feature matching. It builds on omnidirectional imaging and is tested in a visual localization framework for mobile robotics. General visual localization methods have been extensively studied and optimiz...
Main Authors: David Valiente, Luis Payá, Luis M. Jiménez, Jose M. Sebastián, Óscar Reinoso
Format: Article
Language: English
Published: MDPI AG, 2018-06-01
Series: Sensors
Online Access: http://www.mdpi.com/1424-8220/18/7/2041
Similar Items
- Self-Localization of Mobile Robots Using a Single Catadioptric Camera with Line Feature Extraction
  by: Huei-Yung Lin, et al.
  Published: (2021-07-01)
- Improved Omnidirectional Odometry for a View-Based Mapping Approach
  by: David Valiente, et al.
  Published: (2017-02-01)
- Precision Calibration of Omnidirectional Camera Using a Statistical Approach
  by: Vasilii P. Lazarenko, et al.
  Published: (2022-11-01)
- Method for Creation of Spherical Panoramas from Images Obtained by Omnidirectional Optoelectronic Systems
  by: V. P. Lazarenko, et al.
  Published: (2016-01-01)
- A 3D Estimation Method Using an Omnidirectional Camera and a Spherical Mirror
  by: Yuya Hiruta, et al.
  Published: (2023-07-01)