Multi-Path Minimum Spanning Tree and Superpixel Based Cost Aggregation for Stereo Matching

Bibliographic Details
Main Author: Longhao Sun
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10298207/
Description
Summary: Cost aggregation is a key step in stereo matching algorithms. Despite more than a decade of development, most algorithms still encounter challenges such as high error rates in low-texture regions and blurred edges. To improve matching accuracy, we propose a novel cost aggregation method based on a multi-path minimum spanning tree (mPMST) and superpixels. The mPMST offers more candidate paths for cost aggregation than the original MST by treating the reference image as an eight-connected graph. To improve both accuracy and computational efficiency, we run the mPMST at two levels, inside each superpixel and across superpixels, which yields high accuracy in high-texture and low-texture regions respectively. To fuse the two levels of aggregated costs effectively, we propose a novel adaptive weight based on the image entropy of each superpixel, which distinguishes between high- and low-texture regions and quantifies texture complexity. Additionally, a novel disparity map refinement method built on the proposed cost aggregation structure improves the quality of the disparity maps. In experiments on the Middlebury and KITTI benchmarks, our method achieves average error rates of 5.94% on Middlebury 2006 and 24.51% on KITTI 2015, improving accuracy over other state-of-the-art approaches.
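The entropy-based adaptive weight described in the summary can be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: the histogram bin count, the normalization by maximum entropy, and all function names are assumptions made for the example.

```python
import numpy as np

def superpixel_entropy(gray, labels, sp_id, bins=32):
    """Shannon entropy of the intensity histogram inside one superpixel.

    gray   : 2-D uint8 grayscale image
    labels : 2-D int array of superpixel labels (same shape as gray)
    sp_id  : label of the superpixel to analyze
    """
    pixels = gray[labels == sp_id]
    hist, _ = np.histogram(pixels, bins=bins, range=(0, 256))
    p = hist / max(hist.sum(), 1)      # normalize to a probability distribution
    p = p[p > 0]                       # ignore empty bins (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())

def fusion_weight(entropy, bins=32):
    """Map entropy to [0, 1]: higher texture complexity -> larger weight,
    i.e. more trust in the inside-superpixel aggregated cost (assumed mapping)."""
    return entropy / np.log2(bins)     # log2(bins) is the maximum possible entropy

# Synthetic example: a flat (low-texture) and a noisy (high-texture) superpixel.
rng = np.random.default_rng(0)
gray = np.zeros((8, 8), dtype=np.uint8)
labels = np.zeros((8, 8), dtype=np.int32)
labels[:, 4:] = 1
gray[:, 4:] = rng.integers(0, 256, size=(8, 4), dtype=np.uint8)

e_flat = superpixel_entropy(gray, labels, 0)   # flat region -> entropy 0
e_tex = superpixel_entropy(gray, labels, 1)    # textured region -> entropy > 0
w_flat, w_tex = fusion_weight(e_flat), fusion_weight(e_tex)
```

With such a weight, the final cost for a pixel in superpixel s could be blended as `w(s) * C_inside + (1 - w(s)) * C_superpixel`, so low-entropy (low-texture) superpixels lean on the superpixel-level aggregation while high-entropy ones lean on the inside-superpixel aggregation.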
ISSN: 2169-3536