Depth Estimation From a Light Field Image Pair With a Generative Model

In this paper, we propose a novel method to estimate disparity maps from a light field image pair captured by a pair of light field cameras. Our method integrates two types of critical depth cues, separately inferred from epipolar plane images and from binocular stereo vision, into a global solution. To produce highly accurate disparity maps, we also adopt a generative model that can estimate a light field image from only the central sub-aperture view and a corresponding hypothesized disparity map. The objective function of our method minimizes two energy terms. The first is the difference between the two types of previously extracted disparity maps and the target disparity maps, optimized directly in the gray-scale disparity space. The second is the difference between the estimated light field images and the input light field images, optimized in the RGB color space. Comprehensive experiments conducted on real and virtual scene light field image pairs demonstrate the effectiveness of our method.

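The objective function described in the abstract can be read as a two-term energy over the hypothesized disparity map. The following is a minimal sketch of that reading in LaTeX notation; the weights \lambda_{\mathrm{epi}}, \lambda_{\mathrm{st}}, \lambda_{c} and the choice of penalty norms are illustrative assumptions, not the authors' exact formulation (see the linked article for details):

\[
E(D) = \sum_{p}\Big(\lambda_{\mathrm{epi}}\,\big|D(p)-D_{\mathrm{epi}}(p)\big| + \lambda_{\mathrm{st}}\,\big|D(p)-D_{\mathrm{st}}(p)\big|\Big) + \lambda_{c}\sum_{s}\big\|\hat{L}_{s}(I_{c},D)-L_{s}\big\|^{2}
\]

Here D is the target disparity map; D_{\mathrm{epi}} and D_{\mathrm{st}} are the depth cues extracted from the epipolar plane images and from binocular stereo matching, compared with D in the gray-scale disparity space; I_{c} is the central sub-aperture view; \hat{L}_{s}(I_{c},D) is sub-aperture view s of the light field synthesized by the generative model from I_{c} and the hypothesized disparity D; and L_{s} is the corresponding input view, compared in the RGB color space.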

Bibliographic Details
Main Authors: Tao Yan, Fan Zhang, Yiming Mao, Hongbin Yu, Xiaohua Qian, Rynson W. H. Lau
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access
Subjects: Light field; depth estimation; disparity map; epipolar plane image; stereo matching; generative model
Online Access: https://ieeexplore.ieee.org/document/8620996/
author Tao Yan
Fan Zhang
Yiming Mao
Hongbin Yu
Xiaohua Qian
Rynson W. H. Lau
collection DOAJ
description In this paper, we propose a novel method to estimate disparity maps from a light field image pair captured by a pair of light field cameras. Our method integrates two types of critical depth cues, separately inferred from epipolar plane images and from binocular stereo vision, into a global solution. To produce highly accurate disparity maps, we also adopt a generative model that can estimate a light field image from only the central sub-aperture view and a corresponding hypothesized disparity map. The objective function of our method minimizes two energy terms. The first is the difference between the two types of previously extracted disparity maps and the target disparity maps, optimized directly in the gray-scale disparity space. The second is the difference between the estimated light field images and the input light field images, optimized in the RGB color space. Comprehensive experiments conducted on real and virtual scene light field image pairs demonstrate the effectiveness of our method.
format Article
id doaj.art-b118fa90e96146d581463e6a0c387674
institution Directory of Open Access Journals
issn 2169-3536
language English
publishDate 2019-01-01
publisher IEEE
series IEEE Access
spelling Tao Yan, Fan Zhang, Yiming Mao, Hongbin Yu, Xiaohua Qian, and Rynson W. H. Lau, "Depth Estimation From a Light Field Image Pair With a Generative Model," IEEE Access, vol. 7, pp. 12768-12778, 2019. DOI: 10.1109/ACCESS.2019.2893354 (IEEE document 8620996). Online access: https://ieeexplore.ieee.org/document/8620996/
Author ORCID: Tao Yan, https://orcid.org/0000-0002-9162-8551
Affiliations: Tao Yan, Fan Zhang, Yiming Mao, and Hongbin Yu: Jiangsu Key Laboratory of Media Design and Software Technology, School of Digital Media, Jiangnan University, Jiangsu, China; Xiaohua Qian: Institute for Medical Imaging Technology, School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China; Rynson W. H. Lau: Department of Computer Science, City University of Hong Kong, Hong Kong
title Depth Estimation From a Light Field Image Pair With a Generative Model
topic Light field
depth estimation
disparity map
epipolar plane image
stereo matching
generative model
url https://ieeexplore.ieee.org/document/8620996/