PoRF: Pose residual field for accurate neural surface reconstruction

Neural surface reconstruction is sensitive to camera pose noise, even when state-of-the-art pose estimators such as COLMAP or ARKit are used. More importantly, existing joint pose-NeRF optimisation methods have struggled to improve pose accuracy in challenging real-world scenarios. To overcome these challenges, we introduce the pose residual field (PoRF), a novel implicit representation that uses an MLP to regress pose updates. This is more robust than conventional per-frame pose parameter optimisation because parameter sharing leverages global information over the entire sequence. Furthermore, we propose an epipolar geometry loss that enhances supervision by leveraging correspondences exported from COLMAP results, without extra computational overhead. Our method yields promising results. On the DTU dataset, we reduce the rotation error of COLMAP poses by 78%, lowering the reconstruction Chamfer distance from 3.48 mm to 0.85 mm. On the MobileBrick dataset, which contains casually captured, unbounded 360-degree videos, our method refines ARKit poses and improves the reconstruction F1 score from 69.18 to 75.67, surpassing the result obtained with the dataset-provided ground-truth poses (75.14). These results demonstrate the efficacy of our approach in refining camera poses and improving the accuracy of neural surface reconstruction in real-world scenarios.
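
The central idea described above, a single MLP shared across all frames that regresses a 6-DoF pose correction on top of the initial pose, can be illustrated with a minimal PyTorch sketch. The class name PoseResidualField, the inputs (a normalised frame index plus the flattened initial 3x4 pose), and the layer widths are illustrative assumptions, not the authors' exact architecture.

    import torch
    import torch.nn as nn


    def _skew(k):
        # k: (B, 3) axis vectors -> (B, 3, 3) cross-product matrices
        kx, ky, kz = k.unbind(-1)
        zero = torch.zeros_like(kx)
        return torch.stack([
            torch.stack([zero, -kz,  ky], -1),
            torch.stack([ kz, zero, -kx], -1),
            torch.stack([-ky,  kx, zero], -1),
        ], dim=-2)


    def _axis_angle_to_matrix(v):
        # Rodrigues' formula: (B, 3) axis-angle -> (B, 3, 3) rotation matrix
        theta = v.norm(dim=-1, keepdim=True).clamp(min=1e-8)
        K = _skew(v / theta)
        eye = torch.eye(3, device=v.device).expand(v.shape[0], 3, 3)
        s = torch.sin(theta).unsqueeze(-1)
        c = torch.cos(theta).unsqueeze(-1)
        return eye + s * K + (1.0 - c) * (K @ K)


    class PoseResidualField(nn.Module):
        # One MLP shared by every frame: gradients from all images update the same
        # weights, which is the parameter sharing the abstract credits for robustness.
        def __init__(self, hidden=256):
            super().__init__()
            # Input: 1 normalised frame index + 12 values of the flattened 3x4 pose.
            self.mlp = nn.Sequential(
                nn.Linear(1 + 12, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 6),  # 3 axis-angle + 3 translation residuals
            )
            # Zero-initialise the last layer so optimisation starts at the initial poses.
            nn.init.zeros_(self.mlp[-1].weight)
            nn.init.zeros_(self.mlp[-1].bias)

        def forward(self, frame_idx, init_pose):
            # frame_idx: (B, 1) in [0, 1]; init_pose: (B, 3, 4) from COLMAP/ARKit.
            residual = self.mlp(torch.cat([frame_idx, init_pose.flatten(1)], dim=-1))
            dR = _axis_angle_to_matrix(residual[:, :3])
            dt = residual[:, 3:]
            R = dR @ init_pose[:, :, :3]
            t = (dR @ init_pose[:, :, 3:]).squeeze(-1) + dt
            return torch.cat([R, t.unsqueeze(-1)], dim=-1)  # refined (B, 3, 4) pose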

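The epipolar geometry loss mentioned in the abstract can likewise be sketched as an epipolar error over correspondences exported from COLMAP, evaluated against the currently refined relative pose. The function name, the use of the Sampson distance, and the pose convention (X2 = R X1 + t) are assumptions for illustration; the paper's exact formulation may differ.

    import torch


    def epipolar_loss(x1, x2, K1, K2, R, t):
        # x1, x2: (N, 2) matched pixel coordinates exported from COLMAP.
        # K1, K2: (3, 3) camera intrinsics.
        # R, t: current relative pose between the two refined cameras, X2 = R @ X1 + t.
        zero = torch.zeros((), device=t.device, dtype=t.dtype)
        # Differentiable cross-product matrix [t]_x (built with stack to keep gradients).
        tx = torch.stack([
            torch.stack([zero, -t[2],  t[1]]),
            torch.stack([ t[2], zero, -t[0]]),
            torch.stack([-t[1],  t[0], zero]),
        ])
        # Fundamental matrix from the current pose: F = K2^-T [t]_x R K1^-1.
        F = torch.linalg.inv(K2).T @ tx @ R @ torch.linalg.inv(K1)

        ones = torch.ones(x1.shape[0], 1, device=x1.device, dtype=x1.dtype)
        p1 = torch.cat([x1, ones], dim=-1)   # homogeneous points in image 1
        p2 = torch.cat([x2, ones], dim=-1)   # homogeneous points in image 2

        Fp1 = p1 @ F.T      # epipolar lines in image 2, one per correspondence
        Ftp2 = p2 @ F       # epipolar lines in image 1
        algebraic = (p2 * Fp1).sum(-1)       # p2^T F p1, zero for perfect poses
        denom = Fp1[:, 0]**2 + Fp1[:, 1]**2 + Ftp2[:, 0]**2 + Ftp2[:, 1]**2
        # Sampson epipolar error, averaged over all correspondences.
        return (algebraic**2 / denom.clamp(min=1e-8)).mean()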

Bibliographic Details
Main Authors: Bian, JW, Bian, W, Prisacariu, V, Torr, P
Format: Internet publication
Language: English
Published: 2023
Institution: University of Oxford