Parameterization-driven neural surface reconstruction for object-oriented editing in neural rendering
Main Authors:
Other Authors:
Format: Conference Paper
Language: English
Published: 2024
Subjects:
Online Access: https://hdl.handle.net/10356/180248 http://arxiv.org/abs/2310.05524v3
Summary: The advancements in neural rendering have increased the need for techniques that enable intuitive editing of 3D objects represented as neural implicit surfaces. This paper introduces a novel neural algorithm for parameterizing neural implicit surfaces to simple parametric domains like spheres and polycubes. Our method allows users to specify the number of cubes in the parametric domain, learning a configuration that closely resembles the target 3D object's geometry. It computes a bi-directional deformation between the object and the domain, using a forward mapping from the object's zero level set and an inverse deformation for backward mapping. We ensure a nearly bijective mapping with a cycle loss and optimize deformation smoothness. The parameterization quality, assessed by angle and area distortions, is guaranteed using a Laplacian regularizer and an optimized learned parametric domain. Our framework integrates with existing neural rendering pipelines, using multi-view images of a single object or multiple objects of similar geometries to reconstruct 3D geometry and compute texture maps automatically, eliminating the need for any prior information. We demonstrate the method's effectiveness on images of human heads and man-made objects. (Illustrative sketches of the polycube domain, the cycle loss, and the Laplacian regularizer appear below.)
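As a concrete illustration of the polycube parametric domain described in the summary, below is a minimal sketch of a signed distance function for a union of axis-aligned cubes. It assumes PyTorch; the cube layout is fixed by hand here, whereas the paper learns a configuration that resembles the target geometry, and the function name `polycube_sdf` and tensor layout are hypothetical, not the authors' API.

```python
import torch

def polycube_sdf(pts, centers, half_sizes):
    """Signed distance from query points to a union of axis-aligned cubes.

    pts:        (N, 3) query points
    centers:    (C, 3) cube centers      -- fixed here; learned in the paper
    half_sizes: (C, 3) cube half-extents
    """
    # Per-cube, per-axis offsets from each cube's surface: (N, C, 3)
    d = (pts[:, None, :] - centers[None, :, :]).abs() - half_sizes[None, :, :]
    # Standard box SDF: positive part outside, negative part inside.
    outside = d.clamp(min=0.0).norm(dim=-1)        # (N, C)
    inside = d.max(dim=-1).values.clamp(max=0.0)   # (N, C)
    per_cube = outside + inside
    # Union of cubes = minimum distance over all cubes.
    return per_cube.min(dim=-1).values             # (N,)

# Toy domain with two user-specified cubes:
centers = torch.tensor([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
half_sizes = torch.full((2, 3), 0.5)
sdf_vals = polycube_sdf(torch.randn(1024, 3), centers, half_sizes)
```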
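The bi-directional deformation with a cycle loss can be sketched as follows: a forward map `f` sends surface points to the parametric domain, an inverse map `g` sends domain points back, and both round trips are penalized so the two maps become near-inverses. `f`, `g`, and the point sampling below are placeholders standing in for the paper's deformation networks, not its actual interface.

```python
import torch

def cycle_loss(f, g, surface_pts, domain_pts):
    """Cycle-consistency penalty making f and g near-inverses of each other."""
    # Surface -> domain -> surface: g(f(x)) should reproduce x.
    fwd_cycle = ((g(f(surface_pts)) - surface_pts) ** 2).mean()
    # Domain -> surface -> domain: f(g(u)) should reproduce u.
    bwd_cycle = ((f(g(domain_pts)) - domain_pts) ** 2).mean()
    return fwd_cycle + bwd_cycle

# Toy stand-ins for the two deformation networks:
f = torch.nn.Sequential(torch.nn.Linear(3, 64), torch.nn.ReLU(), torch.nn.Linear(64, 3))
g = torch.nn.Sequential(torch.nn.Linear(3, 64), torch.nn.ReLU(), torch.nn.Linear(64, 3))
x = torch.randn(1024, 3)  # samples from the object's zero level set (random here)
u = torch.randn(1024, 3)  # samples from the parametric domain (random here)
loss = cycle_loss(f, g, x, u)
```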
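A Laplacian regularizer of the kind mentioned for controlling distortion can take many forms; the sketch below assumes a uniform graph Laplacian over k-nearest-neighbor connectivity, pulling each mapped point toward the centroid of its mapped neighbors. The connectivity and uniform weighting are assumptions for illustration; the paper may use a different discretization.

```python
import torch

def laplacian_regularizer(mapped_pts, neighbor_idx):
    """Uniform-graph Laplacian smoothness term.

    mapped_pts:   (N, 3) points after deformation
    neighbor_idx: (N, K) indices of each point's K neighbors
    """
    neighbors = mapped_pts[neighbor_idx]   # (N, K, 3)
    centroids = neighbors.mean(dim=1)      # (N, 3)
    # Penalize deviation of each point from its neighborhood centroid.
    return ((mapped_pts - centroids) ** 2).sum(dim=-1).mean()

# Toy usage with random points and K=8 nearest neighbors:
pts = torch.randn(1024, 3)
dists = torch.cdist(pts, pts)                               # (N, N) pairwise distances
neighbor_idx = dists.topk(9, largest=False).indices[:, 1:]  # drop self, keep K=8
reg = laplacian_regularizer(pts, neighbor_idx)
```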