Towards Efficient Local 3D Conditioning

Bibliographic Details
Main Authors: Zhang, Dingxi, Lukoianov, Artem
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: English
Published: ACM, SIGGRAPH Asia 2023 Posters, 2023
Online Access: https://hdl.handle.net/1721.1/153149
Description
Summary: Recently, Neural Implicit Representations (NIRs) have gained popularity for learning-based 3D shape representation. General representations, i.e. those that share a decoder across a family of geometries, have multiple advantages, such as the ability to generate previously unseen samples and to interpolate smoothly between training examples. These representations, however, impose a trade-off between reconstruction quality and the memory footprint stored per sample. Globally conditioned NIRs fail to capture intricate shape details, while densely conditioned NIRs demand excessive memory. In this work we suggest using a neural network to approximate a grid of latent codes, while sharing the decoder across the entire category. Our model achieves significantly better reconstruction quality than globally conditioned methods, while using less memory per sample to store a single geometry.
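
To make the idea concrete, here is a minimal sketch of the conditioning scheme the summary describes: a small per-shape network stands in for a dense grid of latent codes, and a single decoder shared across the category maps a local code plus a query point to an implicit value. This is not the authors' implementation; all class names, layer sizes, and the choice of a signed-distance output are assumptions for illustration.

# Hypothetical sketch of locally conditioned implicit decoding (PyTorch).
import torch
import torch.nn as nn

class LatentField(nn.Module):
    # Per-shape network approximating a grid of latent codes:
    # maps a 3D query point to a local latent code.
    def __init__(self, latent_dim=32, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, latent_dim),
        )

    def forward(self, xyz):          # xyz: (N, 3)
        return self.net(xyz)         # (N, latent_dim)

class SharedDecoder(nn.Module):
    # Decoder shared across the entire category:
    # maps (local latent code, query point) to a signed distance.
    def __init__(self, latent_dim=32, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, z, xyz):
        return self.net(torch.cat([z, xyz], dim=-1))  # (N, 1)

# Per-sample storage cost is the small LatentField's weights rather than
# a dense latent grid; the SharedDecoder is stored once per category.
latent_field = LatentField()           # one per shape
decoder = SharedDecoder()              # shared across the category
pts = torch.rand(1024, 3) * 2 - 1      # query points in [-1, 1]^3
sdf = decoder(latent_field(pts), pts)  # implicit values at the queries

Under these assumptions, the memory trade-off the summary mentions is visible directly: a globally conditioned NIR would replace LatentField with a single code vector (cheap but coarse), while a densely conditioned NIR would store the latent grid explicitly (detailed but memory-hungry); the per-shape network sits between the two.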