Unsupervised 3D pose transfer with cross consistency and dual reconstruction



Bibliographic Details
Main Authors: Song, Chaoyue, Wei, Jiacheng, Li, Ruibo, Liu, Fayao, Lin, Guosheng
Other Authors: School of Computer Science and Engineering
Format: Journal Article
Language: English
Published: 2023
Subjects: Engineering::Computer science and engineering; 3D Pose Transfer; Conditional Normalization Layer
Online Access: https://hdl.handle.net/10356/172191
Collection: NTU
Institution: Nanyang Technological University
Record ID: ntu-10356/172191
Description: The goal of 3D pose transfer is to transfer the pose of a source mesh to a target mesh while preserving the identity information (e.g., face, body shape) of the target mesh. Deep learning-based methods have improved the efficiency and performance of 3D pose transfer; however, most of them are trained under ground-truth supervision, which is of limited availability in real-world scenarios. In this work, we present X-DualNet, a simple yet effective approach that enables unsupervised 3D pose transfer. In X-DualNet, we introduce a generator G that contains correspondence learning and pose transfer modules to achieve 3D pose transfer. We learn the shape correspondence by solving an optimal transport problem without any key point annotations, and generate high-quality meshes with our elastic instance normalization (ElaIN) in the pose transfer module. With G as the basic component, we propose a cross consistency learning scheme and a dual reconstruction objective to learn pose transfer without supervision. In addition, we adopt an as-rigid-as-possible deformer during training to fine-tune the body shape of the generated results. Extensive experiments on human and animal data demonstrate that our framework achieves performance comparable to state-of-the-art supervised approaches.
Affiliations: School of Computer Science and Engineering; School of Electrical and Electronic Engineering; S-Lab
Subjects: Engineering::Computer science and engineering; 3D Pose Transfer; Conditional Normalization Layer
Citation: Song, C., Wei, J., Li, R., Liu, F. & Lin, G. (2023). Unsupervised 3D pose transfer with cross consistency and dual reconstruction. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(8), 10488-10499. https://dx.doi.org/10.1109/TPAMI.2023.3259059
ISSN: 0162-8828
DOI: 10.1109/TPAMI.2023.3259059
Funding: Agency for Science, Technology and Research (A*STAR); Ministry of Education (MOE); National Research Foundation (NRF). This research was supported in part by the RIE2020 Industry Alignment Fund – Industry Collaboration Projects (IAF-ICP) Funding Initiative, as well as cash and in-kind contributions from the industry partner(s). It was also supported in part by the National Research Foundation, Singapore under its AI Singapore Programme (AISG) under Grant AISG-RP-2018-003; in part by the Ministry of Education, Singapore, under its Academic Research Fund Tier 2 under Grant MOE-T2EP20220-0007 and Tier 1 under Grant RG95/20; and in part by the Agency for Science, Technology and Research (A*STAR), Singapore under its MTC Young Individual Research Grant M21K3c0130.
Rights: © 2023 IEEE. All rights reserved.
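The description states that shape correspondence is learned by solving an optimal transport problem without key-point annotations. As a hedged illustration only (the paper's exact OT formulation, costs, and learned features may differ), a generic entropy-regularized optimal transport solver via Sinkhorn iterations can produce such soft vertex correspondences between two toy point sets:

```python
import numpy as np

def sinkhorn_correspondence(src, tgt, eps=0.05, iters=200):
    """Soft correspondence between two point sets via entropy-regularized OT.

    Returns a transport plan T where T[i, j] is the (soft) probability
    that source vertex i corresponds to target vertex j.
    """
    # Pairwise squared Euclidean costs between source and target vertices.
    cost = ((src[:, None, :] - tgt[None, :, :]) ** 2).sum(-1)
    K = np.exp(-cost / eps)                   # Gibbs kernel
    a = np.full(len(src), 1.0 / len(src))     # uniform source marginal
    b = np.full(len(tgt), 1.0 / len(tgt))     # uniform target marginal
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(iters):                    # Sinkhorn scaling updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]        # transport plan

# Toy example: two small vertex sets, the second a reversed, noisy copy.
rng = np.random.default_rng(0)
src = rng.normal(size=(5, 3))
tgt = src[::-1] + 0.01 * rng.normal(size=(5, 3))
T = sinkhorn_correspondence(src, tgt)
print(T.argmax(axis=1))  # row-wise argmax recovers the reversal
```

In X-DualNet the correspondence is computed on learned vertex features rather than raw coordinates; the sketch above only shows the OT machinery on toy geometry.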
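The two unsupervised training signals named in the title, cross consistency and dual reconstruction, can be illustrated with a deliberately simplified toy model: here a "mesh" is just a 2-vector [identity, pose] and the generator is ideal by construction. This is purely an assumption for illustration; the paper's G is a learned network on full meshes, and this is one plausible reading of the objectives, not the authors' exact formulation.

```python
import numpy as np

def G(pose_source, identity_target):
    # Ideal toy generator: keep the target's identity, take the source's pose.
    return np.array([identity_target[0], pose_source[1]])

def dual_reconstruction_loss(x):
    # Using the same mesh as both pose source and identity target
    # should reproduce it exactly -- a supervision-free signal.
    return np.linalg.norm(G(x, x) - x)

def cross_consistency_loss(x_a, x_b):
    # Transfer b's pose onto a, then transfer a's original pose back:
    # the round trip should recover a, again without ground truth.
    a_with_b_pose = G(x_b, x_a)
    a_recovered = G(x_a, a_with_b_pose)
    return np.linalg.norm(a_recovered - x_a)

x_a = np.array([1.0, 5.0])   # identity 1, pose 5
x_b = np.array([2.0, -3.0])  # identity 2, pose -3
print(dual_reconstruction_loss(x_a))     # 0.0 for the ideal G
print(cross_consistency_loss(x_a, x_b))  # 0.0 for the ideal G
```

Both losses vanish only when the generator cleanly separates identity from pose, which is why they can replace ground-truth supervision during training.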