Animate your avatar: learning conditional human motion prior
This research project focuses on virtual humans and aims to enable natural language control of 3D avatars, allowing them to perform human-like movements that are coherent with their surrounding environment. To achieve this, the project proposes to learn a "conditional"...
Main Author: Singh, Ananya
Other Authors: Liu, Ziwei
Format: Final Year Project (FYP)
Language: English
Published: Nanyang Technological University, 2023
Online Access: https://hdl.handle.net/10356/165971
Similar Items
- Global and local motion priors and their applications
  by: Yuen, Jenny, S.M. Massachusetts Institute of Technology
  Published: (2009)
- Human animation from motion recognition, analysis and optimization
  by: Zhao, Jianhui
  Published: (2008)
- Development of a hybrid teleconference application with personalized avatars
  by: Ng, Zheng Hao
  Published: (2021)
- Human motion tracking using deep learning
  by: Peok, Qing Xiang
  Published: (2020)
- Facial expression retargeting from human to avatar made easy
  by: Zhang, Juyong, et al.
  Published: (2022)