Simulating Real-World Human Activities with VirtualCity: A Large-Scale Embodied Environment for 2D, 3D, and Language-Driven Tasks

Bibliographic Details
Main Author: Ren, Jordan
Other Authors: Torralba, Antonio
Format: Thesis
Published: Massachusetts Institute of Technology, 2023
Online Access: https://hdl.handle.net/1721.1/151411
Description
Summary: Embodied environments serve as tools for learning a wide range of control tasks. Within these simulators, realistic rendering and physics keep the sim2real gap for those tasks small. Current embodied environments focus mainly on small-scale or low-level tasks, do not support learning large-scale, diverse tasks, and often lack the realism needed for a small sim2real gap. To address these shortcomings, we propose VirtualCity, a large-scale embodied environment that enables learning high-level planning tasks with photo-realistic rendering and realistic physics. To interact with VirtualCity, we provide a user-friendly Python API that allows users to modify, control, and observe the environment and the agents within it. Building this realistic environment brings us closer to adapting models trained in simulation to solve real-world tasks.
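
This record does not document the API itself, so the following is a minimal, hypothetical sketch of the kind of modify/control/observe loop such a Python API might expose. Every name here (VirtualCityEnv, Agent, add_agent, step, observe) is an illustrative assumption, not the thesis's actual interface.

    # Hypothetical sketch of a VirtualCity-style Python API.
    # All class and method names are illustrative assumptions.
    from dataclasses import dataclass, field


    @dataclass
    class Agent:
        """A controllable agent with a 2D position in the city."""
        name: str
        position: tuple[float, float] = (0.0, 0.0)


    @dataclass
    class VirtualCityEnv:
        """Stub environment supporting modification, control, and observation."""
        agents: dict[str, Agent] = field(default_factory=dict)

        def add_agent(self, name: str) -> Agent:
            """Modification: spawn a new agent into the environment."""
            agent = Agent(name)
            self.agents[name] = agent
            return agent

        def step(self, name: str, dx: float, dy: float) -> None:
            """Control: command an agent to move by (dx, dy)."""
            x, y = self.agents[name].position
            self.agents[name].position = (x + dx, y + dy)

        def observe(self, name: str) -> tuple[float, float]:
            """Observation: read back an agent's current position."""
            return self.agents[name].position


    env = VirtualCityEnv()
    env.add_agent("pedestrian_0")
    env.step("pedestrian_0", dx=1.0, dy=0.5)
    print(env.observe("pedestrian_0"))  # -> (1.0, 0.5)

In a real large-scale simulator, step() and observe() would be backed by the rendering and physics engine rather than plain Python state; the sketch only illustrates the modify/control/observe division of labor described in the summary.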