Virtual Scene Construction of Wetlands: A Case Study of Poyang Lake, China

Bibliographic Details
Main Authors: Sheng Lu, Chaoyang Fang, Xin Xiao
Format: Article
Language: English
Published: MDPI AG, 2023-02-01
Series: ISPRS International Journal of Geo-Information
Online Access: https://www.mdpi.com/2220-9964/12/2/49
Description
Summary: Wetland ecosystems are complex, with wide areas of alternating land and water zones and diverse vegetation composition, which makes it challenging to achieve dynamic displays of virtual wetland scenes through three-dimensional modeling. This study proposes a game engine-based workflow for the rapid construction of virtual wetland scenes. Using Poyang Lake as the primary research area, the work integrated unmanned aerial vehicle data collection and geographic information technology with 3D (three-dimensional) modeling of wetland elements and scene program modeling in the game engine to complete the construction and dynamic development of virtual wetland scenes. It also applied various virtual reality technologies to display the resulting scene. Built from measured data, the virtual scene of Poyang Lake is more realistic and achieves a higher degree of simulation fidelity. The digital wetland scene of Poyang Lake supports multiple forms of virtual experience and provides users with a deeply immersive experience. The comprehensive virtual scene workflow presented in the study can serve as a technical resource for building 3D scenes and as a technical reference for the digital twin watershed project of Poyang Lake, giving it practical application value.
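As an illustrative aside that is not part of the catalogued article, the "scene program modeling" step described in the summary is commonly realized by feeding GIS survey data into a game engine's scripting layer for procedural placement of wetland vegetation. The sketch below is a minimal example under assumed conditions: a hypothetical GeoJSON vegetation survey file (vegetation_survey.geojson), an assumed local scene origin near Poyang Lake, and a plain CSV placement table as the hand-off format. The actual engine, data formats, and pipeline used by the authors are not specified here.

```python
# Illustrative sketch (not from the paper): convert GIS vegetation survey
# points into a flat placement table that a game engine script could read
# to instance plant meshes procedurally. File names, the scene origin, and
# the coordinate conversion are assumptions for demonstration only.
import csv
import json
import math
import random

ORIGIN_LON, ORIGIN_LAT = 116.0, 29.0   # assumed scene origin near Poyang Lake
METERS_PER_DEG_LAT = 111_320.0         # rough spherical approximation


def lonlat_to_local(lon: float, lat: float) -> tuple[float, float]:
    """Project lon/lat to local metric x/y around the assumed scene origin."""
    x = (lon - ORIGIN_LON) * METERS_PER_DEG_LAT * math.cos(math.radians(ORIGIN_LAT))
    y = (lat - ORIGIN_LAT) * METERS_PER_DEG_LAT
    return x, y


def build_placements(geojson_path: str, out_csv: str) -> None:
    """Read survey points and write a per-instance placement table."""
    with open(geojson_path, encoding="utf-8") as f:
        features = json.load(f)["features"]
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["species", "x_m", "y_m", "yaw_deg", "scale"])
        for feat in features:
            lon, lat = feat["geometry"]["coordinates"][:2]
            x, y = lonlat_to_local(lon, lat)
            species = feat["properties"].get("species", "reed")
            # Randomized rotation and scale add visual variety when instancing.
            writer.writerow([species, round(x, 2), round(y, 2),
                             round(random.uniform(0, 360), 1),
                             round(random.uniform(0.8, 1.2), 2)])


if __name__ == "__main__":
    build_placements("vegetation_survey.geojson", "vegetation_placements.csv")
```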
ISSN:2220-9964