A computational sandbox with human automata for exploring perceived egress safety in urban damage scenarios


Bibliographic Details
Main Author: Paul M. Torrens
Format: Article
Language: English
Published: Taylor & Francis Group 2018-04-01
Series:International Journal of Digital Earth
Online Access: http://dx.doi.org/10.1080/17538947.2017.1320594
Description
Summary: In earthquakes and building collapse situations, volumes of people may need to move, suddenly, through spaces that have been destroyed and seem unfamiliar in configuration and appearance. Perception is significant in these cases, determining individual movement, collective egress, and phenomena in between. Alas, exploring how perception shapes critical egress is tricky because perception is both physical and cerebral in genesis and because critical scenarios are often hazardous. In this paper, we describe a computational sandbox for studying urban damage scenarios. The model is built as automata, specialized as human automata and rigid body automata, with interactivity provided by slipstreaming. Our sandbox supports parameterization of synthetic built settings and synthetic humans in fine detail for large interactive collections, allowing flexible analyses of damage scenarios and their determining processes, from micro-perspectives through to the macrocosm of the phenomena that might result. While we have much work to do to improve the model relative to real-world fidelity, our work thus far has produced some meaningful results, supporting practical questions of how urban design and parking scenarios shape egress, and pointing to potential phenomena of perceptual shadowing as a translation mechanism for processes at the built-human interface.
ISSN: 1753-8947
1753-8955
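The abstract above describes human automata moving through damaged space under perception limits. The record gives no implementation detail, so the following is a purely hypothetical sketch of that general idea, not the paper's model: a grid automaton that avoids debris only once the debris falls inside its perception radius, and otherwise moves greedily toward an exit. All names here (`GRID`, `EXIT`, `PERCEPTION_RADIUS`, `simulate`) are invented for illustration.

```python
# Illustrative sketch only -- NOT the paper's implementation. The grid,
# movement rule, and perception radius are assumptions made for this example.

GRID = [
    "..........",
    "..####....",
    "..........",
    "....##....",
    "..........",
]  # '#' = rigid-body debris, '.' = walkable floor
EXIT = (0, 9)          # (row, col) of the egress point
PERCEPTION_RADIUS = 2  # Manhattan distance at which an automaton "sees" debris

def in_bounds(p):
    return 0 <= p[0] < len(GRID) and 0 <= p[1] < len(GRID[0])

def perceived_blocked(agent, cell):
    """Debris constrains the agent only once it is within perception range.
    With radius >= 1, adjacent debris is always perceived, so agents never
    step into it."""
    if GRID[cell[0]][cell[1]] != "#":
        return False
    return abs(agent[0] - cell[0]) + abs(agent[1] - cell[1]) <= PERCEPTION_RADIUS

def step(agent):
    """Greedy move: of the neighbours not perceived as blocked, take the one
    closest (Manhattan distance) to the exit; stay put if none qualify."""
    r, c = agent
    moves = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    open_cells = [p for p in moves
                  if in_bounds(p) and not perceived_blocked(agent, p)]
    if not open_cells:
        return agent
    return min(open_cells,
               key=lambda p: abs(p[0] - EXIT[0]) + abs(p[1] - EXIT[1]))

def simulate(start, max_steps=100):
    """Run one automaton until it reaches the exit or exhausts its steps."""
    pos, t = start, 0
    while pos != EXIT and t < max_steps:
        pos, t = step(pos), t + 1
    return pos, t
```

For example, `simulate((2, 4))` starts an agent hemmed between the two debris rows, and the greedy rule routes it sideways and around the rubble to the exit. A real model of the kind the abstract describes would add continuous space, many interacting agents, and physically simulated debris; this sketch only shows the perception-gated movement idea in miniature.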