Umwelt: Accessible Structured Editing of Multi-Modal Data Representations

CHI '24: Proceedings of the CHI Conference on Human Factors in Computing Systems May 11–16, 2024, Honolulu, HI, USA

Bibliographic Details
Main Authors: Zong, Jonathan; Pedraza Pineros, Isabella; Chen, Mengzhu (Katie); Hajas, Daniel; Satyanarayan, Arvind
Other Authors: MIT Morningside Academy for Design
Format: Article
Language: English
Published: ACM, 2024
Online Access: https://hdl.handle.net/1721.1/155166

Abstract: We present Umwelt, an authoring environment for interactive multimodal data representations. In contrast to prior approaches, which center the visual modality, Umwelt treats visualization, sonification, and textual description as coequal representations: they are all derived from a shared abstract data model, such that no modality is prioritized over the others. To simplify specification, Umwelt evaluates a set of heuristics to generate default multimodal representations that express a dataset’s functional relationships. To support smoothly moving between representations, Umwelt maintains a shared query predicate that is reified across all modalities; for instance, navigating the textual description also highlights the visualization and filters the sonification. In a study with 5 blind / low-vision expert users, we found that Umwelt’s multimodal representations afforded complementary overview and detailed perspectives on a dataset, allowing participants to fluidly shift between task- and representation-oriented ways of thinking.

Author Affiliations: MIT Morningside Academy for Design; Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science
Type: Conference Paper (http://purl.org/eprint/type/ConferencePaper)
Date Issued: 2024-05-11
DOI: 10.1145/3613904.3641996
ISBN: 979-8-4007-0330-0
License: Creative Commons Attribution 4.0 (https://creativecommons.org/licenses/by/4.0/); copyright held by the author(s)
Citation: Zong, Jonathan, Pedraza Pineros, Isabella, Chen, Mengzhu (Katie), Hajas, Daniel, and Satyanarayan, Arvind. 2024. "Umwelt: Accessible Structured Editing of Multi-Modal Data Representations." In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '24). ACM.
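
The abstract’s notion of a shared query predicate can be pictured with a small sketch. The TypeScript below is a hypothetical illustration, not Umwelt’s actual API; every type and function name is an assumption. It shows only how one predicate over a shared abstract data model might be reified differently in each modality: highlighting a visualization, filtering a sonification, and wording a textual description.

// Hypothetical sketch (not Umwelt's API): a single query predicate over the
// shared data model, rendered differently by each modality.

type Datum = Record<string, string | number>;

// A selection over the shared abstract data model (illustrative shape).
interface QueryPredicate {
  field: string;
  range: [number, number];
}

const matches = (p: QueryPredicate, d: Datum): boolean => {
  const v = d[p.field];
  return typeof v === "number" && v >= p.range[0] && v <= p.range[1];
};

// The visualization highlights matching points and dims the rest.
function highlightVisualization(data: Datum[], p: QueryPredicate): string[] {
  return data.map((d) => (matches(p, d) ? "highlighted" : "dimmed"));
}

// The sonification plays only the matching points.
function filterSonification(data: Datum[], p: QueryPredicate): Datum[] {
  return data.filter((d) => matches(p, d));
}

// The textual description summarizes the same selection in words.
function describeSelection(data: Datum[], p: QueryPredicate): string {
  const n = filterSonification(data, p).length;
  return `${n} of ${data.length} records where ${p.field} is between ${p.range[0]} and ${p.range[1]}`;
}

// Example: updating the predicate in any modality re-renders all of them
// from the same shared state.
const data: Datum[] = [
  { year: 2020, value: 3 },
  { year: 2021, value: 7 },
  { year: 2022, value: 5 },
];
const predicate: QueryPredicate = { field: "value", range: [4, 10] };

console.log(highlightVisualization(data, predicate)); // ["dimmed", "highlighted", "highlighted"]
console.log(filterSonification(data, predicate).length); // 2
console.log(describeSelection(data, predicate)); // "2 of 3 records where value is between 4 and 10"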