Generating 3D architectured nature-inspired materials and granular media using diffusion models based on language cues

Abstract: A variety of image generation methods have emerged in recent years, notably DALL-E 2, Imagen and Stable Diffusion. While they have been shown to be capable of producing photorealistic images from text prompts facilitated by generative diffusion models conditioned on language input, their capacity for materials design has not yet been explored. Here, we use a trained Stable Diffusion model and consider it as an experimental system, examining its capacity to generate novel material designs, especially in the context of 3D material architectures. We demonstrate that this approach offers a paradigm to generate diverse material patterns and designs, using human-readable language as input, allowing us to explore a vast nature-inspired design portfolio for both novel architectured materials and granular media. We present a series of methods to translate 2D representations into 3D data, including movements through noise spaces via mixtures of text prompts, and image conditioning. We create physical samples using additive manufacturing and assess the properties of the designed materials via a coarse-grained particle simulation approach. We present case studies using images as starting points for material generation, exemplified in two applications. First, a design for which we use Haeckel's classic lithographic print of a diatom, which we amalgamate with a spider web. Second, a design based on the image of a flame, amalgamated with a hybrid of a spider web and wood structures. These design approaches result in complex materials forming solids or granular liquid-like media that can ultimately be tuned to meet target demands.
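The abstract mentions "movements through noise spaces via mixtures of text prompts." The paper does not publish its exact routine in this record, but a common way to blend two prompts' conditioning embeddings (or initial noise latents) is spherical linear interpolation (slerp), which preserves vector magnitude better than a straight linear mix. The sketch below is an illustrative assumption in NumPy, not the author's published implementation:

```python
import numpy as np

def slerp(v0, v1, t):
    """Spherically interpolate between two flattened latent/embedding
    vectors; t=0 returns v0, t=1 returns v1."""
    v0 = np.asarray(v0, dtype=np.float64)
    v1 = np.asarray(v1, dtype=np.float64)
    # Angle between the two directions (normalized before the dot product)
    dot = np.dot(v0 / np.linalg.norm(v0), v1 / np.linalg.norm(v1))
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    if np.isclose(theta, 0.0):
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1
```

In a text-to-image pipeline, sweeping t from 0 to 1 between the latents of two prompts (e.g. "spider web" and "wood microstructure") yields a family of hybrid designs along the interpolation path.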


Bibliographic Details
Main Author: Buehler, Markus J
Other Authors: Massachusetts Institute of Technology. Department of Civil and Environmental Engineering
Format: Article
Language: English
Published: Oxford University Press (OUP) 2023
Online Access: https://hdl.handle.net/1721.1/148576
Published in: Oxford Open Materials Science, 2 (1), 2022
DOI: 10.1093/oxfmat/itac010
Citation: Buehler, Markus J. 2022. "Generating 3D architectured nature-inspired materials and granular media using diffusion models based on language cues." Oxford Open Materials Science, 2 (1).
License: Creative Commons Attribution 4.0 International (https://creativecommons.org/licenses/by/4.0/)