Multi-dimensional computational imaging from diffraction intensity using deep neural networks


Bibliographic Details
Main Author: Kang, Iksung
Other Authors: Barbastathis, George
Format: Thesis
Published: Massachusetts Institute of Technology 2022
Online Access: https://hdl.handle.net/1721.1/144925
https://orcid.org/0000-0002-4009-6743
_version_ 1811078219015651328
author Kang, Iksung
author2 Barbastathis, George
collection MIT
description Diffraction of light can be found everywhere in nature, from sunlight rays fanning out from clouds to the multiple colors reflected from the surface of a CD. Diffraction describes any change in the path of light caused by an obstacle, and it is of particular significance because, properly exploited, it allows us to see transparent (or pure-phase) objects, e.g., biological cells under visible-wavelength light or integrated circuits under X-rays. However, cameras measure only the intensity of the diffracted light, so camera measurements are incomplete: the phase information is lost. This thesis therefore addresses the reconstruction of multi-dimensional phase information from diffraction intensities via regularized inversion using deep neural networks, in both two- and three-dimensional applications. The inversion process begins with the definition of a forward physical model that relates a diffraction intensity to a phase object and then, where applicable, injects this physics into the deep neural networks as a prior (a physics-informing step). In this thesis, two-dimensional wavefront aberrations are retrieved for high-contrast imaging of exoplanets using a deep residual neural network, and transparent planar objects behind dynamic scattering media are revealed by a recurrent neural network, both trained end to end. Next, a multi-layered, three-dimensional glass phantom of integrated circuits is reconstructed in the limited-angle phase computed tomography geometry with visible-wavelength laser illumination using a dynamical machine learning framework. Finally, deep neural network regularization is deployed for the reconstruction of real integrated circuits from far-field diffraction intensities in the ptychographic X-ray computed tomography geometry with partially coherent synchrotron X-ray illumination.
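The abstract's central point — that a camera records only the intensity of a diffracted field, discarding the phase — can be illustrated with a minimal sketch. This is an illustrative Fraunhofer (far-field, FFT-based) forward model chosen for brevity, not necessarily the forward model used in the thesis; the grid size and the `diffraction_intensity` helper are assumptions for the example.

```python
import numpy as np

def diffraction_intensity(phase):
    """Far-field (Fraunhofer) intensity produced by a unit-amplitude,
    pure-phase object: the camera records |FFT of the field|^2 only."""
    field = np.exp(1j * phase)                      # pure-phase object, |field| == 1
    far_field = np.fft.fftshift(np.fft.fft2(field))  # far-field diffraction pattern
    return np.abs(far_field) ** 2                   # intensity: phase is discarded

# A random phase object on a 64x64 grid (illustrative only).
phase = np.random.default_rng(0).uniform(0.0, 2 * np.pi, (64, 64))
I = diffraction_intensity(phase)

# The measurement is insensitive to a global phase offset: two distinct
# phase objects yield the same intensity, which is why inverting this model
# is ill-posed and needs regularization (e.g., a deep-neural-network prior).
I_shifted = diffraction_intensity(phase + 1.2)
print(np.allclose(I, I_shifted))  # True
```

The `np.allclose` check makes the information loss concrete: recovering `phase` from `I` alone is underdetermined, which motivates the regularized inversion the thesis develops.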
first_indexed 2024-09-23T10:55:58Z
format Thesis
id mit-1721.1/144925
institution Massachusetts Institute of Technology
last_indexed 2024-09-23T10:55:58Z
publishDate 2022
publisher Massachusetts Institute of Technology
record_format dspace
spelling mit-1721.1/144925 2022-08-30T03:45:44Z Multi-dimensional computational imaging from diffraction intensity using deep neural networks Kang, Iksung Barbastathis, George Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science Ph.D. 2022-08-29T16:21:21Z 2022-08-29T16:21:21Z 2022-05 2022-06-21T19:15:59.764Z Thesis https://hdl.handle.net/1721.1/144925 https://orcid.org/0000-0002-4009-6743 In Copyright - Educational Use Permitted Copyright MIT http://rightsstatements.org/page/InC-EDU/1.0/ application/pdf Massachusetts Institute of Technology