Information-Theoretic Algorithms and Identifiability for Causal Graph Discovery
It is a task of widespread interest to learn the underlying causal structure for systems of random variables. Entropic Causal Inference is a recent framework for learning the causal graph between two variables from observational data (i.e., without experiments) by finding the information-theoretically simplest structural explanation of the data.
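As a rough illustration of the idea described above, the snippet below sketches how a greedy approximation to minimum-entropy coupling can be used to score both orientations of a discrete joint distribution and prefer the one with the information-theoretically simpler exogenous variable. This is a minimal sketch under common assumptions about the framework, not the thesis's own algorithm or its improved approximation guarantee; the function names and the example joint distribution are hypothetical.

```python
from math import log2

def greedy_coupling_entropy(conditionals, tol=1e-12):
    # Greedy approximation to the minimum-entropy coupling of several
    # distributions (one per row): repeatedly take r = the smallest of the
    # rows' largest remaining masses, subtract r from each row's largest
    # entry, and record r as one probability mass of the exogenous variable.
    rows = [list(p) for p in conditionals]
    masses = []
    while True:
        r = min(max(p) for p in rows)
        if r <= tol:
            break
        for p in rows:
            p[p.index(max(p))] -= r
        masses.append(r)
    return -sum(m * log2(m) for m in masses)

def entropy(p):
    return -sum(q * log2(q) for q in p if q > 0)

def entropic_direction(joint):
    # Score both orientations of a discrete joint pmf (rows indexed by X,
    # columns by Y). Each direction is charged the entropy of the cause plus
    # the (approximate) minimum entropy of an exogenous variable coupling
    # the conditionals of the effect; the lower score is the
    # information-theoretically simpler explanation.
    px = [sum(row) for row in joint]
    py = [sum(row[j] for row in joint) for j in range(len(joint[0]))]
    cond_y_x = [[v / px[i] for v in row] for i, row in enumerate(joint)]
    cond_x_y = [[joint[i][j] / py[j] for i in range(len(joint))]
                for j in range(len(py))]
    score_xy = entropy(px) + greedy_coupling_entropy(cond_y_x)
    score_yx = entropy(py) + greedy_coupling_entropy(cond_x_y)
    return "X->Y" if score_xy < score_yx else "Y->X"
```

For example, a joint built from a low-entropy mechanism X → Y (such as `[[0.54, 0.06, 0.0], [0.0, 0.04, 0.36]]`) yields a smaller score in the causal direction than in the reverse one.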
Main Author: | Compton, Spencer |
---|---|
Other Authors: | Uhler, Caroline |
Format: | Thesis |
Published: | Massachusetts Institute of Technology, 2022 |
Online Access: | https://hdl.handle.net/1721.1/145148 |
_version_ | 1811078555749056512 |
---|---|
author | Compton, Spencer |
author2 | Uhler, Caroline |
author_facet | Uhler, Caroline Compton, Spencer |
author_sort | Compton, Spencer |
collection | MIT |
description | It is a task of widespread interest to learn the underlying causal structure for systems of random variables. Entropic Causal Inference is a recent framework for learning the causal graph between two variables from observational data (i.e., without experiments) by finding the information-theoretically simplest structural explanation of the data. In this thesis, we develop theoretical techniques that enable us to show how Entropic Causal Inference permits learnability of causal graphs with particular information-theoretically simple structure. We show the first theoretical guarantee for finite-sample learnability with Entropic Causal Inference for pairs of random variables. Later, we extend this guarantee to show the first result for Entropic Causal Inference in systems with more than two variables: proving learnability of general directed acyclic graphs over many variables (under assumptions on the generative process). We implement and experimentally evaluate Entropic Causal Inference on synthetic and real-world causal systems. Moreover, we improve the best-known approximation guarantee for the Minimum Entropy Coupling problem. This information-theoretic algorithmic problem has direct relevance to Entropic Causal Inference and is also of independent interest. In totality, this thesis develops algorithmic and information-theoretic tools that shed light on how information-theoretic properties enable learning of causal graphs from both a practical and theoretical perspective. |
first_indexed | 2024-09-23T11:02:03Z |
format | Thesis |
id | mit-1721.1/145148 |
institution | Massachusetts Institute of Technology |
last_indexed | 2024-09-23T11:02:03Z |
publishDate | 2022 |
publisher | Massachusetts Institute of Technology |
record_format | dspace |
spelling | mit-1721.1/145148 2022-08-30T03:44:55Z Information-Theoretic Algorithms and Identifiability for Causal Graph Discovery Compton, Spencer Uhler, Caroline Greenewald, Kristjan Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science M.Eng.
2022-08-29T16:36:27Z 2022-08-29T16:36:27Z 2022-05 2022-05-27T16:19:26.278Z Thesis https://hdl.handle.net/1721.1/145148 In Copyright - Educational Use Permitted Copyright MIT http://rightsstatements.org/page/InC-EDU/1.0/ application/pdf Massachusetts Institute of Technology |
spellingShingle | Compton, Spencer Information-Theoretic Algorithms and Identifiability for Causal Graph Discovery |
title | Information-Theoretic Algorithms and Identifiability for Causal Graph Discovery |
title_full | Information-Theoretic Algorithms and Identifiability for Causal Graph Discovery |
title_fullStr | Information-Theoretic Algorithms and Identifiability for Causal Graph Discovery |
title_full_unstemmed | Information-Theoretic Algorithms and Identifiability for Causal Graph Discovery |
title_short | Information-Theoretic Algorithms and Identifiability for Causal Graph Discovery |
title_sort | information theoretic algorithms and identifiability for causal graph discovery |
url | https://hdl.handle.net/1721.1/145148 |
work_keys_str_mv | AT comptonspencer informationtheoreticalgorithmsandidentifiabilityforcausalgraphdiscovery |