Quantifying Nonlocal Informativeness in High-Dimensional, Loopy Gaussian Graphical Models


Bibliographic Details
Main Authors: Levine, Daniel, How, Jonathan P.
Other Authors: Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Format: Article
Language: en_US
Published: Association for Uncertainty in Artificial Intelligence, 2015
Online Access: http://hdl.handle.net/1721.1/96957
https://orcid.org/0000-0001-8576-1930
Description
Summary: We consider the problem of selecting informative observations in Gaussian graphical models containing both cycles and nuisances. More specifically, we consider the subproblem of quantifying conditional mutual information measures that are nonlocal on such graphs. The ability to efficiently quantify the information content of observations is crucial for resource-constrained data acquisition (adaptive sampling) and data processing (active learning) systems. While closed-form expressions for Gaussian mutual information exist, standard linear algebraic techniques, with complexity cubic in the network size, are intractable for high-dimensional distributions. We investigate the use of embedded trees for computing nonlocal pairwise mutual information and demonstrate through numerical simulations that the presented approach achieves a significant reduction in computational cost over inversion-based methods.
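
To make the abstract concrete, the sketch below shows the inversion-based baseline the summary refers to: for jointly Gaussian variables, pairwise mutual information has a closed form in terms of log-determinants of marginal covariance blocks, but obtaining those blocks from the precision (information) matrix by full inversion costs cubic time in the network size. This is a minimal, hypothetical illustration (the function name, the toy 4-cycle model, and its parameter values are assumptions for demonstration), not the embedded-trees method of the paper, which is designed precisely to avoid the explicit inversion performed here.

```python
import numpy as np

def gaussian_mutual_information(J, A, B):
    """Closed-form mutual information I(x_A; x_B) for a zero-mean Gaussian
    with precision (information) matrix J, computed via full inversion.

    This is the cubic-cost baseline; the paper's embedded-trees approach
    avoids forming Sigma = J^{-1} explicitly.
    """
    Sigma = np.linalg.inv(J)           # O(n^3): intractable for high-dimensional models
    idx = np.concatenate([A, B])
    S_AA = Sigma[np.ix_(A, A)]
    S_BB = Sigma[np.ix_(B, B)]
    S_AB = Sigma[np.ix_(idx, idx)]
    # I(x_A; x_B) = 0.5 * log( det(S_AA) * det(S_BB) / det(S_AB) )
    _, logdet_a = np.linalg.slogdet(S_AA)
    _, logdet_b = np.linalg.slogdet(S_BB)
    _, logdet_ab = np.linalg.slogdet(S_AB)
    return 0.5 * (logdet_a + logdet_b - logdet_ab)

# Toy loopy model (a 4-cycle) with unit node potentials and weak edge
# couplings; values are illustrative only.
n = 4
J = np.eye(n)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    J[i, j] = J[j, i] = -0.2

# Nonlocal pairwise information between non-adjacent nodes 0 and 2.
print(gaussian_mutual_information(J, A=np.array([0]), B=np.array([2])))
```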