Exploiting Compositionality to Explore a Large Space of Model Structures

The recent proliferation of richly structured probabilistic models raises the question of how to automatically determine an appropriate model for a dataset. We investigate this question for a space of matrix decomposition models which can express a variety of widely used models from unsupervised learning. To enable model selection, we organize these models into a context-free grammar which generates a wide variety of structures through the compositional application of a few simple rules. We use our grammar to generically and efficiently infer latent components and estimate predictive likelihood for nearly 2500 structures using a small toolbox of reusable algorithms. Using a greedy search over our grammar, we automatically choose the decomposition structure from raw data by evaluating only a small fraction of all models. The proposed method typically finds the correct structure for synthetic data and backs off gracefully to simpler models under heavy noise. It learns sensible structures for datasets as diverse as image patches, motion capture, 20 Questions, and U.S. Senate votes, all using exactly the same code.
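For a concrete picture of the approach summarized above, the following is a minimal, illustrative Python sketch, not the authors' implementation: it assumes a simplified set of production rules loosely modeled on the paper's grammar, and the scoring function is a hypothetical stand-in for the predictive-likelihood estimates the paper obtains by actually fitting each candidate model with its toolbox of reusable inference algorithms.

# Illustrative sketch (not the authors' code): matrix-decomposition structures
# are generated by a small context-free grammar and explored with a greedy
# search, as described in the abstract. Production rules and scoring are
# simplified placeholders.

PRODUCTIONS = {
    # "G" is a generic (Gaussian) component; each rule refines it into a
    # composite structure, e.g. low-rank, clustering, or binary features.
    "G": ["(GG+G)", "(MG+G)", "(GM+G)", "(BG+G)", "(CG+G)"],
}

def successors(structure):
    """Yield every structure reachable by applying one production rule."""
    for i, symbol in enumerate(structure):
        for rhs in PRODUCTIONS.get(symbol, []):
            yield structure[:i] + rhs + structure[i + 1:]

def greedy_search(data, score_fn, max_depth=3):
    """Start from the trivial structure "G", repeatedly expand the current
    best structure, and keep a refinement only if it improves the score."""
    current = "G"
    best_score = score_fn(current, data)
    for _ in range(max_depth):
        scored = [(score_fn(s, data), s) for s in successors(current)]
        if not scored:
            break
        cand_score, cand = max(scored)
        if cand_score <= best_score:
            break  # no refinement helps: back off to the simpler structure
        current, best_score = cand, cand_score
    return current, best_score

if __name__ == "__main__":
    # Toy scoring function for demonstration only: rewards clustering
    # components ("M") and penalizes structural complexity. The real system
    # instead estimates held-out predictive likelihood for each candidate.
    toy_score = lambda s, data: 2 * s.count("M") - 0.2 * len(s)
    print(greedy_search(data=None, score_fn=toy_score))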

Bibliographic Details
Main Authors: Grosse, Roger Baker, Salakhutdinov, Ruslan, Freeman, William T., Tenenbaum, Joshua B.
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: en_US
Published: AUAI Press 2014
Online Access: http://hdl.handle.net/1721.1/86219
https://orcid.org/0000-0002-1925-2035
https://orcid.org/0000-0002-2231-7995
Other Departments: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Type: Conference Paper (http://purl.org/eprint/type/ConferencePaper)
ISBN: 978-0-9749039-8-9
Date Issued: August 2012
Date Added to Repository: 2014-04-23
Citation: Grosse, Roger B., Ruslan Salakhutdinov, William T. Freeman, and Joshua B. Tenenbaum. "Exploiting Compositionality to Explore a Large Space of Model Structures." In Proceedings of the 28th Conference on Uncertainty in Artificial Intelligence (UAI 2012), Catalina Island, United States, August 15-17, 2012. AUAI Press, pp. 306-315. http://www.auai.org/uai2012/proceedings.pdf
Sponsorship: United States. Army Research Office (ARO grant W911NF-08-1-0242); American Society for Engineering Education, National Defense Science and Engineering Graduate (NDSEG) Fellowship
Rights: Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)
File Format: application/pdf
Source: MIT web domain