Compositionality in rational analysis: Grammar-based induction for concept learning

Bibliographic Details
Main Authors: Goodman, Noah D., Tenenbaum, Joshua B., Griffiths, Thomas L., Feldman, Jacob
Other Authors: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
Format: Article
Language: English
Published: Oxford University Press 2020
Online Access: https://hdl.handle.net/1721.1/124810
Description
Summary: This chapter provides a range of conceptual and technical insights into how this project can be attempted, and goes some way toward suggesting that probabilistic methods need not be viewed as inevitably unable to capture the richness and complexity of world knowledge. It argues that structured representations, generated by a formal grammar, can serve as appropriate units over which probabilistic information is represented and learned. This topic is likely to be one of the main challenges for probabilistic research in cognitive science and artificial intelligence over the coming decades.

Keywords: probabilistic research; knowledge; grammar; concept learning; cognitive science; artificial intelligence
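
The grammar-based approach the summary describes can be illustrated with a minimal sketch: a probabilistic context-free grammar generates Boolean concept formulas (here, formulas in disjunctive normal form), and the prior probability of a formula is the product of the probabilities of the productions used to derive it. The grammar, the feature names f1, f2, f3, and the production probabilities below are illustrative assumptions for this sketch, not the grammar defined in the chapter itself.

import random
import math

# Illustrative toy grammar over Boolean concept formulas in DNF.
# Productions map a nonterminal to (probability, expansion) pairs;
# symbols not in the grammar are terminals. All choices here are
# assumptions made for the sketch, not the chapter's own grammar.
FEATURES = ["f1", "f2", "f3"]

GRAMMAR = {
    "D": [(0.5, ["C"]), (0.5, ["C", "or", "D"])],         # disjunction of conjunctions
    "C": [(0.5, ["P"]), (0.5, ["P", "and", "C"])],        # conjunction of predicates
    "P": [(0.5, ["F"]), (0.5, ["not", "F"])],             # possibly negated feature
    "F": [(1.0 / len(FEATURES), [f]) for f in FEATURES],  # uniform choice of feature
}

def sample(symbol="D"):
    """Sample a formula as a token list, returning it with its log prior,
    i.e. the sum of log production probabilities used in the derivation."""
    if symbol not in GRAMMAR:            # terminal: emit it, contributes nothing
        return [symbol], 0.0
    r, cum = random.random(), 0.0
    for prob, expansion in GRAMMAR[symbol]:
        cum += prob
        if r <= cum:
            tokens, logp = [], math.log(prob)
            for child in expansion:
                child_tokens, child_logp = sample(child)
                tokens += child_tokens
                logp += child_logp
            return tokens, logp
    return sample(symbol)                # guard against floating-point round-off

if __name__ == "__main__":
    formula, log_prior = sample()
    print(" ".join(formula), "| log prior =", round(log_prior, 3))

In a Bayesian concept-learning setup of the kind the summary points to, a prior of this form over grammar-generated hypotheses would be combined with a likelihood over observed examples, so that both the structure of a concept and its probability can be learned from data.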