Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations

Work using artificial languages as training input has shown that LSTMs are capable of inducing the stack-like data structures required to represent context-free and certain mildly context-sensitive languages, formal language classes that correspond in theory to the hierarchical structures of natural language.
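
The artificial languages referenced here are typically simple context-free languages such as the Dyck languages of balanced brackets, whose correct processing requires a stack. As a minimal sketch of the kind of training input such studies use (a hypothetical illustration, not material from this paper; the names `sample_dyck2`, `max_len`, and `p_open` are assumptions), the following Python generator samples strings from the Dyck-2 language:

```python
import random

def sample_dyck2(max_len=40, p_open=0.5):
    """Sample a balanced string over the brackets (), [].

    Dyck-2 is a standard context-free test language: predicting the
    correct closing bracket requires remembering the whole stack of
    currently open brackets, not just recent tokens.
    """
    closer = {"(": ")", "[": "]"}
    out, stack = [], []
    while len(out) < max_len:
        if stack and random.random() >= p_open:
            out.append(stack.pop())      # close the most recently opened bracket
        else:
            o = random.choice("([")
            out.append(o)
            stack.append(closer[o])      # remember the required closer
    while stack:
        out.append(stack.pop())          # flush remaining closers, so the
                                         # final string may exceed max_len
    return "".join(out)

if __name__ == "__main__":
    for _ in range(3):
        print(sample_dyck2())
```

An LSTM trained to predict the next symbol in such strings can only succeed at closing long-distance dependencies by maintaining an internal analogue of the generator's stack, which is what motivates using these languages as probes of hierarchical representation.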

Bibliographic Details
Main Authors: Wilcox, Ethan; Levy, Roger P.; Futrell, Richard
Other Authors: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
Format: Article
Language: English
Published: Association for Computational Linguistics, 2021
Online Access: https://hdl.handle.net/1721.1/130418