Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
Work using artificial languages as training input has shown that LSTMs are capable of inducing the stack-like data structures required to represent context-free and certain mildly context-sensitive languages — formal language classes which correspond in theory to the hierarchical structures of natural language…
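As a concrete illustration of the artificial-language setup the abstract refers to, the sketch below trains a small LSTM language model on strings from the context-free language aⁿbⁿ. This is a minimal sketch under assumed hyperparameters and vocabulary, not the paper's own experimental code; all names here are illustrative.

```python
# Minimal sketch (not the authors' implementation): train a small LSTM
# language model on the context-free language a^n b^n. Hyperparameters,
# vocabulary, and training loop are illustrative assumptions.
import random
import torch
import torch.nn as nn

VOCAB = {"a": 0, "b": 1, "<eos>": 2}  # assumed toy vocabulary

def sample_anbn(max_n=10):
    """Sample one string a^n b^n terminated by <eos>."""
    n = random.randint(1, max_n)
    return [VOCAB["a"]] * n + [VOCAB["b"]] * n + [VOCAB["<eos>"]]

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size=3, d=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d)
        self.lstm = nn.LSTM(d, d, batch_first=True)
        self.out = nn.Linear(d, vocab_size)

    def forward(self, x):
        h, _ = self.lstm(self.embed(x))
        return self.out(h)  # next-token logits at every position

model = LSTMLanguageModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    seq = torch.tensor([sample_anbn()])        # shape (1, seq_len)
    logits = model(seq[:, :-1])                # predict each next token
    loss = loss_fn(logits.squeeze(0), seq[0, 1:])
    opt.zero_grad()
    loss.backward()
    opt.step()

# If the LSTM has induced a stack-like counter, it should assign high
# probability to <eos> exactly when the b's balance the preceding a's.
```

Probing the trained model's next-token distribution at each position of a held-out aⁿbⁿ string (for n larger than any seen in training) is one simple way to test whether the network tracks the nesting depth rather than memorizing surface patterns.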
| Main Authors: | |
| --- | --- |
| Other Authors: | |
| Format: | Article |
| Language: | English |
| Published: | Association for Computational Linguistics, 2021 |
| Online Access: | https://hdl.handle.net/1721.1/130418 |