Representation of Constituents in Neural Language Models: Coordination Phrase as a Case Study
© 2019 Association for Computational Linguistics. Neural language models have achieved state-of-the-art performance on many NLP tasks, and have recently been shown to learn a number of hierarchically sensitive syntactic dependencies between individual words. However, equally important for language processing...
Main Authors: An, Aixiu; Qian, Peng; Wilcox, Ethan; Levy, Roger
Other Authors: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
Format: Article
Language: English
Published: Association for Computational Linguistics, 2021
Online Access: https://hdl.handle.net/1721.1/137251
Similar Items
- Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
  by: Wilcox, Ethan, et al.
  Published: (2021)
- Neural language models as psycholinguistic subjects: Representations of syntactic state
  by: Qian, Peng, et al.
  Published: (2021)
- A Targeted Assessment of Incremental Processing in Neural Language Models and Humans
  by: Wilcox, Ethan, et al.
  Published: (2023)
- Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models
  by: Wilcox, Ethan, et al.
  Published: (2021)
- SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
  by: Gauthier, Jon, et al.
  Published: (2021)