When Does Syntax Mediate Neural Language Model Performance? Evidence from Dropout Probes
| | |
| --- | --- |
| Main Authors: | Tucker, Mycal; Eisape, Tiwalayo; Qian, Peng; Levy, Roger; Shah, Julie |
| Other Authors: | Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences |
| Format: | Article |
| Language: | English |
| Published: | Association for Computational Linguistics (ACL), 2023 |
| Online Access: | https://hdl.handle.net/1721.1/150010 |
Similar Items
- Cloze Distillation: Improving Neural Language Models with Human Next-Word Prediction
  by: Eisape, Tiwalayo, et al.
  Published: (2022)
- Cloze Distillation: Improving Neural Language Models with Human Next-Word Prediction
  by: Eisape, Tiwalayo, et al.
  Published: (2021)
- What if This Modified That? Syntactic Interventions with Counterfactual Embeddings
  by: Tucker, Mycal, et al.
  Published: (2023)
- SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
  by: Gauthier, Jon, et al.
  Published: (2021)
- SyntaxGym: An Online Platform for Targeted Evaluation of Language Models
  by: Gauthier, Jon, et al.
  Published: (2022)