Schrödinger's tree—On syntax and neural language models
In the last half-decade, the field of natural language processing (NLP) has undergone two major transitions: the switch to neural networks as the primary modeling paradigm and the homogenization of the training regime (pre-train, then fine-tune). Amidst this process, language models have emerged as...
| Main Authors: | Artur Kulmizev, Joakim Nivre |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2022-10-01 |
| Series: | Frontiers in Artificial Intelligence |
| Subjects: | |
| Online Access: | https://www.frontiersin.org/articles/10.3389/frai.2022.796788/full |
Similar Items
- Natural Syntax, Artificial Intelligence and Language Acquisition
  by: William O’Grady, et al.
  Published: (2023-07-01)
- Transitivity in natural syntax: ergative languages
  by: Janez Orešnik
  Published: (2009-12-01)
- Causal/Temporal Connectives: Syntax and Lexicon
  by: Brent, Michael R.
  Published: (2004)
- Menzerath’s Law in the Syntax of Languages Compared with Random Sentences
  by: Kumiko Tanaka-Ishii
  Published: (2021-05-01)
- Neurobiology of Syntax as the Core of Human Language
  by: Angela Dorkas Friederici
  Published: (2017-12-01)