Languages with more speakers tend to be harder to (machine-)learn
Abstract: Computational language models (LMs), most notably exemplified by the widespread success of OpenAI's ChatGPT chatbot, show impressive performance on a wide range of linguistic tasks, thus providing cognitive science and linguistics with a computational working model to empirically study...
| Main Authors: | Alexander Koplenig, Sascha Wolfer |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2023-10-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-023-45373-z |
Similar Items

- Language structure is influenced by the number of speakers but seemingly not by the proportion of non-native speakers
  by: Alexander Koplenig
  Published: (2019-02-01)
- A large quantitative analysis of written language challenges the idea that all languages are equally complex
  by: Alexander Koplenig, et al.
  Published: (2023-09-01)
- Adaptive Communication: Languages with More Non-Native Speakers Tend to Have Fewer Word Forms
  by: Christian Bentz, et al.
  Published: (2015-01-01)
- Studying Lexical Dynamics and Language Change via Generalized Entropies: The Problem of Sample Size
  by: Alexander Koplenig, et al.
  Published: (2019-05-01)
- Diamond gets harder, tougher, and more deformable
  by: Bo Xu, et al.
  Published: (2020-11-01)