Evaluating Computational Language Models with Scaling Properties of Natural Language
In this article, we evaluate computational models of natural language with respect to the universal statistical behaviors of natural language. Statistical mechanical analyses have revealed that natural language text is characterized by scaling properties, which quantify the global structure in the v...
Main Authors: Takahashi, Shuntaro; Tanaka-Ishii, Kumiko
Format: Article
Language: English
Published: The MIT Press, 2019-09-01
Series: Computational Linguistics
Online Access: https://www.mitpressjournals.org/doi/abs/10.1162/coli_a_00355
Similar Items
- Do neural nets learn statistical laws behind natural language?
  by: Shuntaro Takahashi, et al.
  Published: (2017-01-01)
- Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate
  by: Shuntaro Takahashi, et al.
  Published: (2018-11-01)
- Entropy Rate Estimates for Natural Language—A New Extrapolation of Compressed Large-Scale Corpora
  by: Ryosuke Takahira, et al.
  Published: (2016-10-01)
- Menzerath’s Law in the Syntax of Languages Compared with Random Sentences
  by: Kumiko Tanaka-Ishii
  Published: (2021-05-01)
- Computational models of natural language processing /
  by: Bara, Bruno G., et al.
  Published: (1984)