Zipf’s law holds for phrases, not words

Bibliographic Details
Main Authors: Ryland Williams, Jake, Lessard, Paul R., Desu, Suma, Clark, Eric M., Bagrow, James P., Danforth, Christopher M., Sheridan Dodds, Peter
Other Authors: Massachusetts Institute of Technology. Center for Computational Engineering
Format: Article
Language: en_US
Published: Nature Publishing Group 2015
Online Access: http://hdl.handle.net/1721.1/98434
Description
Summary: With Zipf’s law being originally and most famously observed for word frequency, it is surprisingly limited in its applicability to human language, holding over no more than three to four orders of magnitude before hitting a clear break in scaling. Here, building on the simple observation that phrases of one or more words comprise the most coherent units of meaning in language, we show empirically that Zipf’s law for phrases extends over as many as nine orders of rank magnitude. In doing so, we develop a principled and scalable statistical mechanical method of random text partitioning, which opens up a rich frontier of rigorous text analysis via a rank ordering of mixed length phrases.
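
The summary describes rank-ordering mixed-length phrases obtained by random text partitioning. The sketch below is only a toy illustration of that general idea, not the paper’s actual statistical mechanical method: it assumes, hypothetically, that each gap between consecutive words becomes a phrase boundary independently with probability q, then tallies phrase frequencies and lists them by rank.

```python
# Minimal sketch: random text partitioning into phrases, then a Zipf-style
# rank-frequency tally. The boundary probability q and the partitioning rule
# are illustrative assumptions, not the method defined in the paper.

import random
from collections import Counter


def random_partition(words, q=0.5, rng=random):
    """Split a word sequence into phrases, breaking at each gap with probability q."""
    phrases, current = [], []
    for word in words:
        current.append(word)
        if rng.random() < q:  # end the current phrase at this gap
            phrases.append(" ".join(current))
            current = []
    if current:  # flush any trailing words into a final phrase
        phrases.append(" ".join(current))
    return phrases


def rank_frequency(phrases):
    """Return (rank, frequency) pairs sorted from most to least frequent phrase."""
    counts = Counter(phrases)
    freqs = sorted(counts.values(), reverse=True)
    return list(enumerate(freqs, start=1))


if __name__ == "__main__":
    text = "the quick brown fox jumps over the lazy dog " * 1000
    words = text.split()
    phrases = random_partition(words, q=0.5)
    for rank, freq in rank_frequency(phrases)[:10]:
        print(f"rank {rank:2d}  frequency {freq}")
```

Plotting the resulting rank-frequency pairs on log-log axes would show whether the mixed-length phrase distribution follows an approximate power law, which is the kind of scaling the summary reports extending over up to nine orders of rank magnitude.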