On a simple derivation of a family of nonextensive entropies from information content
Abstract: The nonextensive entropy of Tsallis can be seen as a consequence of postulates on self-information, namely that the ratio of the first derivative of the self-information per unit probability to its curvature (second variation) is constant. This constancy holds if we regard the probability dis...
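For reference, a minimal sketch of the standard Tsallis quantities the abstract invokes, assuming the usual q-logarithm conventions; the paper's specific constant-ratio postulate on the self-information is not reproduced here, since the abstract above is truncated.

```latex
\documentclass{article}
\begin{document}
% Standard Tsallis (nonextensive) definitions, stated for reference only:
% the q-logarithm, the associated self-information, and the Tsallis entropy.
\[
  \ln_q x = \frac{x^{1-q}-1}{1-q},
  \qquad
  I_q(p) = \ln_q\frac{1}{p} = \frac{1-p^{\,q-1}}{q-1},
\]
\[
  S_q = \sum_i p_i\, I_q(p_i) = \frac{1-\sum_i p_i^{\,q}}{q-1},
  \qquad
  \lim_{q\to 1} S_q = -\sum_i p_i \ln p_i .
\]
\end{document}
```

The q → 1 limit recovers the Shannon form, which is the consistency check any derivation of this family of entropies must satisfy.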
Main Author: Takuya Yamano
Format: Article
Language: English
Published: MDPI AG, 2004-08-01
Series: Entropy
Subjects:
Online Access: http://www.mdpi.com/1099-4300/6/4/364/
Similar Items
- A Possible Extension of Shannon's Information Theory
  by: Takuya Yamano
  Published: (2001-11-01)
- Entropy
  by: Constantino Tsallis
  Published: (2022-01-01)
- Enthusiasm and Skepticism: Two Pillars of Science—A Nonextensive Statistics Case
  by: Constantino Tsallis
  Published: (2022-05-01)
- Black Hole Entropy: A Closer Look
  by: Constantino Tsallis
  Published: (2019-12-01)
- Reply to Pessoa, P.; Arderucio Costa, B. Comment on "Tsallis, C. Black Hole Entropy: A Closer Look. Entropy 2020, 22, 17"
  by: Constantino Tsallis
  Published: (2021-05-01)