An Alternative to Entropy in the Measurement of Information

Abstract: Entropy has been the main tool in the analysis of the concept of information since information theory was conceived in the work of Shannon more than fifty years ago. There have been some attempts to find a more general measure of information, but their outcomes were mainly of formal, theoretical interest, and none has provided better insight into the nature of information. The strengths of entropy seemed so obvious that little effort has been made to find an alternative to entropy that gives different values but is consistent with entropy, in the sense that the results obtained in information theory thus far can be reproduced with the new measure. In this article the need for such an alternative measure is demonstrated on the basis of a historical review of the problems with the conceptualization of information. Then, an alternative measure is presented in the context of a modified definition of information, applicable outside of the conduit metaphor of Shannon's approach and formulated without reference to uncertainty. It has several features superior to those of entropy. For instance, unlike entropy it can be easily and consistently extended to continuous probability distributions, and unlike differential entropy this extension is always positive and invariant with respect to linear transformations of coordinates.
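
As a point of reference only (the article's own measure is not reproduced in this record), the standard definition of differential entropy illustrates the two shortcomings the abstract mentions:
\[
h(X) = -\int f(x)\,\log f(x)\,dx .
\]
For the uniform distribution on $(0,a)$ one has $f(x)=1/a$ and $h(X)=\log a$, which is negative whenever $a<1$; and under a linear change of coordinates $Y=cX$ one has $h(Y)=h(X)+\log|c|$. Differential entropy is therefore neither always positive nor invariant under linear transformations, the two defects the proposed measure is stated to avoid.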


Bibliographic Details
Main Author: Marcin J. Schroeder
Format: Article
Language: English
Published: MDPI AG, 2004-12-01
Series: Entropy
ISSN: 1099-4300
DOI: 10.3390/e6050388
Subjects: entropy; measures of information; information theory; semantics of information
Online Access: http://www.mdpi.com/1099-4300/6/5/388/