Interval Entropy and Informative Distance

The Shannon interval entropy function has been proposed in the reliability literature as a useful dynamic measure of uncertainty for doubly (two-sided) truncated random variables. In this paper, we show that interval entropy uniquely determines the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on Kullback-Leibler discrimination information. We study various properties of this measure, including its connections with the residual and past measures of discrepancy and with interval entropy, and we obtain its upper and lower bounds.

Bibliographic Details
Main Authors: Fakhroddin Misagh, Gholamhossein Yari
Format: Article
Language: English
Published: MDPI AG, 2012-03-01
Series: Entropy
Subjects: uncertainty; discrepancy; characterization
Online Access: http://www.mdpi.com/1099-4300/14/3/480/
description The Shannon interval entropy function has been proposed in the reliability literature as a useful dynamic measure of uncertainty for doubly (two-sided) truncated random variables. In this paper, we show that interval entropy uniquely determines the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on Kullback-Leibler discrimination information. We study various properties of this measure, including its connections with the residual and past measures of discrepancy and with interval entropy, and we obtain its upper and lower bounds.
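The interval entropy mentioned in the description can be illustrated numerically. The sketch below assumes the standard definition used in this literature, H(t1, t2) = -∫ g(x) log g(x) dx over [t1, t2], where g(x) = f(x) / (F(t2) - F(t1)) is the doubly truncated density; the helper name `interval_entropy` is hypothetical, not from the paper.

```python
import numpy as np

def interval_entropy(pdf, cdf, t1, t2, n=100_000):
    """Shannon interval entropy of X given t1 <= X <= t2 (hypothetical helper).

    Computes -integral over [t1, t2] of g(x) * log(g(x)) dx, where
    g(x) = pdf(x) / (cdf(t2) - cdf(t1)) is the doubly truncated density.
    """
    x = np.linspace(t1, t2, n)
    mass = cdf(t2) - cdf(t1)          # probability of falling in [t1, t2]
    g = pdf(x) / mass                 # doubly truncated density
    # Convention 0 * log 0 = 0 guards against density zeros.
    integrand = np.where(g > 0, -g * np.log(g), 0.0)
    # Composite trapezoidal rule on the uniform grid.
    return float(np.sum((integrand[:-1] + integrand[1:]) / 2) * (x[1] - x[0]))

# Sanity check: Uniform(0, 1) truncated to [0.2, 0.7] is Uniform(0.2, 0.7),
# whose entropy is log(0.7 - 0.2) = log(0.5).
h = interval_entropy(lambda x: np.ones_like(x), lambda x: x, 0.2, 0.7)
```

For the uniform case the truncated density is constant, so the quadrature recovers log(0.5) essentially exactly; the same routine applies to any lifetime density restricted to an interval.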
institution Directory Open Access Journal
issn 1099-4300
topic uncertainty
discrepancy
characterization
url http://www.mdpi.com/1099-4300/14/3/480/