An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications
In this work, we propose a new approach to deriving bounds between entropy and error from a joint distribution by means of optimization. The specific case study is given on binary classifications. Two basic types of classification errors are investigated, namely, Bayesian and non-Bayesian errors. The consideration of non-Bayesian errors is due to the fact that most classifiers result in non-Bayesian solutions. For both types of errors, we derive closed-form relations between each bound and the error components. When Fano’s lower bound in a diagram of “Error Probability vs. Conditional Entropy” is realized with this approach, its interpretation is enlarged to include non-Bayesian errors and the two situations in which the variables exhibit independence properties. A new upper bound for the Bayesian error is derived with respect to the minimum prior probability; it is generally tighter than Kovalevskij’s upper bound.
Main Authors: | Bao-Gang Hu, Hong-Jie Xing
Format: | Article |
Language: | English |
Published: | MDPI AG, 2016-02-01
Series: | Entropy |
Subjects: | entropy; error probability; Bayesian errors; error types; upper bound; lower bound
Online Access: | http://www.mdpi.com/1099-4300/18/2/59 |
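For orientation, the two classical results the abstract benchmarks against can be stated in their standard binary forms. This is the textbook statement of Fano's and Kovalevskij's bounds, not the paper's optimization-based rederivation:

```latex
% Fano's inequality for M classes; in the binary case (M = 2) the
% P_e \log_2(M-1) term vanishes, leaving a bound through the binary
% entropy H_b alone:
H(X \mid Y) \;\le\; H_b(P_e) + P_e \log_2(M-1)
  \;\stackrel{M=2}{=}\; H_b(P_e),
\qquad H_b(t) = -t\log_2 t - (1-t)\log_2(1-t),
% which inverts (on 0 <= P_e <= 1/2) to the lower bound
P_e \;\ge\; H_b^{-1}\!\bigl(H(X \mid Y)\bigr).
% Kovalevskij's upper bound in its common binary form (entropies in bits):
P_e \;\le\; \tfrac{1}{2}\, H(X \mid Y).
```

Per the abstract, the paper's contribution is to recover the lower bound (and its non-Bayesian interpretation) from the joint distribution by optimization, and to derive a prior-dependent upper bound that is generally tighter than Kovalevskij's.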
_version_ | 1828151179786321920 |
author | Bao-Gang Hu; Hong-Jie Xing
author_facet | Bao-Gang Hu; Hong-Jie Xing
author_sort | Bao-Gang Hu |
collection | DOAJ |
description | In this work, we propose a new approach to deriving bounds between entropy and error from a joint distribution by means of optimization. The specific case study is given on binary classifications. Two basic types of classification errors are investigated, namely, Bayesian and non-Bayesian errors. The consideration of non-Bayesian errors is due to the fact that most classifiers result in non-Bayesian solutions. For both types of errors, we derive closed-form relations between each bound and the error components. When Fano’s lower bound in a diagram of “Error Probability vs. Conditional Entropy” is realized with this approach, its interpretation is enlarged to include non-Bayesian errors and the two situations in which the variables exhibit independence properties. A new upper bound for the Bayesian error is derived with respect to the minimum prior probability; it is generally tighter than Kovalevskij’s upper bound. |
first_indexed | 2024-04-11T21:54:04Z |
format | Article |
id | doaj.art-6e7ad3a2d6e4486a9f6edb46bac6a92d |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
last_indexed | 2024-04-11T21:54:04Z |
publishDate | 2016-02-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
spelling | doaj.art-6e7ad3a2d6e4486a9f6edb46bac6a92d (indexed 2022-12-22T04:01:10Z). English. MDPI AG. Entropy, ISSN 1099-4300, 2016-02-01, Vol. 18, No. 2, Art. 59, DOI 10.3390/e18020059 (e18020059). "An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications." Bao-Gang Hu (NLPR/LIAMA, Institute of Automation, Chinese Academy of Science, Beijing 100190, China); Hong-Jie Xing (College of Mathematics and Information Science, Hebei University, Baoding 071002, China). Abstract and keywords as in the description and topic fields. http://www.mdpi.com/1099-4300/18/2/59 |
spellingShingle | Bao-Gang Hu; Hong-Jie Xing; An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications; Entropy; entropy; error probability; Bayesian errors; error types; upper bound; lower bound
title | An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications |
title_full | An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications |
title_fullStr | An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications |
title_full_unstemmed | An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications |
title_short | An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications |
title_sort | optimization approach of deriving bounds between entropy and error from joint distribution case study for binary classifications |
topic | entropy; error probability; Bayesian errors; error types; upper bound; lower bound
url | http://www.mdpi.com/1099-4300/18/2/59 |
work_keys_str_mv | AT baoganghu anoptimizationapproachofderivingboundsbetweenentropyanderrorfromjointdistributioncasestudyforbinaryclassifications AT hongjiexing anoptimizationapproachofderivingboundsbetweenentropyanderrorfromjointdistributioncasestudyforbinaryclassifications AT baoganghu optimizationapproachofderivingboundsbetweenentropyanderrorfromjointdistributioncasestudyforbinaryclassifications AT hongjiexing optimizationapproachofderivingboundsbetweenentropyanderrorfromjointdistributioncasestudyforbinaryclassifications |
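As a quick numerical companion to the record above, the following self-contained Python sketch computes the conditional entropy H(X|Y), the Bayesian error, and the error of one deliberately non-Bayesian rule for a 2x2 joint distribution, and checks them against the classical Fano and Kovalevskij bounds stated earlier. All function names and the example joint distribution are ours for illustration, not from the paper:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in bits; H_b(0) = H_b(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def conditional_entropy(joint) -> float:
    """H(X|Y) in bits for a 2x2 joint table joint[x][y] = P(X=x, Y=y)."""
    h = 0.0
    for y in (0, 1):
        p_y = joint[0][y] + joint[1][y]
        if p_y > 0.0:
            h += p_y * binary_entropy(joint[0][y] / p_y)
    return h

def bayes_error(joint) -> float:
    """Bayesian error: for each observed y, predict the more probable class."""
    return sum(min(joint[0][y], joint[1][y]) for y in (0, 1))

def rule_error(joint, decide) -> float:
    """Error of an arbitrary (possibly non-Bayesian) decision rule y -> x."""
    return sum(joint[x][y] for x in (0, 1) for y in (0, 1) if decide(y) != x)

def fano_lower_bound(h_cond: float, tol: float = 1e-12) -> float:
    """Invert H_b on [0, 1/2] by bisection: smallest e with H_b(e) >= H(X|Y)."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if binary_entropy(mid) < h_cond:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def kovalevskij_upper_bound(h_cond: float) -> float:
    """Piecewise-linear binary form: P_e <= H(X|Y) / 2 (entropies in bits)."""
    return h_cond / 2.0

if __name__ == "__main__":
    # A hypothetical 2x2 joint distribution P(X=x, Y=y), rows = true class x.
    joint = [[0.40, 0.10],
             [0.05, 0.45]]
    h = conditional_entropy(joint)
    pe_bayes = bayes_error(joint)
    pe_flip = rule_error(joint, lambda y: 1 - y)  # deliberately non-Bayesian
    print(f"H(X|Y)            = {h:.4f} bits")
    print(f"Bayesian error    = {pe_bayes:.4f}")
    print(f"non-Bayesian rule = {pe_flip:.4f}")
    print(f"Fano lower bound  = {fano_lower_bound(h):.4f}")
    print(f"Kovalevskij bound = {kovalevskij_upper_bound(h):.4f}")
```

For this example distribution the sketch prints H(X|Y) ≈ 0.603 bits, a Bayesian error of 0.15, a Fano lower bound of ≈ 0.147, and a Kovalevskij upper bound of ≈ 0.301, so the Bayesian error lies between the two classical bounds. The flipped rule's error (0.85) exceeds the Kovalevskij bound, which constrains only the Bayesian error; this is precisely why non-Bayesian errors call for the enlarged interpretation the abstract describes.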