Information Geometric Approach on Most Informative Boolean Function Conjecture
Let X^n be a memoryless uniform Bernoulli source and let Y^n be its output through a binary symmetric channel. Courtade and Kumar conjectured that the Boolean function f : {0,1}^n → {0,1} that maximizes the mutual information I(f(X^n); Y^n) is a dictator function, i.e., f(x^n) = x_i for some i. We propose a clustering problem that is equivalent to this problem and emphasize the information-geometric aspect of the equivalent formulation. Moreover, we define a normalized geometric mean of measures and derive some of its interesting properties. We also show that the conjecture holds when the arithmetic and geometric means coincide over a specific set of measures.
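For a dictator function f(x^n) = x_i, only Y_i depends on f(X^n), so the mutual information reduces to that of a single BSC use: I(f(X^n); Y^n) = I(X_i; Y_i) = 1 - h(p), where h is the binary entropy function and p the crossover probability. A minimal sketch of this value (illustrative code, not from the article):

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Dictator value of I(f(X^n); Y^n) over a BSC with crossover probability p:
p = 0.1
print(1 - h(p))  # ≈ 0.5310 bits
```

The conjecture asserts that no Boolean function of n input bits can beat this single-coordinate value.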
Main Author: | Albert No |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2018-09-01 |
Series: | Entropy |
Subjects: | Boolean function; Bregman divergence; clustering; geometric mean; Jensen–Shannon divergence |
Online Access: | http://www.mdpi.com/1099-4300/20/9/688 |
author | Albert No |
collection | DOAJ |
description | Let X^n be a memoryless uniform Bernoulli source and let Y^n be its output through a binary symmetric channel. Courtade and Kumar conjectured that the Boolean function f : {0,1}^n → {0,1} that maximizes the mutual information I(f(X^n); Y^n) is a dictator function, i.e., f(x^n) = x_i for some i. We propose a clustering problem that is equivalent to this problem and emphasize the information-geometric aspect of the equivalent formulation. Moreover, we define a normalized geometric mean of measures and derive some of its interesting properties. We also show that the conjecture holds when the arithmetic and geometric means coincide over a specific set of measures. |
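The conjecture described above can be checked exhaustively for small n. The sketch below (my own illustrative code, not taken from the article) enumerates all 2^(2^3) = 256 Boolean functions on n = 3 input bits, computes I(f(X^n); Y^n) for a BSC with crossover probability p = 0.1 directly from the joint distribution, and reports a maximizer; the maximum is attained by a dictator function or its complement, with value 1 - h(p) ≈ 0.531 bits.

```python
from itertools import product
from math import log2

def mutual_information(f_bits, n, p):
    """I(f(X^n); Y^n) for a uniform source X^n through a memoryless BSC(p);
    f_bits[x] is f evaluated at the integer-encoded input x."""
    xs = range(2 ** n)
    pf = [sum(1 for x in xs if f_bits[x] == b) / 2 ** n for b in (0, 1)]
    mi = 0.0
    for y in product((0, 1), repeat=n):
        py = 0.0
        joint = [0.0, 0.0]  # P(f(X^n) = b, Y^n = y) for b = 0, 1
        for x in xs:
            like = 1.0  # channel likelihood P(y | x), bit by bit
            for k, yb in enumerate(y):
                like *= (1 - p) if ((x >> k) & 1) == yb else p
            pxy = like / 2 ** n
            py += pxy
            joint[f_bits[x]] += pxy
        for b in (0, 1):
            if joint[b] > 0:
                mi += joint[b] * log2(joint[b] / (py * pf[b]))
    return mi

n, p = 3, 0.1

def bits(code):
    """Decode an integer truth table into the list f_bits."""
    return [(code >> x) & 1 for x in range(2 ** n)]

best = max(range(2 ** 2 ** n), key=lambda c: mutual_information(bits(c), n, p))
print(bits(best), mutual_information(bits(best), n, p))
```

The six optimal truth tables (three dictators and their complements) all attain the same value, so which one is printed depends only on enumeration order.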
format | Article |
id | doaj.art-89b63e0ca75e4f60b97f176d3584f6ac |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
publishDate | 2018-09-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
spelling | Entropy, Vol. 20, Iss. 9, Article 688 (2018-09-01), MDPI AG, ISSN 1099-4300; DOI: 10.3390/e20090688. Albert No, Department of Electronic and Electrical Engineering, Hongik University, Seoul 04066, Korea. Keywords: Boolean function; Bregman divergence; clustering; geometric mean; Jensen–Shannon divergence. http://www.mdpi.com/1099-4300/20/9/688 |
title | Information Geometric Approach on Most Informative Boolean Function Conjecture |
topic | Boolean function; Bregman divergence; clustering; geometric mean; Jensen–Shannon divergence |
url | http://www.mdpi.com/1099-4300/20/9/688 |