Guessing with Distributed Encoders

Full description

Two correlated sources emit a pair of sequences, each of which is observed by a different encoder. Each encoder produces a rate-limited description of the sequence it observes, and the two descriptions are presented to a guessing device that repeatedly produces sequence pairs until correct. The number of guesses until correct is random, and it is required that it have a moment (of some prespecified order) that tends to one as the length of the sequences tends to infinity. The description rate pairs that allow this are characterized in terms of the Rényi entropy and the Arimoto–Rényi conditional entropy of the joint law of the sources. This solves the guessing analog of the Slepian–Wolf distributed source-coding problem. The achievability is based on random binning, which is analyzed using a technique by Rosenthal.
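
For orientation, here is a minimal reference sketch of the two information measures named above; the notation ($\alpha$ for the Rényi order, $P_{XY}$ for the joint source law, $R_X, R_Y$ for the description rates) is standard and is assumed here rather than taken from the article.

\[
  H_\alpha(X) = \frac{1}{1-\alpha}\,\log \sum_{x} P_X(x)^{\alpha},
  \qquad
  H_\alpha(X \mid Y) = \frac{\alpha}{1-\alpha}\,
  \log \sum_{y} \Bigl( \sum_{x} P_{XY}(x,y)^{\alpha} \Bigr)^{1/\alpha},
  \qquad \alpha \in (0,1) \cup (1,\infty).
\]

Both reduce to the Shannon quantities $H(X)$ and $H(X \mid Y)$ as $\alpha \to 1$. For comparison, the classical Slepian–Wolf region, whose guessing analog the article solves, is
\[
  R_X \ge H(X \mid Y), \qquad
  R_Y \ge H(Y \mid X), \qquad
  R_X + R_Y \ge H(X, Y);
\]
per the abstract, the guessing counterpart is characterized in terms of the Rényi entropy and the Arimoto–Rényi conditional entropy, and its exact statement is left to the article itself.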

Bibliographic Details
Main Authors: Annina Bracher, Amos Lapidoth, Christoph Pfister
Format: Article
Language: English
Published: MDPI AG, 2019-03-01
Series: Entropy
Subjects: Arimoto–Rényi conditional entropy; distributed source coding; guessing; Rényi entropy
Online Access: http://www.mdpi.com/1099-4300/21/3/298
ISSN: 1099-4300
DOI: 10.3390/e21030298
Citation: Entropy, vol. 21, no. 3, article 298 (2019)
Author Affiliations: Annina Bracher (P&C Solutions, Swiss Re, 8022 Zurich, Switzerland); Amos Lapidoth and Christoph Pfister (Signal and Information Processing Laboratory, ETH Zurich, 8092 Zurich, Switzerland)
Collection: Directory of Open Access Journals (DOAJ)