Divergence Covering

Finding covering numbers is a longstanding problem of interest, and Kullback-Leibler (KL) divergence is a fundamental measure of dissimilarity between probability distributions. Both topics have been studied extensively in various contexts; in this thesis we study the problem that arises when the two are combined. The combination yields techniques for deriving useful bounds on a number of important problems in information theory. Our goal is to explore covering the probability simplex in KL divergence. Several properties of KL divergence (it is not a metric, it is not symmetric, and it can easily blow up to infinity) make it unintuitive and difficult to analyze with traditional methods. We study coverings of discrete large-alphabet probability distributions under both worst-case and average-case divergence distance and examine the implications of the resulting divergence covering numbers. One implication of worst-case divergence covering is determining how to communicate probability distributions with limited bandwidth. Another is in universal compression and universal prediction, where the divergence covering number provides upper bounds on minimax risk. A third application is computing the capacity of the noisy permutation channel. We then use average-case divergence covering to study efficient algorithms for quantizing large-alphabet distributions in order to save storage space.
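
The central object above, a worst-case divergence covering, asks for a finite set of centers Q on the probability simplex such that every distribution p has some center q with D(p||q) = sum_i p_i log(p_i/q_i) at most epsilon. As a concrete and deliberately naive illustration (this is not code from the thesis; the names grid_centers and sample_simplex and the parameters d, m are illustrative choices), the Python sketch below builds centers from the grid of denominator-m rational distributions with all coordinates positive, which keeps the divergence finite, and estimates the worst-case radius by random sampling:

```python
import itertools
import math
import random

def kl(p, q):
    """D(p||q) = sum_i p_i * log(p_i / q_i), with the convention 0*log 0 = 0.
    Infinite if some q_i = 0 where p_i > 0."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            total += pi * math.log(pi / qi)
    return total

def grid_centers(d, m):
    """All 'type' distributions q = (k_1/m, ..., k_d/m) with integer
    k_i >= 1 summing to m. Requiring every coordinate to be positive
    avoids infinite divergence to any p in the simplex."""
    centers = []
    for ks in itertools.product(range(1, m - d + 2), repeat=d - 1):
        rest = m - sum(ks)
        if rest >= 1:
            centers.append(tuple(k / m for k in ks) + (rest / m,))
    return centers

def sample_simplex(d):
    """Uniform sample from the simplex (Dirichlet(1,...,1)),
    via normalized exponential random variables."""
    xs = [random.expovariate(1.0) for _ in range(d)]
    s = sum(xs)
    return tuple(x / s for x in xs)

if __name__ == "__main__":
    d, m = 3, 10  # alphabet size 3, grid denominator 10 (toy values)
    centers = grid_centers(d, m)
    # Empirical worst-case covering radius: the max over sampled p of the
    # divergence to its nearest center. Sampling only lower-bounds the
    # true radius, which is a supremum over the whole simplex.
    radius = max(
        min(kl(p, q) for q in centers)
        for p in (sample_simplex(d) for _ in range(2000))
    )
    print(f"{len(centers)} centers, empirical worst-case radius ~ {radius:.4f}")
```

The regime of interest in the thesis is large alphabets, where the tradeoff between the number of centers and the achievable radius epsilon is exactly what divergence covering numbers quantify; the naive grid above is only a baseline against which such bounds can be appreciated.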

Bibliographic Details
Main Author: Tang, Jennifer
Other Authors: Polyanskiy, Yury
Format: Thesis
Published: Massachusetts Institute of Technology, 2022
Online Access: https://hdl.handle.net/1721.1/143246
Department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Degree: Ph.D.
Rights: In Copyright - Educational Use Permitted, Copyright MIT (http://rightsstatements.org/page/InC-EDU/1.0/)