Soft Quantization Using Entropic Regularization

Bibliographic Details
Main Authors: Rajmadan Lakshmanan, Alois Pichler
Format: Article
Language: English
Published: MDPI AG 2023-10-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/25/10/1435
Description
Summary: The quantization problem aims to find the best possible approximation of probability measures on <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msup><mi mathvariant="double-struck">R</mi><mi>d</mi></msup></semantics></math></inline-formula> using finite and discrete measures. The Wasserstein distance is a typical choice to measure the quality of the approximation. This contribution investigates the properties and robustness of the entropy-regularized quantization problem, which relaxes the standard quantization problem. The proposed approximation technique naturally adopts the softmin function, which is well known for its robustness from both theoretical and practical standpoints. Moreover, we use the entropy-regularized Wasserstein distance to evaluate the quality of the soft quantization problem’s approximation, and we implement a stochastic gradient approach to obtain optimal solutions. The control parameter in our proposed method adjusts the difficulty of the optimization problem, providing significant advantages when dealing with exceptionally challenging problems of interest. Finally, this contribution empirically illustrates the performance of the method in various settings.
ISSN: 1099-4300
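The softmin-based scheme described in the abstract can be sketched in a few lines of NumPy. This is an illustrative reconstruction only: the squared-Euclidean cost, the fixed temperature `lam`, the frozen-weight gradient, and the function name `soft_quantize` are assumptions for the sketch, not the authors' exact algorithm from the article.

```python
import numpy as np

def soft_quantize(samples, n_points, lam=0.1, lr=0.05, steps=500, init=None, seed=0):
    """Sketch of soft (entropy-regularized) quantization.

    Samples are softly assigned to quantization points through the
    softmin of squared Euclidean distances (temperature ``lam``), and
    the points follow gradient-style updates on the regularized
    assignment cost.  Hypothetical illustration, not the article's
    exact method.
    """
    rng = np.random.default_rng(seed)
    if init is None:
        # initialize the quantizers at randomly chosen samples
        q = samples[rng.choice(len(samples), n_points, replace=False)].astype(float)
    else:
        q = np.array(init, dtype=float)
    for _ in range(steps):
        # pairwise squared distances, shape (n_samples, n_points)
        d2 = ((samples[:, None, :] - q[None, :, :]) ** 2).sum(axis=-1)
        # softmin weights: row-wise softmax of -d2 / lam
        logits = -d2 / lam
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        w = np.exp(logits)
        w /= w.sum(axis=1, keepdims=True)
        # gradient of sum_ij w_ij * d2_ij w.r.t. q (weights held fixed)
        grad = 2.0 * (w.sum(axis=0)[:, None] * q - w.T @ samples)
        q -= lr * grad / len(samples)
    return q
```

As `lam` shrinks, the softmin weights approach hard nearest-point assignments and the scheme behaves like classical quantization (k-means-style updates); larger `lam` smooths the objective, which is the control-parameter trade-off the abstract refers to.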