A Generic Formula and Some Special Cases for the Kullback–Leibler Divergence between Central Multivariate Cauchy Distributions
This paper introduces a closed-form expression for the Kullback–Leibler divergence (KLD) between two central multivariate Cauchy distributions (MCDs), which have recently been used in various signal and image processing applications where non-Gaussian models are needed. In this overview, the MCDs a...
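The paper's closed-form expression is not reproduced in this record, but the quantity it computes can be approximated numerically. Below is a minimal illustrative sketch (not the paper's formula) that estimates KL(p‖q) between two central MCDs by Monte Carlo, using the fact that a central MCD is a multivariate t distribution with one degree of freedom and zero location; the function name `kld_central_mcd_mc` and all parameter choices are hypothetical.

```python
import numpy as np
from scipy.stats import multivariate_t

def kld_central_mcd_mc(sigma_p, sigma_q, n_samples=200_000, seed=0):
    """Monte Carlo estimate of KL(p || q) between two central multivariate
    Cauchy distributions with scatter matrices sigma_p and sigma_q.

    A central MCD is a multivariate t distribution with df=1 and zero
    location, so scipy's multivariate_t provides both sampling and
    log-density evaluation.
    """
    d = sigma_p.shape[0]
    p = multivariate_t(loc=np.zeros(d), shape=sigma_p, df=1)
    q = multivariate_t(loc=np.zeros(d), shape=sigma_q, df=1)
    rng = np.random.default_rng(seed)
    x = p.rvs(size=n_samples, random_state=rng)
    # KL(p || q) = E_p[log p(X) - log q(X)], approximated by a sample mean.
    return float(np.mean(p.logpdf(x) - q.logpdf(x)))

# Sanity check in d = 1: for two central Cauchy laws with scales g1, g2,
# a known closed form is KL = log((g1 + g2)^2 / (4 * g1 * g2)).
# The scatter matrix of a univariate Cauchy with scale g is [[g**2]].
g1, g2 = 1.0, 3.0
exact = np.log((g1 + g2) ** 2 / (4 * g1 * g2))
approx = kld_central_mcd_mc(np.array([[g1 ** 2]]), np.array([[g2 ** 2]]))
print(exact, approx)  # the two values should agree to about 2 decimals
```

The univariate check relies on the well-known closed form for the KLD between two central (zero-location) Cauchy distributions; in higher dimensions the Monte Carlo estimate can serve the same role against the paper's generic formula.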
| Main Authors: | Nizar Bouhlel, David Rousseau |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2022-06-01 |
| Series: | Entropy |
| Online Access: | https://www.mdpi.com/1099-4300/24/6/838 |
Similar Items

- Learning Kullback-Leibler Divergence-Based Gaussian Model for Multivariate Time Series Classification
  by: Gongqing Wu, et al.
  Published: (2019-01-01)
- Computation of Kullback–Leibler Divergence in Bayesian Networks
  by: Serafín Moral, et al.
  Published: (2021-08-01)
- Kullback–Leibler Divergence Measure for Multivariate Skew-Normal Distributions
  by: Reinaldo B. Arellano-Valle, et al.
  Published: (2012-09-01)
- Kullback–Leibler Divergence of Sleep-Wake Patterns Related with Depressive Severity in Patients with Epilepsy
  by: Mingsu Liu, et al.
  Published: (2023-05-01)
- Dynamic fine-tuning layer selection using Kullback–Leibler divergence
  by: Raphael Ngigi Wanjiku, et al.
  Published: (2023-05-01)