Amplifying Inter-Message Distance: On Information Divergence Measures in Big Data
Message identification (M-I) divergence is an important measure of the information distance between probability distributions, similar to Kullback-Leibler (K-L) and Rényi divergence. In fact, M-I divergence with a variable parameter can help characterize the distinction between two d...
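The abstract positions M-I divergence alongside the standard Kullback-Leibler and Rényi divergences. The record does not define M-I divergence itself, but the two reference measures it names can be sketched in pure Python for discrete distributions (the distributions `p` and `q` below are illustrative, not taken from the paper):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) for discrete distributions.

    p and q are sequences of probabilities over the same support;
    terms with p_i = 0 contribute nothing by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1).

    D_alpha(P||Q) = (1 / (alpha - 1)) * log(sum_i p_i^alpha * q_i^(1-alpha)).
    As alpha -> 1 it converges to the K-L divergence.
    """
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Illustrative distributions over a three-symbol alphabet.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))
print(renyi_divergence(p, q, 2.0))
```

The variable order `alpha` in the Rényi divergence plays a role analogous to the "variable parameter" the abstract attributes to M-I divergence: tuning it amplifies or attenuates how strongly the measure reacts to regions where the two distributions differ.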
| Field | Value |
|---|---|
| Main Authors | Rui She, Shanyun Liu, Pingyi Fan |
| Format | Article |
| Language | English |
| Published | IEEE, 2017-01-01 |
| Series | IEEE Access |
| Subjects | |
| Online Access | https://ieeexplore.ieee.org/document/8090523/ |
Similar Items

- Divergence Measures Estimation and Its Asymptotic Normality Theory Using Wavelets Empirical Processes III
  by: Amadou Diadié Bâ, et al.
- On Accuracy of PDF Divergence Estimators and Their Applicability to Representative Data Sampling
  by: Katarzyna Musial, et al. Published: (2011-07-01)
- Principles of Bayesian Inference Using General Divergence Criteria
  by: Jack Jewson, et al. Published: (2018-06-01)
- Empirical Squared Hellinger Distance Estimator and Generalizations to a Family of *α*-Divergence Estimators
  by: Rui Ding, et al. Published: (2023-04-01)
- Differential Message Importance Measure: A New Approach to the Required Sampling Number in Big Data Structure Characterization
  by: Shanyun Liu, et al. Published: (2018-01-01)