k-Nearest Neighbor Based Consistent Entropy Estimation for Hyperspherical Distributions
A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies a...
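The abstract describes a k-nearest-neighbor entropy estimator. As background, a minimal sketch of the classical Kozachenko–Leonenko knn entropy estimator for Euclidean samples is shown below (the article itself adapts the knn idea to hyperspherical distributions; this generic Euclidean version is an illustration, not the paper's estimator):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko knn entropy estimate (in nats) for an (n, d) sample."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # query returns each point itself at distance 0, so ask for k+1 neighbors
    # and take the distance to the k-th genuine neighbor
    r = tree.query(x, k=k + 1)[0][:, k]
    # log volume of the unit d-dimensional ball
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r))
```

For example, on a large standard normal sample in one dimension the estimate approaches the true differential entropy 0.5·log(2πe) ≈ 1.419 nats, illustrating the consistency property the article proves for the hyperspherical case.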
Main Authors: Michael E. Andrew, Shengqiao Li, Robert M. Mnatsakanov
Format: Article
Language: English
Published: MDPI AG (2011-03-01)
Series: Entropy
Online Access: http://www.mdpi.com/1099-4300/13/3/650/
Similar Items
- Contribution to Transfer Entropy Estimation via the k-Nearest-Neighbors Approach
  by: Jie Zhu, et al.
  Published: (2015-06-01)
- Entropy and the Kullback–Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation
  by: Marco Scutari
  Published: (2024-01-01)
- An Application of Entropy in Survey Scale
  by: Özgül Vupa, et al.
  Published: (2009-10-01)
- Entropy, Carnot Cycle, and Information Theory
  by: Mario Martinelli
  Published: (2018-12-01)
- Clustering Some MicroRNAs Expressed in the Breast Tissue Using Shannon Information Theory and Comparing the Results With UPGMA, Neighbor-Joining, and Maximum-Likelihood Methods
  by: Arezo Askari Rad, et al.
  Published: (2020-09-01)