Entropic Statistics: Concept, Estimation, and Application in Machine Learning and Knowledge Extraction


Bibliographic Details
Main Author: Jialin Zhang
Format: Article
Language: English
Published: MDPI AG, 2022-09-01
Series: Machine Learning and Knowledge Extraction
Subjects: discrete data; non-ordinal data; non-parametric estimation; entropic statistics; information-theoretic quantity
Online Access: https://www.mdpi.com/2504-4990/4/4/44
description The demand for machine learning and knowledge extraction methods has been booming due to the unprecedented surge in data volume and data quality. Nevertheless, challenges arise amid the emerging data complexity, as significant chunks of information and knowledge lie within the non-ordinal realm of data. To meet these challenges, researchers have developed numerous machine learning and knowledge extraction methods for various domain-specific problems. To characterize and extract information from non-ordinal data, these methods all draw on information theory, the subject established by Shannon’s landmark 1948 paper. This article reviews recent developments in entropic statistics, including estimation of Shannon’s entropy and its functionals (such as mutual information and Kullback–Leibler divergence), the concept of an entropic basis, generalized Shannon’s entropy (and its functionals), and their estimation and potential applications in machine learning and knowledge extraction. With knowledge of these recent developments in entropic statistics, researchers can customize existing machine learning and knowledge extraction methods for better performance, or develop new approaches to address emerging domain-specific challenges.
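The abstract refers to non-parametric estimation of Shannon’s entropy from discrete, non-ordinal data. As a minimal illustration (not taken from the article itself), the classical plug-in (maximum-likelihood) estimator and the well-known Miller–Madow bias correction can be sketched in Python; the sample data below is purely hypothetical:

```python
from collections import Counter
import math

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy, in nats:
    H_hat = -sum(p_hat * log p_hat) over observed categories."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the Miller-Madow bias correction (K - 1) / (2n),
    where K is the number of distinct categories observed in the sample."""
    n = len(samples)
    k = len(set(samples))
    return plugin_entropy(samples) + (k - 1) / (2 * n)

# Hypothetical non-ordinal sample: 8 observations over 3 categories.
data = ["a", "a", "b", "b", "c", "a", "b", "c"]
print(round(plugin_entropy(data), 4))        # ≈ 1.0822 nats
print(round(miller_madow_entropy(data), 4))  # ≈ 1.2072 nats
```

The plug-in estimator is known to be negatively biased in small samples; the Miller–Madow correction adds a simple first-order adjustment, which is why the corrected value above is slightly larger. The article surveys estimators beyond this basic sketch.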
id doaj.art-80fef4768e824b4ca3ae999493a4981a
institution Directory of Open Access Journals (DOAJ)
issn 2504-4990
doi 10.3390/make4040044
volume 4
issue 4
pages 865–887
affiliation Department of Mathematics and Statistics, Mississippi State University, Mississippi State, MS 39762, USA
topic discrete data
non-ordinal data
non-parametric estimation
entropic statistics
information-theoretic quantity