Towards a Unified Theory of Learning and Information
In this paper, we introduce the notion of “learning capacity” for algorithms that learn from data, which is analogous to the Shannon channel capacity for communication systems. We show how “learning capacity” bridges the gap between statistical learning theory and information theory, and we use...
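The abstract builds on an analogy with the Shannon channel capacity. As background only (not part of the article itself), here is a minimal sketch of that classical quantity for a binary symmetric channel, where the capacity is C = 1 − H(p) bits per channel use:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Shannon capacity of a binary symmetric channel with
    crossover probability p: C = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))   # pure noise: 0 bits per use
```

The article's “learning capacity” is defined for learning algorithms rather than channels; this snippet only illustrates the communication-theoretic quantity the analogy refers to.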
| Main Author: | Ibrahim Alabdulmohsin |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2020-04-01 |
| Series: | Entropy |
| Online Access: | https://www.mdpi.com/1099-4300/22/4/438 |
Similar Items

- Entropy and information theory /
  by: Gray, Robert M.
  Published: (2011)
- Informed assessments : an introduction to information, entropy and statistics /
  by: Jessop, Alan
  Published: (1995)
- Information Theory : Part I: An Introduction to the Fundamental Concepts /
  by: Ben-Naim, Arieh, 1934-
  Published: (2017)
- Information theory and statistical learning /
  by: Emmert-Streib, Frank, et al.
  Published: (2009)
- Designing a Novel Approach Using a Greedy and Information-Theoretic Clustering-Based Algorithm for Anonymizing Microdata Sets
  by: Reza Ahmadi Khatir, et al.
  Published: (2023-12-01)