Pushing for the Extreme: Estimation of Poisson Distribution from Low Count Unreplicated Data—How Close Can We Get?
Studies of learning algorithms typically concentrate on situations where a potentially ever-growing training sample is available. Yet, there can be situations (e.g., detection of differentially expressed genes on unreplicated data, or estimation of time delay in non-stationary gravitationally lensed ph...
| | |
|---|---|
| Main Author: | Peter Tiňo |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2013-04-01 |
| Series: | Entropy |
| Online Access: | http://www.mdpi.com/1099-4300/15/4/1202 |
Similar Items

- Entropy and the Kullback–Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation
  by: Marco Scutari
  Published: (2024-01-01)
- Ranking the Impact of Different Tests on a Hypothesis in a Bayesian Network
  by: Leila Schneps, et al.
  Published: (2018-11-01)
- Kullback–Leibler Divergence of Sleep-Wake Patterns Related with Depressive Severity in Patients with Epilepsy
  by: Mingsu Liu, et al.
  Published: (2023-05-01)
- Article Omission in Dutch Children with SLI: A Processing Approach
  by: Lizet van Ewijk, et al.
  Published: (2010-04-01)
- Posterior Averaging Information Criterion
  by: Shouhao Zhou
  Published: (2023-03-01)