Improved Guarantees for Learning GMMs
Mixtures of Gaussians (GMMs) are one of the most commonly used statistical models. They are typically used to model data coming from two or more heterogeneous sources and have applications in a wide variety of fields, including statistics, biology, physics, and computer science. A fundamental task at t...
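For readers unfamiliar with the model named in the abstract, a k-component Gaussian mixture is conventionally defined by the density below. This is standard background rather than text from the thesis, and the symbols (weights w_i, means \mu_i, covariances \Sigma_i) are generic notation, not the thesis's own.

```latex
% Standard k-component Gaussian mixture density (generic background, not quoted from the thesis):
% each sample is drawn from one of k Gaussian components, component i being chosen with probability w_i.
f(x) \;=\; \sum_{i=1}^{k} w_i \,\mathcal{N}(x;\,\mu_i,\,\Sigma_i),
\qquad w_i \ge 0, \quad \sum_{i=1}^{k} w_i = 1
```

Here \mathcal{N}(x;\,\mu_i,\,\Sigma_i) denotes the Gaussian density with mean \mu_i and covariance \Sigma_i; learning the mixture amounts to estimating the parameters (w_i, \mu_i, \Sigma_i) from samples drawn from f.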
| Main Author: | Liu, Allen |
| --- | --- |
| Other Authors: | Moitra, Ankur |
| Format: | Thesis |
| Published: | Massachusetts Institute of Technology, 2023 |
| Online Access: | https://hdl.handle.net/1721.1/147317 |
Similar Items
- Improving RSS-Based Ranging in LOS-NLOS Scenario Using GMMs
  by: Wang, Q, et al.
  Published: (2011)
- Guaranteed hierarchical reinforcement learning
  by: Ang, Riley Xile
  Published: (2024)
- Safety and robustness for deep learning with provable guarantees (keynote)
  by: Kwiatkowska, M
  Published: (2019)
- Generative Modeling with Guarantees
  by: Quach, Victor
  Published: (2023)
- Privacy with Estimation Guarantees
  by: Wang, Hao, et al.
  Published: (2021)