Tight bounds for the expected risk of linear classifiers and PAC-Bayes finite-sample guarantees
We analyze the expected risk of linear classifiers for a fixed weight vector in the “minimax” setting. That is, we analyze the worst-case risk among all data distributions with a given mean and covariance. We provide a simpler proof of the tight polynomial-tail bound for general random variables. Fo...
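For a single random variable, the classical tight polynomial-tail bound of the kind the abstract refers to is the one-sided Chebyshev (Cantelli) inequality, P(X − E[X] ≥ t) ≤ σ²/(σ² + t²); this is presumably the family of bounds being generalized, not the paper's own proof. The sketch below (the helper name `cantelli_bound` is ours) checks numerically that this bound dominates the empirical tail of a sample:

```python
import numpy as np

def cantelli_bound(t, sigma2):
    """One-sided Chebyshev (Cantelli) tail bound:
    P(X - E[X] >= t) <= sigma^2 / (sigma^2 + t^2), for t > 0."""
    return sigma2 / (sigma2 + t**2)

rng = np.random.default_rng(0)
# Asymmetric sample: exponential with mean 1 and variance 1.
x = rng.exponential(scale=1.0, size=1_000_000)
mu, sigma2 = x.mean(), x.var()

for t in (1.0, 2.0, 3.0):
    empirical = np.mean(x - mu >= t)     # observed tail frequency
    bound = cantelli_bound(t, sigma2)    # worst-case guarantee
    print(f"t={t}: empirical={empirical:.4f}  Cantelli bound={bound:.4f}")
```

The bound is distribution-free: it holds for every distribution with the given mean and variance, which is why worst-case ("minimax") analyses over a mean/covariance class, as in this article, naturally produce bounds of this polynomial-tail form.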
| Main Authors: | Honorio Carrillo, Jean; Jaakkola, Tommi S. |
|---|---|
| Other Authors: | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
| Format: | Article |
| Language: | en_US |
| Published: | Journal of Machine Learning Research, 2018 |
| Online Access: | http://hdl.handle.net/1721.1/113045 https://orcid.org/0000-0003-0238-6384 https://orcid.org/0000-0002-2199-0379 |
Similar Items
- Two-sided exponential concentration bounds for Bayes error rate and Shannon entropy
  by: Honorio, Jean, et al.
  Published: (2015)
- Tight certificates of adversarial robustness for randomly smoothed classifiers
  by: Lee, Guang-He, et al.
  Published: (2021)
- PAC-Bayes Unleashed: Generalisation Bounds with Unbounded Losses
  by: Haddouche, Maxime, et al.
  Published: (2021-10-01)
- PAC-Bayes Bounds on Variational Tempered Posteriors for Markov Models
  by: Banerjee, Imon, et al.
  Published: (2021-03-01)
- Inverse Covariance Estimation for High-Dimensional Data in Linear Time and Space: Spectral Methods for Riccati and Sparse Models
  by: Honorio, Jean, et al.
  Published: (2014)