PAC-Bayes Unleashed: Generalisation Bounds with Unbounded Losses


Bibliographic Details
Main Authors: Maxime Haddouche, Benjamin Guedj, Omar Rivasplata, John Shawe-Taylor
Format: Article
Language: English
Published: MDPI AG 2021-10-01
Series: Entropy
Subjects:
Online Access: https://www.mdpi.com/1099-4300/23/10/1330
Description
Summary: We present new PAC-Bayesian generalisation bounds for learning problems with unbounded loss functions. This extends the relevance and applicability of the PAC-Bayes learning framework, where most of the existing literature focuses on supervised learning problems with a bounded loss function (typically assumed to take values in the interval [0;1]). In order to relax this classical assumption, we propose to allow the range of the loss to depend on each predictor. This relaxation is captured by our new notion of HYPothesis-dependent rangE (HYPE). Based on this, we derive a novel PAC-Bayesian generalisation bound for unbounded loss functions, and we instantiate it on a linear regression problem. To make our theory usable by the largest audience possible, we include discussions on actual computation, practicality and limitations of our assumptions.
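As a hedged illustration of the HYPE notion described in the summary (the function names and constants below are ours, not taken from the paper): for linear regression under bounded data, the squared loss is unbounded over all predictors, yet each individual predictor w admits a finite range K(w). The sketch assumes inputs with ||x||₂ ≤ B_x and labels |y| ≤ B_y, so that (⟨w, x⟩ − y)² ≤ (||w||₂ · B_x + B_y)² =: K(w).

```python
import math
import random

def hype_range(w, B_x=1.0, B_y=1.0):
    """Hypothesis-dependent range K(w) for the squared loss, assuming
    ||x||_2 <= B_x and |y| <= B_y (illustrative choice, not the paper's code)."""
    norm_w = math.sqrt(sum(wi * wi for wi in w))
    return (norm_w * B_x + B_y) ** 2

random.seed(0)
w = [random.gauss(0, 1) for _ in range(5)]
K = hype_range(w)

# Empirical sanity check: sample data respecting the bounds and verify
# that every observed squared loss stays below K(w).
for _ in range(1000):
    x = [random.uniform(-1, 1) for _ in range(5)]
    scale = max(1.0, math.sqrt(sum(xi * xi for xi in x)))
    x = [xi / scale for xi in x]  # enforce ||x||_2 <= 1
    y = random.uniform(-1, 1)
    loss = (sum(wi * xi for wi, xi in zip(w, x)) - y) ** 2
    assert loss <= K
```

This captures why the HYPE relaxation is useful: PAC-Bayes bounds can then charge each hypothesis its own range K(w) instead of requiring a single uniform bound on the loss.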
ISSN:1099-4300