TY  - CPAPER
TI  - Comparing Comparators in Generalization Bounds
Y1  - 2024///
AV  - public
N2  - We derive generic information-theoretic and PAC-Bayesian generalization bounds involving an arbitrary convex comparator function, which measures the discrepancy between the training loss and the population loss. The bounds hold under the assumption that the cumulant-generating function (CGF) of the comparator is upper-bounded by the corresponding CGF within a family of bounding distributions. We show that the tightest possible bound is obtained with the comparator being the convex conjugate of the CGF of the bounding distribution, also known as the Cramér function. This conclusion applies more broadly to generalization bounds with a similar structure, confirming the near-optimality of known bounds for bounded and sub-Gaussian losses and leading to novel bounds under other bounding distributions.
UR  - https://proceedings.mlr.press/v238/hellstrom24a.html
ID  - discovery10192390
N1  - © The Authors 2024. Original content in this paper is licensed under the terms of the Creative Commons Attribution 4.0 International (CC BY 4.0) Licence (https://creativecommons.org/licenses/by/4.0/).
SP  - 73
EP  - 81
A1  - Hellström, Fredrik
A1  - Guedj, Benjamin
PB  - PMLR
T2  - Proceedings of Machine Learning Research
VL  - 238
ER  -