UCL Discovery

Learning via Wasserstein-Based High Probability Generalisation Bounds

Viallard, Paul; Haddouche, Maxime; Simsekli, Umut; Guedj, Benjamin; (2023) Learning via Wasserstein-Based High Probability Generalisation Bounds. In: Oh, Alice and Naumann, Tristan and Globerson, Amir and Saenko, Kate and Hardt, Moritz and Levine, Sergey, (eds.) Advances in Neural Information Processing Systems (NeurIPS 2023). NeurIPS Green open access

Full text: 69_learning_via_wasserstein_based.pdf (476kB)

Abstract

Minimising upper bounds on the population risk or the generalisation gap has been widely used in structural risk minimisation (SRM) -- this is in particular at the core of PAC-Bayesian learning. Despite its successes and an unfailing surge of interest in recent years, a limitation of the PAC-Bayesian framework is that most bounds involve a Kullback-Leibler (KL) divergence term (or its variations), which might exhibit erratic behaviour and fail to capture the underlying geometric structure of the learning problem -- hence restricting its use in practical applications. As a remedy, recent studies have attempted to replace the KL divergence in PAC-Bayesian bounds with the Wasserstein distance. Even though these bounds alleviate the aforementioned issues to a certain extent, they either hold in expectation, apply only to bounded losses, or are nontrivial to minimise in an SRM framework. In this work, we contribute to this line of research and prove novel Wasserstein distance-based PAC-Bayesian generalisation bounds for both batch learning with independent and identically distributed (i.i.d.) data, and online learning with potentially non-i.i.d. data. In contrast to prior art, our bounds are stronger in the sense that (i) they hold with high probability, (ii) they apply to unbounded (potentially heavy-tailed) losses, and (iii) they lead to optimisable training objectives that can be used in SRM. As a result, we derive novel Wasserstein-based PAC-Bayesian learning algorithms, and we illustrate their empirical advantage in a variety of experiments.
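The SRM idea the abstract describes -- choosing a posterior by minimising "empirical risk plus a distance penalty to the prior", with the KL term swapped for a Wasserstein distance -- can be sketched in a toy one-dimensional setting. Everything below (the Gaussian posterior family, the squared loss, the penalty coefficient `lam`, the grid search) is an illustrative assumption for exposition, not the paper's actual bound or algorithm:

```python
# Toy SRM sketch: minimise a Gibbs-style empirical risk plus a
# 1-Wasserstein penalty between posterior and prior samples,
# instead of the usual KL-divergence term.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

def empirical_risk(theta_samples, data):
    # Squared loss of a scalar predictor theta on scalar data,
    # averaged over posterior samples (a Gibbs risk).
    return np.mean((theta_samples[:, None] - data[None, :]) ** 2)

def srm_objective(mean, data, prior_samples, lam=0.1, n=500):
    # Unit-variance Gaussian posterior centred at `mean`; penalise
    # its empirical W1 distance to the prior samples.
    post = rng.normal(mean, 1.0, size=n)
    return empirical_risk(post, data) + lam * wasserstein_distance(post, prior_samples)

data = rng.normal(2.0, 1.0, size=200)       # observations centred at 2
prior = rng.normal(0.0, 1.0, size=500)      # prior centred at 0

# Crude grid search over the posterior mean: the penalty pulls the
# minimiser slightly from the data mean (2) towards the prior mean (0).
grid = np.linspace(-1.0, 4.0, 51)
best_mean = min(grid, key=lambda m: srm_objective(m, data, prior))
```

Unlike a KL penalty, the Wasserstein term stays finite and well-behaved even when posterior and prior have disjoint supports, which is one of the geometric advantages the abstract alludes to.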

Type: Proceedings paper
Title: Learning via Wasserstein-Based High Probability Generalisation Bounds
Event: 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Open access status: An open access version is available from UCL Discovery
Publisher version: https://papers.nips.cc/paper_files/paper/2023/hash...
Language: English
Additional information: This version is the author accepted manuscript. For information on re-use, please refer to the publisher's terms and conditions.
UCL classification: UCL
UCL > Provost and Vice Provost Offices > UCL BEAMS
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Computer Science
URI: https://discovery.ucl.ac.uk/id/eprint/10188665
