
Kernel Meta-Learning by Leveraging Natural Data Assumptions

Falk, John Isak Texas; (2023) Kernel Meta-Learning by Leveraging Natural Data Assumptions. Doctoral thesis (Ph.D), UCL (University College London). Green open access


Abstract

Data representation is integral to meta-learning and can be done effectively using kernels. Good performance requires algorithms that can learn kernels (or feature maps) from collections of tasks sampled from a meta-distribution. In this thesis we exploit natural assumptions on the meta-distribution to design meta-kernel learning algorithms, leading to two novel state-of-the-art (SOTA) meta-classification and meta-regression algorithms. The first method, Meta-Label Learning (MeLa) [Wan+22], leverages the meta-classification assumption that each task is generated from a global base dataset by randomly sampling C classes, anonymising the labels, and then sampling K instances from each class. The anonymity of task labels prohibits us from pooling task instances that share the same global class. MeLa recovers, in some cases perfectly, the underlying true classes of all task instances, allowing us to form a standard dataset and train a feature map in a supervised manner. This procedure leads to SOTA performance while being faster and more robust than alternative few-shot learning algorithms. For meta-regression the notion of global classes is not well defined. In Implicit Kernel Meta-Learning (IKML) [FCP22] we leverage the assumption that the optimal task regressors belong to an RKHS whose kernel is translation-invariant. We learn such a kernel from a kernel family characterized by a neural network through a pushforward model, using Bochner's theorem. The model is trained by optimizing the meta-loss with random-feature kernel ridge regression as the base algorithm. IKML achieves SOTA performance on two meta-regression benchmarks while allowing accuracy to be traded for speed at test time. We provide a bound on the excess transfer risk, which specifies the smallest number of random features needed to achieve optimal generalization performance.
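To make the IKML construction above concrete, the following is a minimal sketch assuming a random-Fourier-feature parameterisation: frequencies are drawn from a base Gaussian and passed through a pushforward map (a learned neural network in the thesis; an identity placeholder here), which by Bochner's theorem induces a translation-invariant kernel, with kernel ridge regression in the random-feature form as the base algorithm. The function names, toy task, and regularisation value are illustrative assumptions, not the thesis implementation.

    import numpy as np

    # Illustrative sketch in the spirit of IKML: a random-feature,
    # translation-invariant kernel plus ridge regression as base algorithm.
    rng = np.random.default_rng(0)

    def sample_frequencies(num_features, dim, pushforward=lambda z: z):
        # Draw base Gaussian samples and push them through a map; the
        # resulting frequency distribution determines the kernel via
        # Bochner's theorem. In IKML the pushforward is a learned network;
        # here it is an identity placeholder.
        return pushforward(rng.standard_normal((num_features, dim)))

    def random_fourier_features(X, omegas, biases):
        # Feature map phi(x) whose inner products approximate a
        # translation-invariant kernel k(x, y) ~ E_omega[cos(omega^T (x - y))].
        num_features = omegas.shape[0]
        return np.sqrt(2.0 / num_features) * np.cos(X @ omegas.T + biases)

    def ridge_fit_predict(Phi_train, y_train, Phi_test, lam=1e-3):
        # Kernel ridge regression in the random-feature (primal) form,
        # playing the role of the within-task base algorithm.
        num_features = Phi_train.shape[1]
        w = np.linalg.solve(Phi_train.T @ Phi_train + lam * np.eye(num_features),
                            Phi_train.T @ y_train)
        return Phi_test @ w

    # Toy regression task (illustrative only).
    dim, num_features = 1, 200
    X_train = rng.uniform(-3.0, 3.0, size=(20, dim))
    y_train = np.sin(2.0 * X_train[:, 0]) + 0.1 * rng.standard_normal(20)
    X_test = np.linspace(-3.0, 3.0, 100).reshape(-1, dim)

    omegas = sample_frequencies(num_features, dim)            # learned in IKML
    biases = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    Phi_train = random_fourier_features(X_train, omegas, biases)
    Phi_test = random_fourier_features(X_test, omegas, biases)
    y_pred = ridge_fit_predict(Phi_train, y_train, Phi_test)

In this sketch the number of random features is the knob that trades accuracy for speed at test time, which is the quantity the excess-transfer-risk bound mentioned in the abstract controls.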

Type: Thesis (Doctoral)
Qualification: Ph.D
Title: Kernel Meta-Learning by Leveraging Natural Data Assumptions
Open access status: An open access version is available from UCL Discovery
Language: English
Additional information: Copyright © The Author 2023. Original content in this thesis is licensed under the terms of the Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) Licence (https://creativecommons.org/licenses/by-sa/4.0/). Any third-party copyright material present remains the property of its respective owner(s) and is licensed under its existing terms. Access may initially be restricted at the author’s request.
UCL classification: UCL
UCL > Provost and Vice Provost Offices > UCL BEAMS
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Computer Science
URI: https://discovery.ucl.ac.uk/id/eprint/10182318