Denevi, G; Stamos, D; Ciliberto, C; Pontil, M; (2019) Online-Within-Online Meta-Learning. In: Wallach, H and Larochelle, H and Beygelzimer, A and d'Alche-Buc, F and Fox, E and Garnett, R, (eds.) Proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS 2019). (pp. 1-11). Neural Information Processing Systems (NeurIPS 2019).
Text: 9468-online-within-online-meta-learning.pdf - Published Version (2MB)
Abstract
We study the problem of learning a series of tasks in a fully online Meta-Learning setting. The goal is to exploit similarities among the tasks to incrementally adapt an inner online algorithm in order to incur a low averaged cumulative error over the tasks. We focus on a family of inner algorithms based on a parametrized variant of online Mirror Descent. The inner algorithm is incrementally adapted by an online Mirror Descent meta-algorithm using the corresponding within-task minimum regularized empirical risk as the meta-loss. In order to keep the process fully online, we approximate the meta-subgradients by the online inner algorithm. An upper bound on the approximation error allows us to derive a cumulative error bound for the proposed method. Our analysis can also be converted to the statistical setting by online-to-batch arguments. We instantiate two examples of the framework in which the meta-parameter is either a common bias vector or feature map. Finally, preliminary numerical experiments confirm our theoretical findings.
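To make the abstract's recipe concrete, below is a minimal sketch of the online-within-online structure it describes: an inner online mirror descent whose regularizer is centered at a meta-parameter (here a common bias vector), and an outer online mirror descent over tasks that updates the meta-parameter using an approximate meta-subgradient built from the inner iterates. This is not the authors' implementation: it assumes Euclidean mirror maps (so both levels reduce to online gradient descent), squared-loss linear regression tasks, and illustrative step sizes; the function names and synthetic task generator are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner_online_mirror_descent(task_data, bias, lam, inner_lr):
    """Inner online algorithm (Euclidean mirror descent) on the within-task
    regularized loss, with the regularizer centered at the meta-parameter
    `bias`.  Returns the averaged cumulative error on the task and an
    approximate meta-subgradient computed from the inner iterates."""
    X, y = task_data
    w = bias.copy()
    cum_loss = 0.0
    meta_grad_sum = np.zeros_like(bias)
    for x_t, y_t in zip(X, y):
        pred = w @ x_t
        cum_loss += 0.5 * (pred - y_t) ** 2
        # Subgradient of the regularized within-task loss at the current iterate.
        g = (pred - y_t) * x_t + lam * (w - bias)
        # Approximate the meta-subgradient of the within-task minimum
        # regularized risk by lam * (bias - w), accumulated over rounds.
        meta_grad_sum += lam * (bias - w)
        w = w - inner_lr * g  # Euclidean mirror (gradient) step
    n = len(y)
    return cum_loss / n, meta_grad_sum / n

def meta_online_mirror_descent(tasks, lam=1.0, inner_lr=0.1, meta_lr=0.05):
    """Outer online mirror descent over the sequence of tasks: the bias
    vector is updated once per task using the approximate meta-subgradient
    returned by the inner algorithm."""
    d = tasks[0][0].shape[1]
    bias = np.zeros(d)
    avg_errors = []
    for task in tasks:
        err, meta_grad = inner_online_mirror_descent(task, bias, lam, inner_lr)
        avg_errors.append(err)
        bias = bias - meta_lr * meta_grad  # Euclidean meta step
    return bias, avg_errors

# Illustrative data: related linear regression tasks clustered around a common bias.
d, n_tasks, n_points = 10, 50, 30
common = rng.normal(size=d)
tasks = []
for _ in range(n_tasks):
    w_star = common + 0.1 * rng.normal(size=d)
    X = rng.normal(size=(n_points, d))
    y = X @ w_star + 0.01 * rng.normal(size=n_points)
    tasks.append((X, y))

bias, errors = meta_online_mirror_descent(tasks)
print("avg cumulative error, first 5 tasks:", np.round(errors[:5], 3))
print("avg cumulative error, last 5 tasks :", np.round(errors[-5:], 3))
```

Under these assumptions the average within-task error should tend to decrease across tasks as the bias adapts toward the common structure, mirroring the paper's goal of a low averaged cumulative error; the paper's second instantiation, where the meta-parameter is a feature map rather than a bias vector, follows the same two-level pattern.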
| Type: | Proceedings paper |
|---|---|
| Title: | Online-Within-Online Meta-Learning |
| Event: | 33rd Conference on Neural Information Processing Systems (NeurIPS) |
| Location: | Vancouver, Canada |
| Dates: | 8th-14th December 2019 |
| Open access status: | An open access version is available from UCL Discovery |
| Publisher version: | https://papers.nips.cc/paper/9468-online-within-on... |
| Language: | English |
| Additional information: | This version is the version of record. For information on re-use, please refer to the publisher's terms and conditions. |
| UCL classification: | UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Computer Science |
| URI: | https://discovery.ucl.ac.uk/id/eprint/10109899 |