
UCL Discovery


Improved loss bounds for multiple kernel learning

Hussain, Z; Shawe-Taylor, J; (2011) Improved loss bounds for multiple kernel learning. Journal of Machine Learning Research, 15, pp. 370-377. Gold open access.


We propose two new generalization error bounds for multiple kernel learning (MKL). First, using the bound of Srebro and Ben-David (2006) as a starting point, we derive a new version which uses a simple counting argument for the choice of kernels in order to generate a tighter bound when 1-norm regularization (sparsity) is imposed in the kernel learning problem. The second bound is a Rademacher complexity bound which is additive in the (logarithmic) kernel complexity and margin term. This dependence is superior to all previously published Rademacher bounds for learning a convex combination of kernels, including the recent bound of Cortes et al. (2010), which exhibits a multiplicative interaction. We illustrate the tightness of our bounds with simulations. Copyright 2011 by the authors.
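The function class considered in the abstract is built from a convex combination of base kernels, where the combination weights lie on the simplex (the 1-norm constraint that induces sparsity). As a minimal illustrative sketch, not the authors' method, the snippet below forms such a combined kernel from a few assumed base kernels on toy data and Monte-Carlo estimates the empirical Rademacher complexity of the associated unit-norm function class, which for a kernel matrix K equals E_sigma[sqrt(sigma' K sigma)] / n. All data, kernel choices, and weights here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and three assumed base kernels (linear plus two RBF widths).
X = rng.normal(size=(50, 5))

def rbf(X, gamma):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

kernels = [X @ X.T, rbf(X, 0.1), rbf(X, 1.0)]

# Convex combination: weights mu are nonnegative and sum to one
# (the 1-norm / simplex constraint discussed in the abstract).
mu = np.array([0.2, 0.5, 0.3])
K = sum(m * Km for m, Km in zip(mu, kernels))

# Monte-Carlo estimate of the empirical Rademacher complexity of the
# unit-norm kernel class: E_sigma[ sqrt(sigma^T K sigma) ] / n.
n = K.shape[0]
sigmas = rng.choice([-1.0, 1.0], size=(1000, n))
rad = np.mean([np.sqrt(max(s @ K @ s, 0.0)) for s in sigmas]) / n
print(f"estimated empirical Rademacher complexity: {rad:.4f}")
```

Varying mu over the simplex and re-estimating this quantity gives a concrete feel for how the capacity of the combined-kernel class depends on the kernel weights, which is the quantity the paper's additive bound controls.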

Type: Article
Title: Improved loss bounds for multiple kernel learning
Open access status: An open access publication
UCL classification: UCL > School of BEAMS
UCL > School of BEAMS > Faculty of Engineering Science
URI: http://discovery.ucl.ac.uk/id/eprint/1366191
