%0 Journal Article
%@ 1532-4435
%A Sun, Shiliang
%A Shawe-Taylor, John
%D 2010
%F discovery:399130
%J Journal of Machine Learning Research
%K semi-supervised learning, Fenchel-Legendre conjugate, representer theorem, multi-view regularization, support vector machine, statistical learning theory
%P 2423-2455
%T Sparse Semi-supervised Learning Using Conjugate Functions
%U https://discovery.ucl.ac.uk/id/eprint/399130/
%V 11
%X In this paper, we propose a general framework for sparse semi-supervised learning, which uses a small portion of the unlabeled data together with a few labeled data to represent target functions, and thus has the merit of accelerating function evaluations when predicting the output of a new example. This framework makes use of Fenchel-Legendre conjugates to rewrite a convex insensitive loss involving a regularization with unlabeled data, and is applicable to a family of semi-supervised learning methods such as multi-view co-regularized least squares and single-view Laplacian support vector machines (SVMs). As an instantiation of this framework, we propose sparse multi-view SVMs, which use a squared ε-insensitive loss. The resultant optimization is an inf-sup problem, and the optimal solutions are shown to have saddle-point properties. We present a globally optimal iterative algorithm to solve the problem. We give a margin bound on the generalization error of the sparse multi-view SVMs and derive the empirical Rademacher complexity of the induced function class. Experiments on artificial and real-world data show their effectiveness. We further give a sequential training approach to demonstrate their potential for use in large-scale problems, and provide encouraging experimental results indicating the efficacy of the margin bound and the empirical Rademacher complexity in characterizing the role of unlabeled data in semi-supervised learning.
%Z Copyright © 2010 Shiliang Sun and John Shawe-Taylor.