Revised loss bounds for the set covering machine and sample-compression loss bounds for imbalanced data.
Journal of Machine Learning Research, pp. 2533–2549.
Marchand and Shawe-Taylor (2002) proposed a loss bound for the set covering machine that depends on the observed fraction of positive examples and on the classifier's performance on the positive training examples. We show that this loss bound is incorrect. We then propose a loss bound, valid for any sample-compression learning algorithm (including the set covering machine), that depends on the same quantities. We also numerically compare the loss bound proposed in this paper with the incorrect bound, with the original SCM bound, and with a recently proposed loss bound of Marchand and Sokolova (2005), which does not depend on the observed fraction of positive examples, and show that the latter bounds can be substantially larger than the new bound in the presence of imbalanced misclassifications.
Open access status: An open access publication
Keywords: set covering machines, sample-compression, loss bounds