UCL Discovery

When Sparse Neural Network Meets Label Noise Learning: A Multistage Learning Framework

Jiang, Runqing; Yan, Yan; Xue, Jing-Hao; Wang, Biao; Wang, Hanzi; (2022) When Sparse Neural Network Meets Label Noise Learning: A Multistage Learning Framework. IEEE Transactions on Neural Networks and Learning Systems pp. 1-15. 10.1109/tnnls.2022.3188799. (In press). Green open access

Text: RunqingJiang-TNNLS-sparseNN-labelnoise.pdf - Accepted Version (Download, 1MB)

Abstract

Recent methods in network pruning have shown that a dense neural network contains a sparse subnetwork (called a winning ticket) that can achieve test accuracy similar to its dense counterpart with far fewer parameters. Generally, these methods search for winning tickets on well-labeled data. Unfortunately, in many real-world applications the training data are unavoidably contaminated with noisy labels, which degrades the performance of these methods. To address this problem, we propose a novel two-stream sample selection network (TS3-Net), consisting of a sparse subnetwork and a dense subnetwork, to effectively identify the winning ticket under noisy labels. The training of TS3-Net is an iterative procedure that alternates between training both subnetworks and pruning the smallest-magnitude weights of the sparse subnetwork. In particular, we develop a multistage learning framework, comprising a warm-up stage, a semisupervised alternate learning stage, and a label refinement stage, to progressively train the two subnetworks. In this way, the classification capability of the sparse subnetwork can be gradually improved at a high sparsity level. Extensive experimental results on both synthetic and real-world noisy datasets (including MNIST, CIFAR-10, CIFAR-100, ANIMAL-10N, Clothing1M, and WebVision) demonstrate that our proposed method achieves state-of-the-art performance with very small memory consumption for label noise learning. Code is available at https://github.com/Runqing-forMost/TS3-Net/tree/master.
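The core pruning step the abstract describes — repeatedly removing the smallest-magnitude weights from the sparse subnetwork between training rounds — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (see their repository for that); the function name, the mock weight vector, and the per-round prune rate are illustrative assumptions.

```python
import numpy as np

def magnitude_prune(weights, mask, prune_rate):
    """Zero out a fraction of the smallest-magnitude surviving weights.

    weights: flat array of weights
    mask: binary mask (1 = weight kept, 0 = pruned)
    prune_rate: fraction of the *remaining* weights to prune this round
    Returns the updated mask; already-pruned weights stay pruned.
    """
    surviving = np.flatnonzero(mask)
    k = int(len(surviving) * prune_rate)
    if k == 0:
        return mask
    # Indices of the k surviving weights with the smallest magnitudes.
    to_prune = surviving[np.argsort(np.abs(weights[surviving]))[:k]]
    new_mask = mask.copy()
    new_mask[to_prune] = 0
    return new_mask

# Iterative procedure: alternate (mock) training with pruning rounds.
rng = np.random.default_rng(0)
w = rng.normal(size=1000)          # stand-in for a layer's weights
mask = np.ones_like(w)
for _ in range(5):
    # ... train both subnetworks here (training loop omitted) ...
    mask = magnitude_prune(w, mask, prune_rate=0.2)
sparsity = 1.0 - mask.mean()       # fraction of weights removed
```

Pruning 20% of the *remaining* weights per round (rather than 20% of the original total) gives a geometric sparsification schedule, which is how iterative magnitude pruning is typically run in lottery-ticket-style methods.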

Type: Article
Title: When Sparse Neural Network Meets Label Noise Learning: A Multistage Learning Framework
Open access status: An open access version is available from UCL Discovery
DOI: 10.1109/tnnls.2022.3188799
Publisher version: http://dx.doi.org/10.1109/tnnls.2022.3188799
Language: English
Additional information: This version is the author accepted manuscript. For information on re-use, please refer to the publisher’s terms and conditions.
Keywords: Training, Noise measurement, Neural networks, Data models, Computational modeling, Training data, Task analysis
UCL classification: UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Maths and Physical Sciences
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Maths and Physical Sciences > Dept of Statistical Science
UCL > Provost and Vice Provost Offices > UCL BEAMS
UCL
URI: https://discovery.ucl.ac.uk/id/eprint/10152149
