UCL Discovery

Spike-inspired Rank Coding for Fast and Accurate Recurrent Neural Networks

Jeffares, Alan; Guo, Qinghai; Stenetorp, Pontus; Moraitis, Timoleon; (2022) Spike-inspired Rank Coding for Fast and Accurate Recurrent Neural Networks. In: The Tenth International Conference on Learning Representations. Green open access

Full text: spike_inspired_rank_coding_for.pdf - Published Version (8MB)

Abstract

Biological spiking neural networks (SNNs) can temporally encode information in their outputs, e.g. in the rank order in which neurons fire, whereas artificial neural networks (ANNs) conventionally do not. As a result, models of SNNs for neuromorphic computing are regarded as potentially more rapid and efficient than ANNs when dealing with temporal input. On the other hand, ANNs are simpler to train, and usually achieve superior performance. Here we show that temporal coding such as rank coding (RC) inspired by SNNs can also be applied to conventional ANNs such as LSTMs, and leads to computational savings and speedups. In our RC for ANNs, we apply backpropagation through time using the standard real-valued activations, but only from a strategically early time step of each sequential input example, decided by a threshold-crossing event. Learning then naturally also incorporates _when_ to produce an output, without other changes to the model or the algorithm. Both the forward and the backward training pass can be significantly shortened by skipping the remaining input sequence after that first event. RC-training also significantly reduces time-to-insight during inference, with a minimal decrease in accuracy. The desired speed-accuracy trade-off is tunable by varying the threshold or a regularization parameter that rewards output entropy. We demonstrate these properties in two toy problems of sequence classification, and in a temporally-encoded MNIST dataset where our RC model achieves 99.19% accuracy after the first input time-step, outperforming the state of the art in temporal coding with SNNs, as well as in spoken-word classification of Google Speech Commands, outperforming non-RC-trained early inference with LSTMs.
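The early-exit mechanism the abstract describes — emit a prediction at the first time step whose output crosses a confidence threshold, and skip the rest of the sequence — can be sketched in a few lines. This is a minimal illustration with a plain NumPy recurrent cell, not the authors' code: the weight matrices, the tanh cell, and the softmax-confidence threshold test are all assumptions chosen to show the control flow only; the paper applies the idea to LSTMs, and during RC training, backpropagation through time would run only up to the returned stopping step.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def rank_coded_inference(inputs, W_in, W_rec, W_out, threshold=0.9):
    """Run a simple recurrent net over `inputs`, returning a class
    prediction at the first time step whose softmax confidence crosses
    `threshold` (the threshold-crossing event of the abstract).

    Returns (predicted_class, stop_step, output_probs)."""
    h = np.zeros(W_rec.shape[0])
    for t, x in enumerate(inputs):
        h = np.tanh(W_in @ x + W_rec @ h)   # recurrent state update
        p = softmax(W_out @ h)              # output distribution at step t
        if p.max() >= threshold:
            # Early exit: the remaining input steps are never processed,
            # shortening both the forward pass and (in training) BPTT.
            return int(p.argmax()), t, p
    # Threshold never crossed: fall back to the final step's output.
    return int(p.argmax()), len(inputs) - 1, p
```

Varying `threshold` trades speed against accuracy, as in the paper: a low threshold exits after very few steps, while a threshold above 1 forces the model to consume the whole sequence.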

Type: Proceedings paper
Title: Spike-inspired Rank Coding for Fast and Accurate Recurrent Neural Networks
Event: ICLR 2022
Open access status: An open access version is available from UCL Discovery
Publisher version: https://openreview.net/forum?id=CUUjknuhkd
Language: English
Additional information: This version is the author accepted manuscript. For information on re-use, please refer to the publisher’s terms and conditions.
UCL classification: UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Computer Science
UCL > Provost and Vice Provost Offices > UCL BEAMS
UCL
URI: https://discovery.ucl.ac.uk/id/eprint/10154330
