Speakers
Description
It has long been argued that deep learning in its current form is becoming infeasible, and new methods are increasingly being explored. One prominent idea is to look to the brain for inspiration, as it operates under conditions where low energy consumption and fast reaction times are of critical importance. A central aspect of neural processing, and of neuromorphic systems as well, is the use of spikes as a means of communication. However, the discrete and therefore discontinuous nature of spikes has long made it difficult to apply optimization algorithms based on differentiable loss functions, a limitation that could previously only be bypassed by resorting to approximate methods.
Our solution to this problem is to operate on spike times, as these are inherently differentiable. We sketch the derivation of an exact learning rule for spike times in networks of leaky integrate-and-fire neurons, implementing error backpropagation in hierarchical spiking networks. Furthermore, we present our implementation on the BrainScaleS-2 neuromorphic system and demonstrate that it harnesses the system's speed and energy characteristics. We explicitly address issues relevant for both biological plausibility and applicability to neuromorphic substrates, incorporating dynamics with finite time constants and optimizing the backward pass with respect to substrate variability.
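To illustrate the underlying principle, the following minimal Python sketch shows why spike times are differentiable despite the discontinuous nature of spikes. It is not the BrainScaleS-2 implementation; the alpha-shaped PSP with matched time constants, the threshold, and all input values are illustrative assumptions. The first spike time t* of a leaky integrate-and-fire neuron is defined implicitly by the threshold condition V(t*) = θ, so implicit differentiation yields the exact gradient dt*/dw_i = -(∂V/∂w_i)/(dV/dt) at t*.

```python
import numpy as np

TAU = 1.0    # shared membrane/synaptic time constant (illustrative units)
THETA = 1.0  # firing threshold (illustrative)

def psp(t, t_in):
    # Normalized alpha-shaped PSP for tau_syn = tau_mem; peaks at 1.
    s = np.maximum(t - t_in, 0.0) / TAU
    return s * np.exp(1.0 - s)

def v(t, w, t_in):
    # Membrane potential before the first output spike (no reset yet).
    return float(np.dot(w, psp(t, t_in)))

def dv_dt(t, w, t_in):
    # Temporal derivative of v, needed for the implicit-function gradient.
    s = np.maximum(t - t_in, 0.0) / TAU
    active = t > t_in  # inputs contribute only after their spike time
    return float(np.dot(w, active * (1.0 - s) * np.exp(1.0 - s))) / TAU

def first_spike_time(w, t_in, t_max=10.0):
    # Bracket the first threshold crossing on a grid, then bisect.
    grid = np.linspace(0.0, t_max, 10001)
    vals = np.array([v(t, w, t_in) for t in grid])
    above = np.nonzero(vals >= THETA)[0]
    if len(above) == 0:
        return None  # neuron stays silent
    lo, hi = grid[above[0] - 1], grid[above[0]]
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if v(mid, w, t_in) < THETA else (lo, mid)
    return hi

def spike_time_grad(w, t_in, t_star):
    # Implicit differentiation of v(t*, w) = theta:
    # dt*/dw_i = -(dv/dw_i) / (dv/dt), evaluated at t*.
    return -psp(t_star, t_in) / dv_dt(t_star, w, t_in)

if __name__ == "__main__":
    t_in = np.array([0.1, 0.3, 0.5])  # input spike times (illustrative)
    w = np.array([0.8, 0.7, 0.6])     # synaptic weights (illustrative)

    t_star = first_spike_time(w, t_in)
    grad = spike_time_grad(w, t_in, t_star)

    # Sanity check against finite differences.
    eps = 1e-6
    for i in range(len(w)):
        w_p = w.copy(); w_p[i] += eps
        num = (first_spike_time(w_p, t_in) - t_star) / eps
        print(f"dt*/dw[{i}]: exact {grad[i]:+.6f}, numeric {num:+.6f}")
```

The exact gradients match the finite-difference estimates, so standard gradient-based optimizers can act directly on spike times without any smoothing of the spiking nonlinearity.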
Our approach shows the potential benefits of using spikes for fast and energy-efficient inference on spiking neuromorphic hardware: on the BrainScaleS-2 chip, we classify the entire MNIST test set with an energy cost of 8.4 µJ per classification.
Acknowledgements
Our work has greatly benefited from access to the Fenix Infrastructure resources, which are partially funded by the European Union’s Horizon 2020 research and innovation programme through the ICEI project under grant agreement No. 800858.
We gratefully acknowledge funding from the European Union under grant agreements 604102, 720270, 785907, 945539 (HBP) and the Manfred Stärk Foundation.
References
- Göltz, Julian, Laura Kriener, et al. "Fast and energy-efficient neuromorphic deep learning with first-spike times." Nature Machine Intelligence 3.9 (2021): 823–835.
- Kriener, Laura, Julian Göltz, and Mihai A. Petrovici. "The Yin-Yang dataset." Neuro-Inspired Computational Elements Conference. 2022.
| Field | Value |
| --- | --- |
| Preferred form of presentation | Talk & (optional) poster |
| Topic area | Simulator technology and performance |
| Speaker time zone | UTC+2 |
| I agree to the copyright and license terms | Yes |
| I agree to the declaration of honor | Yes |