23–24 Jun 2022
Virtual
Europe/Berlin timezone

Fast and energy-efficient neuromorphic deep learning with first-spike times

K-3
24 Jun 2022, 09:00
45m
Virtual

Keynote (Main track)

Speakers

Julian Göltz (Kirchhoff-Institute for Physics, Heidelberg University; Department of Physiology, University of Bern)
Laura Kriener (Department of Physiology, University of Bern; Kirchhoff-Institute for Physics, Heidelberg University)

Description

It has long been argued that deep learning in its current form is becoming computationally infeasible, and new methods are increasingly being explored. One prominent idea is to look to the brain for inspiration, where low energy consumption and fast reaction times are of critical importance. A central aspect of neural processing, and also of neuromorphic systems, is the use of spikes as a means of communication. However, the discrete and therefore discontinuous nature of spikes long made it difficult to apply optimization algorithms based on differentiable loss functions; this could only be bypassed by resorting to approximate methods.
Our solution to this problem is to operate on spike timings, as these are inherently differentiable. We sketch the derivation of an exact learning rule for spike times in networks of leaky integrate-and-fire neurons, implementing error backpropagation in hierarchical spiking networks. Furthermore, we present our implementation on the BrainScaleS-2 neuromorphic system and demonstrate its capability of harnessing the system's speed and energy characteristics. We explicitly address issues relevant to both biological plausibility and applicability to neuromorphic substrates, incorporating dynamics with finite time constants and optimizing the backward pass with respect to substrate variability.
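The core idea, that a neuron's first-spike time is a differentiable function of its input, can be illustrated with a minimal sketch. This toy uses a LIF neuron under constant input current with hypothetical parameter values, and gradient descent on a simple timing loss; the exact learning rule derived in our paper, with exponential synaptic kernels and full hierarchical networks, is considerably more involved:

```python
import math

THETA = 1.0   # firing threshold (hypothetical units)
TAU = 10.0    # membrane time constant (hypothetical, ms)

def spike_time(current):
    """Time to first spike of a LIF neuron under constant input current.

    The membrane potential follows V(t) = current * (1 - exp(-t / TAU)),
    so the threshold crossing V(t) = THETA has a closed form.
    Returns None if the neuron never reaches threshold.
    """
    if current <= THETA:
        return None
    return -TAU * math.log(1.0 - THETA / current)

def dspike_time_dcurrent(current):
    """Analytic derivative of the spike time w.r.t. the input current.

    It is negative: stronger input makes the neuron fire earlier.
    """
    return -TAU * THETA / (current * (current - THETA))

# Gradient descent on a timing loss L = (t - t_target)^2 / 2:
# nudge the input current until the neuron spikes at the target time.
current, t_target, lr = 2.0, 5.0, 0.05
for _ in range(200):
    t = spike_time(current)
    current -= lr * (t - t_target) * dspike_time_dcurrent(current)
```

Because the spike time has an exact derivative, no surrogate gradients are needed; this is what allows error backpropagation to be formulated directly on first-spike times.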
Our approach shows the potential benefits of using spikes to enable fast and energy-efficient inference on spiking neuromorphic hardware: on the BrainScaleS-2 chip, we classify the entire MNIST test set with an energy per classification of 8.4 µJ.

Acknowledgements

Our work has greatly benefitted from access to the Fenix Infrastructure resources, which are partially funded from the European Union’s Horizon 2020 research and innovation programme through the ICEI project under the grant agreement No. 800858.
We gratefully acknowledge funding from the European Union under grant agreements 604102, 720270, 785907, 945539 (HBP) and the Manfred Stärk Foundation.

References

  • Göltz, Julian and Kriener, Laura et al. "Fast and energy-efficient neuromorphic deep learning with first-spike times." Nature Machine Intelligence 3.9 (2021): 823–835.
  • Kriener, Laura, Julian Göltz, and Mihai A. Petrovici. "The yin-yang dataset." Neuro-Inspired Computational Elements Conference. 2022.
Preferred form of presentation: Talk & (optional) poster
Topic area: simulator technology and performance
Speaker time zone: UTC+2
I agree to the copyright and license terms: Yes
I agree to the declaration of honor: Yes

Primary authors

Julian Göltz (Kirchhoff-Institute for Physics, Heidelberg University; Department of Physiology, University of Bern)
Laura Kriener (Department of Physiology, University of Bern; Kirchhoff-Institute for Physics, Heidelberg University)

Co-authors

Andreas Baumbach (Kirchhoff-Institute for Physics, Heidelberg University)
Sebastian Billaudelle (Kirchhoff-Institute for Physics, Heidelberg University)
Oliver Breitwieser (Kirchhoff-Institute for Physics, Heidelberg University)
Benjamin Cramer (Kirchhoff-Institute for Physics, Heidelberg University)
Dominik Dold (Kirchhoff-Institute for Physics, Heidelberg University)
Akos F. Kungl (Kirchhoff-Institute for Physics, Heidelberg University)
Walter Senn (Department of Physiology, University of Bern)
Johannes Schemmel (Kirchhoff-Institute for Physics, Heidelberg University)
Karlheinz Meier (Kirchhoff-Institute for Physics, Heidelberg University)
Mihai A. Petrovici (Kirchhoff-Institute for Physics, Heidelberg University; Department of Physiology, University of Bern)
