17–18 Jun 2025
Virtual
Europe/Berlin timezone

Mitigating Catastrophic Forgetting in Biologically Plausible Learning

P-1
18 Jun 2025, 14:45
2m
Zoom

Poster & advertising flash talk (Poster teasers)

Speaker

Jesus Andres Espinoza Valverde (School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany)

Description

Eligibility Propagation (e-prop) is a biologically inspired learning algorithm that provides a plausible alternative to Backpropagation Through Time (BPTT) [1]. Despite its biological realism, e-prop inherits BPTT's vulnerability to catastrophic forgetting (CF), the phenomenon in which a neural network loses previously acquired knowledge when trained on new tasks.
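To illustrate the locality that distinguishes e-prop from BPTT, the following is a highly simplified sketch of a three-factor update: each synapse keeps a local eligibility trace (here, a leaky accumulation of presynaptic activity), and the weight change is the product of that trace with a per-neuron learning signal. This is an illustrative toy, not the exact update rules derived in [1]; the function name and shapes are our own.

```python
import numpy as np

def eprop_step(w, pre, learning_signal, trace, decay=0.9, lr=0.01):
    """One e-prop-style local update (illustrative sketch only).

    w               : (n_post, n_pre) weight matrix
    pre             : (n_pre,) presynaptic activity at this time step
    learning_signal : (n_post,) per-neuron learning signal (third factor)
    trace           : (n_post, n_pre) eligibility traces carried between steps
    """
    # Leaky accumulation of presynaptic activity into per-synapse traces;
    # no backward pass through time is needed.
    trace = decay * trace + pre[None, :]
    # Three-factor update: local trace gated by the per-neuron learning signal.
    w = w + lr * learning_signal[:, None] * trace
    return w, trace
```

Because both factors are available locally at the synapse, the update can run online as the network processes its input.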

Elastic Weight Consolidation (EWC) is a prominent regularization-based method for reducing CF in artificial neural networks [2]. Because it is compatible with the locality constraints of biological synapses, EWC is a promising candidate for integration into biologically plausible learning algorithms. However, EWC relies on separate phases for task learning and weight-importance estimation and requires explicit task boundaries, which limits both its biological plausibility and its applicability to real-world scenarios that lack clear task delineations.

In this poster, we propose integrating EWC principles directly into the e-prop algorithm. Our approach estimates Fisher information and weight importance concurrently with learning, eliminating the need for separate training and estimation phases. Furthermore, we extend the neural network with a Bayesian-inference-based novelty detection layer that autonomously identifies task transitions without explicit external cues. The detected novelty is signaled to all neurons, triggering effective parameter consolidation.
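The two ingredients above can be caricatured in a few lines. The sketch below is hypothetical and not the poster's actual scheme: it maintains a running diagonal-Fisher estimate from squared gradients (a standard online approximation) and flags a task switch when the current loss is a large outlier relative to its own running statistics.

```python
import numpy as np

def online_fisher_update(fisher, grad, decay=0.99):
    """Running diagonal-Fisher estimate, updated during learning itself.

    The squared gradient of the log-likelihood is a standard approximation
    to the diagonal Fisher information; the exponential moving average lets
    the estimate track learning without a separate estimation phase.
    """
    return decay * fisher + (1.0 - decay) * grad ** 2

def novelty_signal(nll, running_mean, running_var, z_thresh=3.0, decay=0.9):
    """Toy novelty detector (hypothetical): flag a task switch when the
    current negative log-likelihood is a z_thresh-sigma outlier relative
    to its running mean and variance, then update the running statistics."""
    z = (nll - running_mean) / np.sqrt(running_var + 1e-8)
    running_mean = decay * running_mean + (1.0 - decay) * nll
    running_var = decay * running_var + (1.0 - decay) * (nll - running_mean) ** 2
    return z > z_thresh, running_mean, running_var
```

When the novelty flag fires, the current parameters and Fisher estimates would be consolidated (stored as the new anchor for the EWC penalty), with no externally supplied task boundary.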

We evaluate our proposed method on the Permuted Neuromorphic MNIST dataset and demonstrate substantial reductions in catastrophic forgetting, significantly improving the network's continual learning capabilities without compromising task-specific performance. Our results represent a step forward in developing biologically plausible learning algorithms with enhanced robustness for continual learning.

References

[1] Bellec, Guillaume, Franz Scherr, Anand Subramoney, Elias Hajek, Darjan Salaj, Robert Legenstein, and Wolfgang Maass. 2020. “A Solution to the Learning Dilemma for Recurrent Networks of Spiking Neurons.” Nature Communications 11: 3625.
[2] Kirkpatrick, James, Razvan Pascanu, Neil Rabinowitz, Joel Veness, Guillaume Desjardins, Andrei A. Rusu, Kieran Milan, et al. 2017. “Overcoming Catastrophic Forgetting in Neural Networks.” Proceedings of the National Academy of Sciences of the United States of America 114 (13): 3521–26.

Preferred form of presentation Poster & advertising flash talk
Topic area Models and applications
Keywords Continual learning, Catastrophic forgetting
Speaker time zone UTC+1
I agree to the copyright and license terms Yes
I agree to the declaration of honor Yes

Primary author

Jesus Andres Espinoza Valverde (School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany)

Co-author

Mr Matthias Bolten (School of Mathematics and Natural Sciences, Bergische Universität Wuppertal, Wuppertal, Germany)
