NEST Conference 2023

Europe/Berlin
Virtual

Description

The NEST Initiative is excited to invite everyone interested in Neural Simulation Technology and the NEST Simulator to the virtual NEST Conference 2023. The NEST Conference provides an opportunity for the NEST Community to meet, exchange success stories, swap advice, and learn about current developments in and around spiking network simulation with NEST and its applications. We particularly encourage young scientists to participate in the conference!

The virtual NEST Conference 2023

The NEST Conference 2023 will again be held as a virtual event on

Thursday/Friday, 15/16 June 2023.

 

The Call for Contributions and registration are now closed.

 

Programme

Keynotes

Emre Neftci - Learning with Brain-Inspired Computers

Alice Geminiani - Simulating the neural bases of pathological behaviors with NEST: a use case on dystonia

Claire Wyatt - The Research Software Engineering Community Initiative

Talks

Behnam Ghazinouri - Navigation and the Efficiency of Spatial Coding: Insights from Closed-Loop Simulations

Renan Shimoura - Visual alpha generators in a full-density spiking thalamocortical model

Nicolai Haug - NEST-SONATA: Fast parallel instantiation of explicitly specified large-scale neuronal network models

Jessica Mitchell - NEST Community Services

Carlo Andrea Sartori - Nitric Oxide Diffusive Plasticity Model in Cerebellar SNN

Francesco De Santis - A computational model of the mammalian brainstem to solve sound localization

Jose Villamar - Accelerating Neuronal Network Construction through Dynamic GPU Memory Instantiation

Renan Shimoura - Stochastic neuron model implementation in NEST using NESTML

Sebastian Spreizer - NEST Desktop: What are next steps?

Workshops

Brent Huisman & Thorsten Hater - Arbor: when you really need compartments

Charl Linssen & Pooja Babu - Modeling dopamine-modulated STDP synapse with NESTML

  • Thursday 15 June
    • 09:45 10:00
      Registration (Zoom)

    • 10:00 10:15
      Welcome & Introduction (Zoom)

      Convener: Hans Ekkehard Plesser (Norwegian University of Life Sciences)
    • 10:15 11:00
      Keynote: Emre Neftci (Zoom)

      • 10:15
        Learning with Brain-Inspired Computers 45m

        Continual learning at the edge is an aspirational goal of AI technologies. Neuromorphic hardware that implements large Spiking Neural Networks (SNN) is particularly attractive in this regard, thanks to its inherently local computational paradigm and its potential compatibility with future and emerging computing devices.
        This talk will first give an overview of current methods for learning in SNNs using gradient-based methods, which can achieve accuracy and performance competitive with Deep Neural Networks (DNNs). The resulting learning algorithms can be implemented as local synaptic plasticity rules. However, similar to DNNs, they are based on data-intensive and iterative training processes that are incompatible with the realities of neuromorphic hardware. I will argue that gradient-based meta-learning (learning-to-learn) can play a key role in closing this gap by enabling accurate and fast learning that is robust to hardware non-idealities. These results bring neuromorphic engineering several steps closer to building intelligent agents that can continuously adapt to their environment in real time.

        Speaker: Emre Neftci (PGI-15, Forschungszentrum Jülich, Germany)
    • 11:00 11:15
      Group photo & short break

    • 11:15 11:55
      Talks: Renan Shimoura, Francesco De Santis (Zoom)

      • 11:15
        Stochastic neuron model implementation in NEST using NESTML 20m

        Neurons exhibit intrinsic sources of stochasticity which impact their spiking behavior. These sources include fluctuations in ion channel gating and diffusion, as well as stochastic release of neurotransmitters. As a result, even when responding to identical inputs, there can be significant variability in spike timing.
        The Galves–Löcherbach (GL) model [1] is a stochastic neuron model that was proposed to capture the effect of these sources of intrinsic noise on neuronal spiking activity. It models the neuron as a stochastic point process with spiking probability that depends on its membrane potential. After a spike, the neuron's membrane potential is instantaneously reset to 0.
        In this work, we present an implementation of the GL model in NEST simulator [2] using the domain specific language NESTML [3]. Additionally, we implemented a version with short-term plasticity dependent on residual calcium [4]. In this case, when a neuron spikes the residual calcium concentration within the cell increases by one unit, and a postsynaptic potential is given that depends linearly on the spiking neuron's calcium concentration. Between successive spikes, the membrane potential and calcium concentration of the neuron decrease at a constant rate.
        Further simulations are necessary to validate the proof-of-concept implementation against theory, along with detailed benchmarking and optimisation. On the theoretical side, the NESTML specification facilitates comparison with other stochastic neuron models. The implementation of the GL model in NEST will provide researchers with a powerful tool for efficiently investigating the dynamics of large-scale spiking neural networks.

        Speaker: Renan O. Shimoura (INM-6/IAS-6/INM-10, Forschungszentrum Jülich, Germany)
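The reset-and-fire dynamics described in the abstract can be sketched in a few lines of plain Python. This is an illustrative toy, not the actual NESTML implementation; the leak factor `mu` and the clamped-linear probability function `phi` are assumptions made for the example:

```python
import random

def simulate_gl_neuron(inputs, mu=0.9,
                       phi=lambda v: min(1.0, max(0.0, v)), seed=1):
    """Toy sketch of Galves-Löcherbach (GL) dynamics.

    The membrane potential V leaks geometrically by `mu` between steps,
    input is added each step, and the neuron fires with probability
    phi(V); after a spike, V is instantaneously reset to 0.
    """
    rng = random.Random(seed)
    v, spikes = 0.0, []
    for t, i_t in enumerate(inputs):
        v = mu * v + i_t            # leaky integration of the input
        if rng.random() < phi(v):   # spiking probability depends on V
            spikes.append(t)
            v = 0.0                 # hard reset, as in the GL model
    return spikes
```

With `phi` clamped to [0, 1], larger membrane potentials make spiking more likely, and the hard reset to 0 matches the model description above.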
      • 11:35
        A computational model of the mammalian brainstem to solve sound localization 20m

        Implementing bioinspired neural networks in silico is a powerful tool for studying brain processes. These networks grant access to the real-time behavior of individual neurons within a complex circuitry, such as the ones executing neurosensory functions.

        This contribution proposes a computational model to study how the mammalian brainstem implements sound localization: the ability to identify an acoustic source in the surrounding space. The main actors in sound localization are two brainstem nuclei: the medial and the lateral superior olive. We have reconstructed a model made of thousands of spiking neurons, tailored to the auditory brainstem circuitry and its tonotopic organization.

        The major inputs of our model are two acoustic cues intrinsically linked to the position of a sound source in space: the interaural time difference (ITD) and the interaural level difference (ILD). These consist, respectively, of the disparity in arrival time and in sound intensity between the right and the left ear.

        With such a realistic model, we tested the latest neuroscience theories on how these two brainstem nuclei exploit these binaural cues to create an auditory map in the brain.
        Eventually, we shed light on the dual pathway that, thanks to its redundancy, improves the precision and reliability of sound source identification.

        Speaker: Francesco De Santis (Politecnico di Milano, Italy)
    • 11:55 12:40
      Lunch Break 45m
    • 12:40 13:20
      Talks: Carlo Andrea Sartori, Nicolai Haug (Zoom)

      • 12:40
        Nitric Oxide Diffusive Plasticity Model in Cerebellar SNN 20m

        Nitric Oxide (NO) is an essential molecule involved in the synaptic plasticity of many areas of the brain and in neurovascular coupling. NO is known to be present in the cerebellum, both in the granular and the molecular layers, and it is thought to have an enabling function in plasticity mechanisms. The dependence of plasticity on NO has been investigated mainly in experimental studies; a few mathematical models replicate its function in simple networks, but it has not yet been included in in silico simulations of large spiking neural networks (SNNs).
        In this project, we aim to create a Python module for simulating NO diffusion and to integrate it into a NEST simulation of a cerebellar micro-circuit to test its effect on plasticity between parallel fibres (pf) and Purkinje cells (PC). In each pf-PC synapse, we place sources of NO that receive stimuli from the pf and produce NO accordingly. NO diffusion is then simulated in the network, and its concentration is evaluated at each synapse. We then modified the STDP learning rule at the pf-PC synapse by introducing a dependency on the local NO concentration, and we assessed the spatial extent and the effects of this plasticity gating under different stimulation protocols.
        This computational model is a simple and useful tool for simulating the functional role of nitric oxide in neural networks.

        Speaker: Carlo Andrea Sartori (Politecnico di Milano, Italy)
      • 13:00
        NEST-SONATA: Fast parallel instantiation of explicitly specified large-scale neuronal network models 20m

        Simulating brain-scale models requires parallel computers with enough memory to represent the network connectivity, as well as efficient instantiation of that connectivity on massively parallel machines. While scalable data structures and algorithms for storing and accessing connections in parallel are available [1-3], efficient parallel instantiation of such networks has received less attention. Network connectivity can be defined either rule-based [4] or through explicit tabulation of connections, e.g., using the SONATA format [5]. Even for models of limited size and complexity, such as a model of the mouse cortex with more than 9 million point neurons connected by 25 billion synapses, SONATA specification files comprise nearly 500 GB of data in mostly binary format (HDF5). We present here an implementation of direct support for efficient instantiation of networks from SONATA specifications in the NEST simulator [6] as a result of the HBP NEST-SONATA infrastructure voucher.

        Speaker: Nicolai Haug (Norwegian University of Life Sciences)
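As a rough illustration of why explicitly tabulated connectivity parallelizes well, each MPI process only needs to instantiate the connections whose postsynaptic neuron it owns. The sketch below assumes NEST-style round-robin distribution of neuron IDs over ranks; `edges_for_rank` is a hypothetical helper written for this example, not part of NEST or the SONATA tooling:

```python
def edges_for_rank(edges, rank, num_ranks):
    """Keep only the (source, target) pairs whose target neuron is owned
    by this rank, assuming round-robin distribution of neuron IDs.
    In a real run, each rank would stream its slice of the SONATA
    HDF5 edge tables instead of holding the full list in memory."""
    return [(src, tgt) for src, tgt in edges if tgt % num_ranks == rank]

edges = [(1, 2), (2, 3), (3, 4), (4, 5)]
# With 2 ranks, rank 0 instantiates the even-numbered targets only.
```

Because the ranks partition the edge list disjointly, every connection is instantiated exactly once across the whole machine.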
    • 13:20 14:05
      Keynote: Alice Geminiani (Zoom)

      • 13:20
        Simulating the neural bases of pathological behaviors with NEST: a use case on dystonia 45m

        Understanding the neural bases of behaviors is fundamental in neuroscience, particularly for brain diseases and disorders. Data-driven models of brain circuits are being developed to simulate neural activity and the resulting behaviors. By applying localized lesion models, it is possible to study altered neural activity and behaviors in silico, providing insights into the neural bases of pathological conditions [1].
        We here applied this approach to study the role of cerebellar alterations in dystonia.
        Dystonia is a movement disorder traditionally associated with basal ganglia dysfunction. Recent animal studies suggest a role of the cerebellum, a key brain area for motor control [2], but the causal mechanisms remain unclear. To address this issue, we used a data-driven cerebellar spiking neural network and simulated a cerebellum-driven behavior, Eye-Blink Classical Conditioning (EBCC) [3], which is impaired in some types of dystonia. The model, implemented in NEST [4], includes about 10,000 neurons and 1.5 million connections, with parameters tuned on neural data [5]. Through supervised plasticity triggered by inputs, the model was able to reproduce physiological EBCC learning curves. We then modified local features of the network to reproduce alterations in dystonic mice [6–8]. Simulations suggest that only certain types of lesions, namely reduced olivocerebellar input and aberrant PC burst firing, but not an imbalance of excitatory and inhibitory input on PCs, are compatible with the EBCC changes observed in dystonia, indicating which cerebellar lesions can play a role in generating symptoms.
        Overall, we here provide a tool for studying cerebellum alterations in dystonia, paving the way to in silico investigation of brain diseases using NEST.

        Speaker: Alice Geminiani (University of Pavia, Italy)
    • 14:05 14:15
      Short break 10m
    • 14:15 15:00
      Mingle (Gathertown)

    • 15:00 16:30
      Workshop: Charl Linssen & Pooja Babu (Zoom)

      • 15:00
        Modeling dopamine-modulated STDP synapse with NESTML 1h 30m

        NESTML is a domain-specific modeling language for neuron models and synaptic plasticity rules [1]. It is designed to support researchers in computational neuroscience by allowing them to specify models in a precise and intuitive way. These models can subsequently be used in dynamical simulations of small or large-scale spiking neural networks, by means of high performance simulation code generated by the NESTML toolchain. NESTML features a concise yet expressive syntax, inspired by Python, making it easy to write, understand, maintain and share models. There is direct language support for (spike) events, differential equations, convolutions, stochasticity, and arbitrary algorithms using imperative programming concepts, in addition to flexible event management using handler functions and prioritization.

        In this workshop, we will model a plastic synapse in NESTML, which exhibits spike-timing dependent plasticity (STDP) that is additionally regulated by a neuromodulator. The synapse model will be used to create a network with balanced excitation and inhibition, where by identifying the neuromodulator with dopamine, we implement a biologically realistic version of reinforcement learning. This will be done by interacting with NESTML through a Jupyter notebook, where the model is created and the NESTML toolchain generates the corresponding code for NEST Simulator, making use of the scalable volume transmitter implemented in NEST [2].
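The core idea of neuromodulated STDP can be condensed into a toy update rule: the STDP contribution first accumulates in an eligibility trace, and the weight only changes when the dopamine concentration deviates from its baseline. This is a simplified sketch with made-up parameter names, not the NESTML model used in the workshop:

```python
def dopamine_stdp_step(w, eligibility, dopamine, baseline=1.0, lr=0.1,
                       tau_c=0.2):
    """One discrete update of a dopamine-gated plastic synapse (toy model).

    The weight change is the eligibility trace gated by the deviation of
    dopamine from baseline; the trace itself decays by a factor tau_c.
    """
    w_new = w + lr * eligibility * (dopamine - baseline)
    eligibility_new = eligibility * (1.0 - tau_c)   # trace decays over time
    return w_new, eligibility_new
```

With dopamine at baseline the weight stays fixed no matter what the eligibility trace holds, which is what turns plain STDP into a reward-gated (reinforcement-learning-style) rule.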

        Speakers: Charl Linssen (JSC, Forschungszentrum Jülich, Germany), Pooja Babu (JSC, Forschungszentrum Jülich, Germany)
  • Friday 16 June
    • 09:00 09:35
      Keynote: Claire Wyatt (Zoom)

      • 09:00
        The Research Software Engineering Community Initiative 35m

        Did you start off as a researcher and now spend time developing software to progress your research? Or maybe you started off from a more conventional software-development background and are drawn to research by the challenge of using software to further research?

        A growing number of people in academia combine expertise in programming with an intricate understanding of research, all while being a researcher. Although this combination of skills is extremely valuable, these contributions lack a formal place in the academic system. There is no easy way to recognise their contribution, to reward them, or to represent their views.

        In March 2012, the term Research Software Engineering was first coined at a workshop organised by the Software Sustainability Institute in the UK.

        The realisation emerged that the lack of awareness, recognition and reward for these skills and contributions, and the absence of a recognised job title for research developers, was having a knock-on effect that made many activities difficult. Those in the role spoke of how hard it was to recruit and how hard it was for developers to find a job. This gave a focus for support, and the RSE movement around the world is now working to raise awareness of the role, bring RSEs together, and advocate for more appropriate career recognition and promotion.

        This talk will take a look at the developing field of Research Software Engineering, the achievements to date around the world, the goals and some of the activities coming up in 2023.

        Speaker: Claire Wyatt (JSC, Forschungszentrum Jülich, Germany)
    • 09:35 10:15
      Talks: Renan Shimoura, Sebastian Spreizer (Zoom)

      • 09:35
        Visual alpha generators in a full-density spiking thalamocortical model 20m

        The alpha rhythm (~10 Hz) is a prominent feature in the electroencephalograms of various mammals and is associated with reduced visual attention and with functions such as timing regulation and transmission facilitation [1]. Although the exact mechanism of alpha rhythm generation is still unclear, the thalamus and cortex have been proposed as possible protagonists. In this study, a full-density spiking thalamocortical model of neural circuits in the primary visual cortex and the lateral geniculate nucleus was built using the NEST simulator, to investigate two potential alpha rhythm generators. The first mechanism involves rhythmic bursts produced by pyramidal neurons in layer 5 at around 10 Hz [2], while the second mechanism relies on a thalamocortical loop delay of approximately 100 ms [3]. The model comprises excitatory and inhibitory populations of adaptive exponential integrate-and-fire model neurons. The resulting spiking activity was recorded and compared with experimental data using power spectra and Granger causality analysis. The results indicate that both mechanisms can generate and spread alpha oscillations, but with different laminar patterns. The first mechanism suggests that the alpha rhythm mainly originates in layers 5 and 2/3 (similar to [4]), while the second mechanism points to layers 4 and 6 (similar to [5]). Combining both mechanisms results in a summation of effects, with the alpha range emanating from all layers. The findings suggest that the two mechanisms may contribute differently to alpha rhythms, with distinct laminar patterns, and may be expressed either separately or in tandem under different conditions.

        Speaker: Renan Oliveira Shimoura (INM-6/IAS-6/INM-10, Forschungszentrum Jülich, Germany)
      • 09:55
        NEST Desktop: What are next steps? 20m

        NEST Desktop is a web-based GUI application for NEST Simulator [1, 2]. It has been established as a useful tool for guiding students and newcomers learning the concepts of computational neuroscience. Users can perform virtual experiments on a local machine or on the public website [3].

        In recent years, NEST Desktop has been enhanced to communicate with different applications, such as Insite (for showing activity during a live simulation) [4], the Neurorobotics Platform (NRP) [5], and ViSimpl (a visualization application) [6].

        I will demonstrate various scenarios using NEST Desktop where the user can
        · explore behavior of neuron models
        · apply other provided neuron models, e.g. multi-compartmental models
        · visualize synaptic weights in the network with plasticity (STDP, Tsodyks).

        Furthermore, I will present ideas regarding NEST Desktop
        · with NESTML to build custom neuron models
        · with Elephant to better analyze simulation data
        · for various simulation tools, e.g. PyNN, Norse.

        Speaker: Sebastian Spreizer (University of Trier, Germany)
    • 10:15 10:30
      Group photo & short break

    • 10:30 12:00
      Workshop: Brent Huisman & Thorsten Hater (Zoom)

      • 10:30
        Arbor: when you really need compartments 1h 30m

        Sometimes research questions are phrased such that cell morphologies are needed to answer them. For instance, you are interested in a biological question: how are new connections formed? Which chemicals play a role, and at which concentrations? Or you have a wet-lab experiment you wish to model: you have measured local field potentials of a tissue with a new microscopic sensor because you suspect the dendritic morphology plays a role. Although NEST offers options to model few-compartment cells, you may want or need more. Enter Arbor. Née Nest-MultiCompartment, it is a fresh start in the world of morphologically detailed simulators that shares with NEST a focus on ease of use and good scaling to large networks. In this tutorial, we will show how we create an Arbor simulation based on a NEST experiment and observe how morphologies change measurements.

        Speakers: Brent Huisman (JSC, Forschungszentrum Jülich, Germany), Thorsten Hater (JSC, Forschungszentrum Jülich, Germany)
    • 12:00 12:40
      Lunch Break 40m
    • 12:40 13:40
      Talks: Jessica Mitchell, Behnam Ghazinouri, Jose Villamar (Zoom)

      • 12:40
        NEST Community Services 20m

        The team behind NEST Simulator strives to develop a great tool and related services for its community. Here we share some highlights of the impact NEST has had on the computational neuroscience community and beyond, and we provide an overview of the state of documentation, installation packages, and online services.

        We also provide a wealth of documentation for working with NEST, including technical documentation and contribution guidelines. Various package and container solutions make NEST available to our community. Alongside this, we offer a JupyterHub service that requires no installation.

        With these endeavors, we aim to facilitate research with NEST and foster community engagement.

        Speakers: Jessica Mitchell (INM-6/IAS-6/INM-10, Forschungszentrum Jülich, Germany), Steffen Graber (INM-6/IAS-6/INM-10, Forschungszentrum Jülich, Germany)
      • 13:00
        Navigation and the Efficiency of Spatial Coding: Insights from Closed-Loop Simulations 20m

        Spatial learning is critical for survival, and its underlying neuronal mechanisms have been studied extensively. Much is known about the neural representations of space, e.g. place cells (PCs) and border cells (BCs) in the hippocampus. However, little is known about their functional role in spatial navigation and spatial learning. We extended an existing computational modeling tool-chain to study the functional role of spatial representations using closed-loop simulations of spatial learning.
        In our model, an artificial agent had to find a hidden goal in an open-field environment. The model network, simulated with NEST [1], consisted of PCs that tile the environment and BCs that represent its edges. Their activity varies over time as a function of the agent's 2D location in the environment. This input was fed to 40 action selection neurons that each represent one direction of movement, distributed homogeneously across 360°, so the agent was able to move freely in any direction. When the agent enters the reward zone, learning is reinforced by potentiation of the feedforward weights under a symmetric STDP learning rule with an eligibility trace.
        Efficiently encoding spatial information is critical for navigation performance. Parameters of the PCs, such as their number, field sizes, and peak firing rate, as well as the size of the goal zone, influenced navigation performance. The overlap index, which measures the degree of overlap between neighboring PCs, showed a nonmonotonic relationship with performance. In contrast, the Fisher information, which describes how informative the PC population is, best accounted for navigation performance in the model [2].

        Speaker: Behnam Ghazinouri (Ruhr-Universität Bochum, Germany)
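One common way to read out a movement direction from a ring of action selection neurons like the 40 described above is a population vector: each neuron votes for its preferred direction, weighted by its firing rate. The sketch below assumes evenly spaced preferred directions; it is an illustration of the readout idea, not the authors' code:

```python
import math

def movement_direction(rates, n=40):
    """Population-vector readout: sum unit vectors pointing at each
    neuron's preferred direction (2*pi*k/n), weighted by firing rate,
    and return the angle of the resultant vector in degrees [0, 360)."""
    x = sum(r * math.cos(2 * math.pi * k / n) for k, r in enumerate(rates))
    y = sum(r * math.sin(2 * math.pi * k / n) for k, r in enumerate(rates))
    return math.degrees(math.atan2(y, x)) % 360
```

Because the readout is a weighted vector sum rather than a winner-take-all choice, the agent can move in intermediate directions not represented by any single neuron.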
      • 13:20
        Accelerating Neuronal Network Construction through Dynamic GPU Memory Instantiation 20m

        Efficient simulation of large-scale spiking neuronal networks is important for neuroscientific research, and both the simulation speed and the time it takes to instantiate the network in computer memory are key factors. In recent years, hardware acceleration through highly parallel GPUs has become increasingly popular. Similarly, code generation approaches have been utilized to optimize software performance, albeit at the cost of repeated code regeneration and recompilation after modifications to the network model [1].
        To address the need for greater flexibility in iterative model changes, we propose a new method for creating network connections dynamically and directly in GPU memory. This method uses a set of commonly used high-level connection rules [2], enabling interactive network construction.
        We validate the simulation performance with both consumer and data center GPUs on a cortical microcircuit of about 77,000 leaky-integrate-and-fire neuron models and 300 million synapses [3], and a two-population recurrently connected network designed to allow benchmarking of a variety of connection rules.
        We implement our proposed method in NEST GPU [4,5] and demonstrate the same or shorter network construction and simulation times compared to other state-of-the-art simulation technologies. Moreover, our approach meets the flexibility demands of explorative network modeling by enabling direct and dynamic changes to the network in GPU memory.

        Speaker: Jose Villamar (INM-6/IAS-6/INM-10, Forschungszentrum Jülich, Germany)
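To illustrate the contrast between tabulated connectivity and rule-based construction, a high-level rule such as `fixed_indegree` lets connections be generated procedurally rather than stored up front. The sketch below is plain CPU Python for clarity; the method described in the talk generates the connections directly in GPU memory:

```python
import random

def fixed_indegree(sources, targets, indegree, seed=0):
    """Sketch of the common 'fixed_indegree' connection rule: every
    target neuron draws `indegree` source neurons at random (with
    replacement here, for simplicity), yielding (source, target) pairs."""
    rng = random.Random(seed)
    return [(rng.choice(sources), tgt)
            for tgt in targets
            for _ in range(indegree)]
```

A rule like this fully specifies billions of synapses with a handful of parameters, which is what makes dynamic, in-memory (re)instantiation of the network feasible.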
    • 13:40 13:50
      Short break 10m
    • 13:50 14:35
      Mingle (Gathertown)

    • 14:35 14:50
      Wrap-up (Gathertown)

      Convener: Abigail Morrison (INM-6, Forschungszentrum Jülich)