Description
The current variety of data from neuronal recordings paves the way to complementary approaches for understanding brain activity. At the same time, it poses the challenge of consistently comparing data across experiments, species, and spatio-temporal scales, promoting the integration of multiple approaches from different neuroscience sub-domains. Moreover, experimental data are essential to benchmark and further develop theoretical models. These requirements can be fulfilled by defining a standardized, yet still customizable, analysis workflow.
In the context of brain wave analysis, we developed Cobrawap (Collaborative Brain Wave Analysis Pipeline; Gutzen et al., 2022; https://cobrawap.readthedocs.io), a FAIR-compliant open-source software package cooperatively developed as an HBP-EBRAINS Use Case. Written in Python 3, it is structured as a collection of modular building blocks (which can be added, removed, or replaced) arranged in sequential stages that implement data processing steps and analysis methods, orchestrated by a workflow manager. The final output of Cobrawap characterizes cortical wave activity in a standardized manner through quantitative observables, which are also directly exploitable for model calibration and validation (Capone et al., 2023).
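As a purely illustrative sketch of this modular design (the block functions, stage composition, and data shapes below are hypothetical and do not reflect the actual Cobrawap API), a building block can be thought of as a small function operating on standard Neo objects, and a stage as an ordered, configurable sequence of such blocks:

```python
# Minimal sketch of the modular-block idea (illustrative only; not the
# actual Cobrawap API). Each block consumes and returns a neo.AnalogSignal,
# so blocks can be freely added, removed, or reordered within a stage.
import neo
import numpy as np
import quantities as pq


def detrend_block(signal: neo.AnalogSignal) -> neo.AnalogSignal:
    """Hypothetical processing block: remove the per-channel mean."""
    detrended = signal.magnitude - signal.magnitude.mean(axis=0, keepdims=True)
    return signal.duplicate_with_new_data(detrended)


def zscore_block(signal: neo.AnalogSignal) -> neo.AnalogSignal:
    """Hypothetical processing block: z-score each channel."""
    z = (signal.magnitude - signal.magnitude.mean(axis=0)) / signal.magnitude.std(axis=0)
    return signal.duplicate_with_new_data(z)


# A "stage" is then just an ordered, configurable sequence of blocks.
stage_processing = [detrend_block, zscore_block]

# toy input: 1000 samples, 16 channels
signal = neo.AnalogSignal(np.random.randn(1000, 16),
                          units=pq.mV, sampling_rate=100 * pq.Hz)
for block in stage_processing:
    signal = block(signal)
```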
A general drawback of complex analysis tools is the technical effort required for their initial installation and configuration, and the non-negligible computational resources demanded by their execution on local machines. Aiming at wider community adoption and user facilitation, recent efforts have been devoted to integrating Cobrawap as an EBRAINS component. In this regard, we succeeded in deploying Cobrawap on three FENIX-ICEI federated HPC sites (CSCS, JSC, CINECA). Cobrawap can be executed through direct SSH login and from the EBRAINS Collaboratory through UNICORE; eventually, it will also be featured in the EBRAINS Workflows Dashboard. Data upload is supported from local storage and from the HBP/EBRAINS Knowledge Graph, while individual jobs are managed through SLURM.
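For illustration only, assuming a hypothetical `cobrawap` command-line entry point and generic site settings (the partition, module, and file names below are placeholders, not the documented EBRAINS configuration), a single run could be wrapped in a SLURM batch job along these lines:

```python
# Illustrative sketch of a SLURM submission; the `cobrawap` invocation and
# the batch-script settings are assumptions, not the documented interface.
import subprocess
import textwrap

batch_script = textwrap.dedent("""\
    #!/bin/bash
    #SBATCH --job-name=cobrawap-run
    #SBATCH --nodes=1
    #SBATCH --time=01:00:00
    #SBATCH --partition=batch

    module load python          # site-dependent environment setup
    cobrawap run --config config.yaml   # hypothetical CLI invocation
    """)

with open("cobrawap_job.sh", "w") as f:
    f.write(batch_script)

# hand the job over to the SLURM scheduler
subprocess.run(["sbatch", "cobrawap_job.sh"], check=True)
```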
A crucial step toward increased usability is the development of dedicated CWL workflows, built dynamically at runtime through a “meta-approach” that parses the configuration files supplied by users for their custom applications. Full backward compatibility with the native Snakemake workflow manager is guaranteed. The goal is to minimize time-to-result, letting users focus on the scientific side without having to care about the technology behind the scenes.
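The following is a minimal sketch of what such a “meta-approach” can look like, assuming an illustrative user configuration and per-stage CWL tool files; the stage names, configuration keys, and file paths are placeholders rather than the actual Cobrawap schema:

```python
# Simplified sketch of generating a CWL Workflow description at runtime from
# a user configuration (illustrative; not the Cobrawap configuration format).
import yaml

user_config = {
    "stages": ["data_entry", "processing", "wave_detection"],
    "profile": "mouse_ecog_demo",
}

workflow = {
    "cwlVersion": "v1.2",
    "class": "Workflow",
    "inputs": {"dataset": "File", "config": "File"},
    "outputs": {},
    "steps": {},
}

# chain the requested stages: each step consumes the previous step's output
previous_output = "dataset"
for stage in user_config["stages"]:
    workflow["steps"][stage] = {
        "run": f"cwl/{stage}.cwl",            # per-stage CommandLineTool file
        "in": {"data": previous_output, "config": "config"},
        "out": ["data"],
    }
    previous_output = f"{stage}/data"

# expose the last stage's result as the workflow output
workflow["outputs"]["result"] = {"type": "File",
                                 "outputSource": previous_output}

with open("cobrawap_workflow.cwl", "w") as f:
    yaml.safe_dump(workflow, f, sort_keys=False)
```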
Among the latest scientific developments of Cobrawap, considerable effort has been dedicated to high-resolution recordings from brain imaging. Through suitable curation scripts, images are annotated and converted into standard formats (e.g. Neo). We then developed a recursive algorithm (“HOS”, Hierarchical Optimal Sampling) that optimizes the signal-to-noise ratio by dynamically tuning the resolution across the field of view, decreasing it in the sub-regions where the signal is less reliable (e.g. at the boundaries).
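As a rough, simplified sketch of the hierarchical idea behind HOS (not the actual implementation: the SNR estimator, threshold, and splitting rule are placeholder assumptions), the field of view can be refined recursively, keeping fine resolution where the aggregated signal is reliable and leaving coarse macro-pixels elsewhere:

```python
# Simplified sketch of a hierarchical, SNR-driven sampling scheme in the
# spirit of HOS (not the actual Cobrawap implementation). `movie` is a
# (time, height, width) imaging array; a region is split into quadrants only
# as long as its aggregated signal stays above an SNR threshold.
import numpy as np


def estimate_snr(trace):
    """Placeholder SNR proxy: signal range over high-frequency noise."""
    noise = np.nanstd(np.diff(trace)) + 1e-12
    return (np.nanmax(trace) - np.nanmin(trace)) / noise


def hierarchical_sampling(movie, y0, y1, x0, x1, snr_threshold, min_size=2):
    """Return a list of ((y0, y1, x0, x1), macro_pixel_trace) pairs."""
    block = movie[:, y0:y1, x0:x1]
    macro = np.nanmean(block, axis=(1, 2))   # aggregate region into one trace
    too_small = (y1 - y0) <= min_size or (x1 - x0) <= min_size
    if too_small or estimate_snr(macro) < snr_threshold:
        # unreliable (or already minimal) region: keep it as a coarse macro-pixel
        return [((y0, y1, x0, x1), macro)]
    # reliable region: split into quadrants and refine recursively
    ym, xm = (y0 + y1) // 2, (x0 + x1) // 2
    regions = []
    for ys, ye, xs, xe in [(y0, ym, x0, xm), (y0, ym, xm, x1),
                           (ym, y1, x0, xm), (ym, y1, xm, x1)]:
        regions += hierarchical_sampling(movie, ys, ye, xs, xe,
                                         snr_threshold, min_size)
    return regions


# toy usage on synthetic data: 200 frames of a 64x64 field of view
movie = np.random.randn(200, 64, 64)
sampled = hierarchical_sampling(movie, 0, 64, 0, 64, snr_threshold=5.0)
```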
To further improve the portability and user-friendliness of Cobrawap, we are going to deliver it as both a Docker image and a pip-installable Python package, regularly updated on both the scientific and the technological side through a dedicated CI/CD pipeline.
This research is co-funded by the European Union's Horizon 2020 Framework Programme for Research and Innovation under Specific Grant Agreements No. 785907 (HBP SGA2) and No. 945539 (HBP SGA3) and the European Commission - NextGeneration EU (EBRAINS-Italy MUR CUP B51E22000150006).