HZDR Data Management Day 2023 (Hybrid Event)

Europe/Berlin
255 (HZDR 106)

Oliver Knodel (Helmholtz-Zentrum Dresden-Rossendorf), Stefan Mueller (Helmholtz-Zentrum Dresden-Rossendorf)
Description

This year's HZDR Data Management Day takes place on Tuesday, 21 November, from 13:00 to 17:30 in the great lecture hall (106/255) at HZDR and via live stream (available to HZDR members only). Against the background of progressive digitization and an ever-increasing need for reproducibility, the focus will be on the data management landscape at HZDR.

In more than ten talks and presentations, our colleagues will give insights into the data management challenges they are facing and their solutions. The talks will both give an overview of the most important tools and services available to scientists at HZDR and dive deeper into topics like Electronic Lab Documentation and Workflows. Furthermore, you will have the opportunity to exchange ideas and discuss these topics further during the coffee breaks.

    • 13:00 – 14:45
      Data Management Day 2023: Session 1
    • 14:45 – 15:30
      Coffee Break and Poster Session
      • 14:45
        Automated Software Publication and Metadata Collection with HERMES for Improved Discoverability 45m

        Software, as an important method and output of research, should follow the RDA "FAIR for Research Software" principles. In practice, this means that research software, whether open, inner or closed source, should be published with rich metadata to enable FAIR4RS.

        For research software practitioners, publishing software currently often means following an arduous and mostly manual process. HERMES (https://software-metadata.pub), a project funded by the Helmholtz Metadata Collaboration, aims to alleviate this situation. We develop configurable, executable workflows for publishing rich metadata for research software, alongside the software itself.

        These workflows follow a push-based approach: they use existing continuous integration solutions, integrated into common code platforms such as GitHub or GitLab, to harvest, unify and collate software metadata from source code repositories and code platform APIs. The workflows include curation processes for the unified metadata and deposit it on publication platforms. The deposits follow deposition requirements and curation steps defined by the targeted publication platform, the author's institution, or a software management plan.

        In addition, the HERMES project works to make the widely used publication platforms InvenioRDM and Dataverse "research-software-ready", i.e. able to ingest software publications with rich metadata and to represent software publications and metadata in a way that supports the findability, accessibility, and assessability of the published software versions. Subsequent to publication, an additional step in the HERMES workflow registers software releases in software catalogues such as the Research Software Directory (RSD) or discipline- and project-specific variants. In the future, the Helmholtz RSD will provide data for comprehensive knowledge graphs such as HMC unHIDE and thus significantly increase the visibility of the software.

        In summary, the project improves the publication and curation process as well as the discoverability of software publications by providing configurable tools that interact with common services.
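        The harvest–unify–collate step described above can be sketched in a few lines of Python. This is an illustrative assumption of how such a pipeline might look, not the actual HERMES API: the source formats, field mappings and precedence rule are hypothetical.

```python
# Illustrative sketch of harvesting and collating software metadata
# from several sources (a CITATION.cff-style file and a code platform
# API response) into one unified, CodeMeta-like record. All field
# names and the precedence rule are assumptions for illustration.

def harvest_citation_cff(cff: dict) -> dict:
    """Map CITATION.cff-style fields to CodeMeta-style keys."""
    return {
        "name": cff.get("title"),
        "version": cff.get("version"),
        "author": [f"{a['given-names']} {a['family-names']}"
                   for a in cff.get("authors", [])],
    }

def harvest_repo_api(repo: dict) -> dict:
    """Map code-platform API fields (e.g. a GitLab project) to CodeMeta-style keys."""
    return {
        "name": repo.get("name"),
        "codeRepository": repo.get("web_url"),
        "license": repo.get("license", {}).get("name"),
    }

def collate(*records: dict) -> dict:
    """Unify harvested records; earlier sources take precedence."""
    merged: dict = {}
    for record in records:
        for key, value in record.items():
            if value is not None and key not in merged:
                merged[key] = value
    return merged

# Example inputs, as a harvester might see them:
cff = {"title": "my-tool", "version": "1.2.0",
       "authors": [{"given-names": "Ada", "family-names": "Lovelace"}]}
repo = {"name": "my-tool",
        "web_url": "https://gitlab.example.org/ada/my-tool",
        "license": {"name": "MIT"}}

metadata = collate(harvest_citation_cff(cff), harvest_repo_api(repo))
```

        In a real workflow, this unified record would then pass through the curation step before being deposited on a publication platform.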

        Speaker: Oliver Knodel (Helmholtz-Zentrum Dresden-Rossendorf)
      • 14:45
        Helmholtz Metadata Collaboration - Facilitating FAIR metadata in Helmholtz 45m

        Data are an essential part of every scientific endeavour. Efficient, future-oriented research data management is therefore essential to ensure the long-term availability of the generated data, which in turn ensures the reproducibility of scientific results. To facilitate FAIR data management within the Helmholtz community, the incubator platform "Helmholtz Metadata Collaboration" (HMC) was established.

        HMC develops and provides services, tools and training to support and improve FAIR (meta)data management in the Helmholtz Association, and aligns these approaches with national and international initiatives (e.g. RDA, EOSC, NFDI) to ensure compatibility with international research communities.

        To achieve this goal, HMC builds its work along three strategic areas: (1) assessing and monitoring the state of FAIR data across Helmholtz, (2) facilitating the connectivity of Helmholtz research data, and (3) transforming (meta)data recommendations into implementations. At the centres, HMC supports research communities and data professionals with six research-field-specific hubs. At HZDR, HMC is represented locally by a unit dedicated to the research field Energy and remotely by a unit for the research field Matter. In our poster, we will illustrate how research and data professional communities at HZDR can benefit from HMC's services, tools and training.

        Speaker: Florian Rau (Helmholtz-Zentrum Dresden-Rossendorf, HMC, Hub Energy)
      • 14:45
        Poster: Across the jungle of services towards the temple of knowledge 45m

        The relevance of handling research data and software sustainably and in accordance with the FAIR principles was recognised early on at the Helmholtz-Zentrum Dresden-Rossendorf (HZDR). With this in mind, new (research) infrastructures such as the Invenio-based research data repository RODARE have been created step by step since 2017, tools such as the Research Data Management Organiser (RDMO) have been adapted and implemented at the centre, and data and software policies have been agreed.

        Since then, interest groups and stakeholders have been addressing both the general challenges of research data management (RDM), such as multidisciplinary requirements, multi-stage processes and the tendency to consider RDM late in the course of projects, and the specific situation at HZDR. Independence and cooperation between the individual institutes go hand in hand at the centre. The resulting breadth of scientific topics means that very specific requirements are repeatedly brought to the RDM team, and these strongly influence the concrete design of the solution approaches.

        The HZDR has recognised the paramount importance of knowledge transfer in RDM and has made important conceptual decisions to meet the aforementioned challenges in an agile manner. The RDM team of the library and the "Computational Science" department bundles the activities of all other important players (including "Programme Planning and International Projects", IT Infrastructure, Technology Transfer and the Legal Department) around the topic. Joint developments result in measures that meet the requirements (metadata, transfer, documentation, guidelines, etc.) through the coordinated interplay of technical tools (further development of RODARE, integration of FIS and RODARE, HZDR Cloud, the HELIPORT dashboard project, automated software publication via HERMES, GitLab, documentation in HZDR's internal MediaWiki, etc.) and services (training, workshops, consultations).

        To make the tools and services on offer better known and to embed them at the individual points of the research and publication cycle, the HZDR intranet will be redesigned as a central knowledge transfer platform for the individual value creation processes along the data life cycle, offering clear and easy-to-use options for direct contact between scientists and service providers. This holistic approach is intended to help shape all facets of Open Science sustainably and to meet future requirements proactively.

        The poster is intended to invite constructive dialogue and serve as a source of inspiration for consistent further developments in RDM.

        Speaker: Maik Fiedler (HZDR)
    • 15:30 – 17:00
      Data Management Day 2023: Session 2
      • 15:30
        Data Driven Materials Science 15m
        Speaker: Rico Friedrich (HZDR)
      • 15:45
        ExL, not Excel 15m

        Manual data entry will always be part of data acquisition, at least for comments, observations, ideas and decisions. Our ExperimentLogging (ExL) use case in laser particle acceleration also covers machine parameters, experiment parameters and detector settings, which were typically recorded in spreadsheet tables. This hampers analysis due to scattered files and differing structures. We are developing a web app that enters data from a browser-based form directly into a database. Metadata are added to assign entries to experimental campaigns, dates, teams and further context. Data fields and form layout can be configured within the app, and other ELNs such as MediaWiki can be included as data sources. A live demo will be shown during this presentation.
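        The form-to-database idea described above can be sketched minimally in Python with an in-memory SQLite database. The schema, field names and campaign labels below are illustrative assumptions, not the actual ExL data model:

```python
# Minimal sketch: a configurable form whose submissions go straight
# into a database, with metadata (campaign, team, timestamp) attached.
# Schema and field names are hypothetical, for illustration only.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE log_entries (
        campaign TEXT,   -- experimental campaign the entry belongs to
        team     TEXT,
        created  TEXT,   -- ISO timestamp of the entry
        fields   TEXT    -- configured form fields, stored as JSON
    )
""")

# Form layout is configurable in the app; here just a list of field names.
form_fields = ["laser_energy_J", "target_material", "comment"]

def add_entry(campaign: str, team: str, created: str, values: dict) -> None:
    """Store one form submission, keeping only the configured fields."""
    payload = {k: values.get(k) for k in form_fields}
    conn.execute("INSERT INTO log_entries VALUES (?, ?, ?, ?)",
                 (campaign, team, created, json.dumps(payload)))

add_entry("2023-11-run", "team-A", "2023-11-21T13:00",
          {"laser_energy_J": 3.5, "target_material": "Ti",
           "comment": "first shot"})

row = conn.execute("SELECT campaign, fields FROM log_entries").fetchone()
```

        Storing the configurable fields as a JSON blob is one simple way to let the form layout change without schema migrations; a production app would likely add validation and indexing on top.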

        Speakers: Hans-Peter Schlenvoigt (HZDR), Kristin Elizabeth Tippey (HZDR)
      • 16:00
        Rock-IT — A collaborative approach for operating scientific plants 15m
        Speaker: Nicole Wagner
      • 16:15
        Data for Nuclear Astrophysics at the Felsenkeller Laboratory and in ChETEC-INFRA 10m

        Nuclear astrophysics is at the intersection of nuclear physics and astrophysics, united in the quest to understand the chemical evolution of the Universe. ChETEC-INFRA is an EU-supported starting community of research infrastructures for nuclear astrophysics that networks complementary types of such infrastructures: nuclear laboratories supply data on astrophysical reactions, HPC facilities perform stellar structure and nucleosynthesis calculations, and telescopes and mass spectrometers collect elemental and isotopic abundance data. At the HZDR Felsenkeller underground ion accelerator laboratory, nuclear reaction data are measured with a 5 MV tandem accelerator in a low-background environment provided by the laboratory's rock overburden.

        The interdisciplinary nature of research in nuclear astrophysics is based on a wide range of data types. In my poster I will showcase examples of the role of data in nuclear astrophysics research, as well as data-related efforts within the nuclear astrophysics community, including ChETEC-INFRA. Current considerations for nuclear data collected in experiments at the Felsenkeller laboratory will also be presented.

        Speaker: Dr Axel Boeltzig (Helmholtz-Zentrum Dresden-Rossendorf)
      • 16:25
        Open Tools for Technology Transfer: Technology transfer and open science interaction | Business models for sustainable Open source projects | Open vs Proprietary mapping for impact maximization 15m

        The OpenTransfer project was created to develop a new OPEN toolkit for Technology Transfer Offices. It aims to develop new approaches to industry interaction based on open science ideas.

        Open has different flavours, and OpenTransfer aims to define and exploit them to maximize the impact of scientific results.

        The major topics are:
        - Business models based on Open IP
        - Decision-making support in the context of open licensing
        - Legal and export control issues

        Speaker: Dr Vladimir Voroshnin (HZDR)
      • 16:40
        Software Policy 10m
        Speaker: Tobias Huste (Helmholtz-Zentrum Dresden-Rossendorf)
    • 17:00 – 17:30
      Coffee Break and Poster Session