Any model can only be as good as its input and validation data. Models at various scales, ranging from single soil columns to the complete Earth system, need time series of input data. Large-scale models often rely on global products derived from satellite data or weather model reanalysis. However, classical on-site measured time series still serve as input to such compiled products and are...
While GPU computing has been widely used in science through the TensorFlow and Torch frameworks, and in specialized HPC applications, software that runs on end-user devices often does not yet use these technologies.
In this presentation, we show how we used OpenGL compute shaders to accelerate key features of the software developed in the ValidITy project (https://validity-project.eu) to...
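As a rough illustration of the general technique (not the ValidITy code itself), the sketch below dispatches a generic OpenGL compute shader from C++ to transform a data buffer on the GPU; the kernel, buffer layout, and helper names are assumptions for the example, and an existing OpenGL 4.3+ context and loaded function pointers (e.g. via GLFW and GLAD) are presupposed.

```cpp
// Minimal sketch: run a GLSL compute kernel over a float buffer.
// Assumes an OpenGL 4.3+ context is current and GL functions are loaded.
#include <glad/glad.h>
#include <vector>

// Example kernel: scales every element of a storage buffer in place.
static const char* kSrc = R"(
#version 430
layout(local_size_x = 256) in;
layout(std430, binding = 0) buffer Data { float v[]; };
uniform float scale;
void main() {
    uint i = gl_GlobalInvocationID.x;
    if (i < v.length()) v[i] *= scale;
}
)";

void scaleOnGpu(std::vector<float>& data, float scale) {
    // Compile and link the compute program.
    GLuint sh = glCreateShader(GL_COMPUTE_SHADER);
    glShaderSource(sh, 1, &kSrc, nullptr);
    glCompileShader(sh);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, sh);
    glLinkProgram(prog);

    // Upload the data into a shader storage buffer (SSBO).
    GLuint ssbo;
    glGenBuffers(1, &ssbo);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
    glBufferData(GL_SHADER_STORAGE_BUFFER, data.size() * sizeof(float),
                 data.data(), GL_DYNAMIC_COPY);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);

    // Dispatch one work group per 256 elements, then wait for writes.
    glUseProgram(prog);
    glUniform1f(glGetUniformLocation(prog, "scale"), scale);
    glDispatchCompute((GLuint)((data.size() + 255) / 256), 1, 1);
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);

    // Read the results back to the CPU.
    glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0,
                       data.size() * sizeof(float), data.data());

    glDeleteBuffers(1, &ssbo);
    glDeleteProgram(prog);
    glDeleteShader(sh);
}
```

The appeal of this approach for end-user software is that it relies only on the widely available OpenGL driver stack rather than on vendor-specific or ML-oriented GPU frameworks.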
What are Digital Twins of the Ocean? Who benefits from this data science tool? What kind of new science can we address with them? How can we use Digital Twins to transfer knowledge and synthesise gains across research fields? Which building blocks of Digital Twins are required to move ahead? What can we build upon?
The term Digital Twins is a buzzword connected to many stakeholder...
The recent leaps in Data Science (DS) methodologies present a unique opportunity for many scientific topics to yield progress faster than ever, by leveraging an increased capacity to analyze data. However, major breakthroughs in most sciences are only achievable by combining these DS methodologies with large computing facilities, which provide both the large computing capacity and the ability to...