Conveners
Parallelization and HPC Infrastructure
- Michael Meinel (Deutsches Zentrum für Luft- und Raumfahrt e.V.)
What does it take to develop and maintain a research code for the stochastic simulation of the physics of the strong interaction in Lattice Quantum Chromodynamics (LQCD)? How do you make it run on the fastest supercomputers in the world? How many people are involved, and what do they contribute, when, and how? What kinds of development and interaction structures are useful? Which kinds of challenges...
Simulations based on particle methods, such as Smoothed Particle Hydrodynamics (SPH), are known to be computationally demanding, as they involve numerous interactions between locally defined neighbors. Compared to other numerical methods, SPH is mesh-free, meaning that computations are not confined to a fixed grid: particles, acting as interpolation nodes, are instead free to move across the...
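To illustrate the kind of computation the abstract refers to, the following is a minimal sketch of the core SPH idea: each particle acts as an interpolation node, and a field such as density is estimated by summing contributions from nearby particles weighted by a smoothing kernel. This is a simplified 1D illustration (cubic spline kernel, brute-force all-pairs sum), not code from the presented work; real SPH codes use neighbor lists or trees to avoid the O(N²) pair loop.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 1D cubic spline smoothing kernel with smoothing length h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalization so the kernel integrates to 1
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Density at each particle: rho_i = sum_j m_j * W(|x_i - x_j|, h).

    Brute-force all-pairs version for clarity; production codes restrict
    the sum to neighbors within the kernel support (|r| < 2h).
    """
    dx = positions[:, None] - positions[None, :]
    return (masses[None, :] * cubic_spline_kernel(dx, h)).sum(axis=1)

# Equal-mass particles on a uniform lattice: interior density should
# recover the continuum value m / dx.
x = np.linspace(0.0, 1.0, 101)
m = np.full_like(x, 0.01)
rho = sph_density(x, m, h=2.0 * (x[1] - x[0]))
```

Because the kernel has compact support (it vanishes for |r| >= 2h), each particle interacts only with a small, locally defined set of neighbors — which is exactly what makes the method both mesh-free and communication-intensive when particles are distributed across compute nodes.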
Despite the immense computational power offered by HPC clusters (and the resources governments pour into providing them), HPC remains uncharted territory for many researchers. The primary deterrents include the perceived bureaucratic hurdles associated with accessing HPC resources, the opacity of usage procedures, and a notable lack of accessible support systems.
The intricate...
When it comes to exploiting massive data more effectively, machine learning methods are at the forefront of researchers' awareness. Far less attention is paid to the need for, and the complexity of, applying these techniques efficiently across large-scale, memory-distributed data volumes. Yet it is precisely these aspects, typical of handling massive data sets, that pose major challenges to the vast majority of...