Towards an Exascale Hyperbolic PDE Engine
Principal Investigators:
Michael Bader(1), Alice-Agnes Gabriel(2)
Affiliation:
(1)Technical University of Munich, (2)Ludwig-Maximilians-Universität München
Local Project ID:
pr48ma
HPC Platform used:
SuperMUC and SuperMUC-NG of LRZ
Date published:
September 2021
Introduction
Hyperbolic conservation laws model a wide range of phenomena and processes in science and engineering – from various areas of fluid dynamics, through seismic wave propagation in earthquake simulation, to extreme objects in astrophysics such as neutron stars or black holes. The ExaHyPE engine [2] has been developed to solve as broad a class as possible of hyperbolic systems of partial differential equations (PDEs), using high-order discontinuous Galerkin (DG) discretisation with ADER (Arbitrary high order DERivative) time stepping and a-posteriori sub-cell finite volume limiting on tree-structured Cartesian meshes. ExaHyPE thus relies on a well-defined numerical model and mesh refinement strategy, but strives for utmost flexibility regarding the hyperbolic model being tackled.
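To make the targeted class of systems concrete (a brief sketch; the precise formulation supported by the engine is given in [3]), ExaHyPE addresses PDE systems that can be written in the generic first-order form

  \partial_t Q + \nabla \cdot F(Q, \nabla Q) + B(Q) \cdot \nabla Q = S(Q),

where Q is the vector of evolved quantities, F a (possibly gradient-dependent) flux, B(Q) \cdot \nabla Q a non-conservative product and S(Q) a source term. Users provide these PDE terms, while discretisation, mesh refinement and parallelisation are handled by the engine.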
The ExaHyPE SuperMUC-NG project accompanied a Horizon 2020 project [1] to develop the ExaHyPE engine [2], together with a suite of example models [3] and two large demonstrator applications from earthquake simulation and from relativistic astrophysics (see, e.g., [4,5]).
The ExaHyPE Engine
Users of the ExaHyPE engine specify the details (number of quantities, discretisation order, etc.) of their desired hyperbolic PDE system and model setup via a specification file. From this, the ExaHyPE Toolkit creates application-specific template classes, glue code and core routines, which are tailored to the application and target architecture. Application developers then implement the PDE-specific functionality (e.g., flux functions, initial and boundary conditions) within this generated code frame.
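As an illustration of what such user code looks like (a minimal sketch; the actual class and method signatures are generated by the ExaHyPE Toolkit and may differ), a flux function for the 3D compressible Euler equations could be written as follows:

  // Sketch of a user-supplied flux kernel for the 3D compressible Euler equations,
  // Q = (rho, rho*u, rho*v, rho*w, E), with an ideal-gas equation of state.
  // Illustrative only -- the real code frame is produced by the ExaHyPE Toolkit.
  constexpr double GAMMA = 1.4;  // ratio of specific heats (assumption of this sketch)

  void flux(const double* const Q, double** F) {
    const double irho = 1.0 / Q[0];
    const double p = (GAMMA - 1.0) *
        (Q[4] - 0.5 * irho * (Q[1]*Q[1] + Q[2]*Q[2] + Q[3]*Q[3]));  // pressure

    for (int d = 0; d < 3; ++d) {            // flux in coordinate direction d
      const double vel = irho * Q[1 + d];    // velocity component in direction d
      F[d][0] = Q[1 + d];                    // mass flux
      F[d][1] = vel * Q[1];                  // momentum fluxes ...
      F[d][2] = vel * Q[2];
      F[d][3] = vel * Q[3];
      F[d][1 + d] += p;                      // ... plus pressure in direction d
      F[d][4] = vel * (Q[4] + p);            // energy flux
    }
  }

Eigenvalues, initial and boundary conditions are supplied in the same fashion, while time stepping, limiting and parallelisation remain inside the generated engine code.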
ExaHyPE builds on Peano as its framework for adaptive mesh refinement, which provides shared- and distributed-memory parallelism via Intel Threading Building Blocks and MPI, respectively. Figure 1 illustrates the components of the ExaHyPE engine.
Earthquake Simulation
One of ExaHyPE’s main target applications is the simulation of earthquakes and other scenarios governed by seismic wave propagation. In particular, we study the scattering of seismic surface waves in regions with complex topography, which is especially challenging for steep valleys or summits – an example is shown in Figure 2. To capture the respective effects with high accuracy, it is important to represent the topography in as much detail as possible. This is often addressed by geometry- and feature-following meshes, whose generation is typically tedious and requires user intervention.
To avoid such mesh generation issues, we developed two methods that either circumvent mesh generation or automate it completely. The first is a diffuse interface method, which represents complicated geometry via a color function [5]; the second is a curvilinear mesh method, for which we exploit known properties of the simulated region to generate the mesh in situ at initialisation time. Both methods allow fully automatic geometry approximation, which makes them particularly attractive for uncertainty quantification or urgent seismic computing scenarios, where multiple simulation runs with a fast and automatic setup are required.
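The color-function idea can be sketched as follows (an illustrative snippet under assumed conventions; the actual diffuse interface formulation is described in [5]): the geometry is encoded by a smooth indicator alpha that is close to 1 inside the solid medium and close to 0 in the air, so no body-fitted mesh is needed.

  #include <cmath>
  #include <functional>

  // Sketch of a diffuse-interface color function alpha(x, y, z):
  // alpha ~ 1 inside the solid earth (below the topography), alpha ~ 0 in the air,
  // with a smooth transition of width eps around the free surface.
  // 'topography' stands in for a digital elevation model lookup (hypothetical);
  // the tanh transition profile is an assumption of this sketch.
  double colorFunction(double x, double y, double z, double eps,
                       const std::function<double(double, double)>& topography) {
    const double signedDistance = topography(x, y) - z;    // > 0 below the surface
    return 0.5 * (1.0 + std::tanh(signedDistance / eps));  // smooth 0..1 transition
  }

Because alpha is evaluated directly from elevation data at initialisation time, the Cartesian tree mesh can remain unchanged across different topographies and scenario setups.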
Relativistic Astrophysics
In astrophysics, the target application was to simulate the merger of a binary system of two neutron stars. Such a merger possibly creates a black hole and emits a strong gravitational-wave signal. These scenarios are substantially more complex than the merger of two black holes, which was the source of the first experimental observation of gravitational waves in 2016. Predicting the exact signal of such events via simulations is essential to experimentally identify and interpret observed black-hole and neutron-star events.
In the project we developed high-order ADER-DG methods for various PDE models that approximate the problems described above with increasing model complexity. In addition to excellent parallel scalability, DG methods promise a high order of accuracy for smooth solutions and, in general, solutions that are less dissipative and dispersive than those of more widely used methods such as finite volume schemes. However, the development of high-order DG methods for models that capture the full complexity of relativistic astrophysics also poses numerical challenges that have not been solved yet.
To solve the respective models in the ExaHyPE engine, the system of PDEs must be recast into hyperbolic first-order form. Deriving this first-order formulation of the CCZ4 equations (FO-CCZ4) was non-trivial and one of the early outcomes of the project.
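The underlying technique can be illustrated with a much simpler model (a generic sketch, not the actual FO-CCZ4 derivation): second derivatives are eliminated by promoting gradients of the evolved fields to auxiliary variables. For the scalar wave equation \partial_{tt} u = c^2 \Delta u, introducing v = \partial_t u and w_i = \partial_i u yields the first-order system

  \partial_t v - c^2 \, \partial_i w_i = 0, \qquad \partial_t w_i - \partial_i v = 0,

which fits the generic form given in the introduction, at the price of additional variables and compatibility constraints (\partial_i w_j = \partial_j w_i) that the numerical scheme has to preserve.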
Using SuperMUC-NG we have run a sequence of models and scenarios of increasing complexity. We initially achieved the numerically stable evolution of isolated compact astrophysical objects, such as a single black hole with the FO-CCZ4 PDE and a single neutron star (TOV solution) with the GR(M)HD PDE, both solved for the first time with the novel ADER-DG scheme implemented in ExaHyPE [4].
Ongoing Research / Outlook
The ExaHyPE engine and the models based on it are being further developed in various projects, such as the Horizon 2020 projects ChEESE (a centre of excellence for exascale computing related to solid Earth) and ENERXICO (energy and supercomputing for Mexico), the project TEAR granted to PI Alice-Agnes Gabriel as an ERC Starting Grant, and the UK-funded project ExaClaw (PI Tobias Weinzierl).
For ChEESE, in particular, we are developing a pilot demonstrator for a future urgent computing service that quickly assesses potential hazards after a detected earthquake; this is also the topic of a follow-up SuperMUC-NG project.
References and Links
[1] ExaHyPE project: www.exahype.eu
[2] ExaHyPE software: www.exahype.org
[3] A. Reinarz, et al., Comp. Phys. Commun. 254, 2020. http://dx.doi.org/10.1016/j.cpc.2020.107251
[4] M. Dumbser, F. Fambri, E. Gaburro, A. Reinarz, J. Comp. Phys. 404, 2020. https://doi.org/10.1016/j.jcp.2019.109088
[5] M. Tavelli, M. Dumbser, D. E. Charrier, L. Rannabauer, M. Bader, J. Comp. Phys. 386:158–189, 2019. https://doi.org/10.1016/j.jcp.2019.02.004
Research Team
Principal Investigators: Michael Bader(1), Michael Dumbser(4), Alice-Agnes Gabriel(2), Luciano Rezzolla(5), Tobias Weinzierl(3)
Researchers
Luke Bovard(5), Dominic E. Charrier(3), Kenneth Duru(2), Francesco Fambri(4), Jean-Matthieu Gallard(1), Benjamin Hazelwood(3), Sven Köppel(5), Leonhard Rannabauer(1), Anne Reinarz(1), Philipp Samfaß(1), Maurizio Tavelli(4)
(1)Technical University of Munich
(2)Ludwig-Maximilians-Universität München
(3)Durham University
(4)University of Trento
(5)FIAS – Frankfurt Institute for Advanced Studies
Project Partners
RSC Group, Bayerische Forschungsallianz (BayFor), Leibniz Supercomputing Centre (LRZ)
Scientific Contacts
Prof. Dr. Michael Bader
Technical University of Munich (TUM)
Chair of Scientific Computing in Computer Science (SCCS) of TUM’s informatics department
Boltzmannstr. 3, D-85748 Garching (Germany)
e-mail: bader [at] in.tum.de
https://www.in.tum.de/i05/bader/
Prof. Dr. Alice-Agnes Gabriel
Ludwig-Maximilians-Universität München (LMU)
Department of Earth and Environmental Sciences, Geophysics
Theresienstr. 41, D-80333 Munich (Germany)
e-mail: alice-agnes.gabriel [at] geophysik.uni-muenchen.de
NOTE: This report was first published in the book "High Performance Computing in Science and Engineering – Garching/Munich 2020 (2021)" (ISBN 978-3-9816675-4-7)
Local project ID: pr48ma
September 2021