Our research highlights serve as a collection of feature articles detailing recent scientific achievements on GCS HPC resources.
In 2015, astrophysicists, astronomers, and astrophiles celebrated an exciting development. For roughly 100 years, researchers had hypothesized the existence of gravitational waves — ripples in space-time caused by massive, violent events in the cosmos such as supernovas or neutron star mergers — but had never seen direct evidence of their existence. When the Laser Interferometer Gravitational-Wave Observatory (LIGO) in the United States definitively detected a gravitational wave event, researchers set out to build on this finding to advance astrophysics research.
“Since 2015, we’ve seen about 100 gravitational wave events. This kind of research is extremely new, and it has a huge potential for additional inputs that we can’t currently get through other observations,” said Prof. Tim Dietrich, a researcher at the University of Potsdam. Since then, Dietrich and his research group have been using high-performance computing (HPC) resources to simulate the cosmic phenomena that produce gravitational waves. Specifically, the team has been using the High-Performance Computing Center Stuttgart’s (HLRS’s) Hawk supercomputer to simulate what happens when binary neutron stars collide. Over the last three years, the team has made significant strides modelling these complex celestial events in unprecedented detail, and simulations on Hawk have contributed to more than a dozen scientific journal articles, including publications in the leading journals Nature and Science.
Neutron stars are essentially the fossil relics of massive stars that have reached the ends of their lives. When such a star runs out of fuel, its core collapses while its outer layers are blown explosively outward in a supernova. These massive events not only produce gravitational waves, but also scatter heavy elements and other material across the universe. The remaining material cools (relatively speaking) and consolidates further, becoming an ultra-dense neutron star. (The name comes from the fact that after a supernova, neutrons make up the bulk of the remaining material.) If two of these objects drift too close to one another, their strong mutual gravitational pull causes them to merge, forming a more massive neutron star or collapsing into a black hole.
Astrophysicists can detect neutron star mergers through their observational signatures, such as gravitational waves. To complement these methods, scientists also simulate these events using supercomputers. This makes it possible to understand at a fundamental level how these events produce gravitational waves and electromagnetic signals, and eject materials across the universe.
To do that, though, researchers need to have a reliable model that can accurately represent the complex physics interactions taking place at a wide variety of scales within these massive systems. This requires world-leading HPC resources, and even today’s most powerful machines cannot completely simulate these events from first principles. For Dietrich and his collaborators, this has meant finding ways to improve computational efficiency without sacrificing realistic physics in their simulations.
For the team, simulating neutron star mergers realistically means including so-called multi-messenger physics information. As the name implies, multi-messenger physics collates information describing multiple physical phenomena to get a more comprehensive picture of materials’ behaviors at a fundamental level. Measurements of messengers including photons (light), a mysterious class of elementary particles called neutrinos, high-energy cosmic rays, and gravitational waves provide valuable, detailed information for researchers at both small and large scales, but are very difficult to integrate into a single simulation that accurately represents the entire system. “We need to perform 5,000 operations for the evolution of a single point in our computational grid,” said Anna Neuweiler, a PhD candidate in Dietrich’s group and collaborator on the project. “Of course, our grid comprises many points, so for even just one time evolution, we need a lot of capacity to compute and solve our equations.”
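A quick back-of-envelope calculation shows why that per-point figure adds up so fast. The sketch below is purely illustrative: the ~5,000 operations per point comes from the quote above, but the grid dimensions and step count are assumptions, not the team’s actual production setup.

```python
# Rough cost estimate for evolving a 3D computational grid.
# OPS_PER_POINT comes from the quote above; everything else is assumed.
OPS_PER_POINT = 5_000  # operations to evolve a single grid point

def ops_per_timestep(nx, ny, nz, ops_per_point=OPS_PER_POINT):
    """Total operations for one time evolution of an nx * ny * nz grid."""
    return nx * ny * nz * ops_per_point

# Hypothetical example: a 512^3 grid evolved for 100,000 timesteps.
per_step = ops_per_timestep(512, 512, 512)
total = per_step * 100_000
print(f"{per_step:.2e} ops per step, {total:.2e} ops in total")
```

Even this modest hypothetical grid needs on the order of 10^11 operations per step, which is why such simulations are only feasible on machines like Hawk.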
Using Hawk, the team has been able to get closer to an accurate simulation of neutron star mergers based on first principles by selectively lowering the resolution of portions of its simulation that are less pertinent to the research. In addition, the researchers compare multi-messenger physics data in their simulations with complementary heavy-ion collision experiments being run at specialized experimental facilities on Earth. This combined approach has enabled the team both to advance the state of the art in researching binary neutron star mergers and to create a reliable application that they hope to enrich with even more first-principles physics calculations in the years to come. While the team has run larger simulations on other systems, access to Hawk laid the foundations for their successful simulation approach in the recent past.
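The payoff of selectively lowering resolution follows from simple scaling: in three dimensions, the number of grid points grows with the cube of the inverse grid spacing, so coarsening the far field dramatically shrinks the total point count. The sketch below is not the team’s code; the box sizes and spacings are invented purely to illustrate the scaling.

```python
# Illustrative scaling argument for selective (multi-level) resolution.
# All region sizes and spacings below are hypothetical.

def points_in_cube(side_km, spacing_km):
    """Grid points needed to cover a cubic region at a uniform spacing."""
    n = int(side_km / spacing_km)
    return n ** 3

fine_inner   = points_in_cube(100, 0.25)   # fine grid near the merging stars
coarse_outer = points_in_cube(1000, 2.0)   # coarse grid for the far field
uniform_fine = points_in_cube(1000, 0.25)  # cost of resolving everything finely

print(f"two-level grid: {fine_inner + coarse_outer:.2e} points "
      f"vs uniformly fine: {uniform_fine:.2e} points")
```

With these made-up numbers, the two-level grid needs hundreds of times fewer points than a uniformly fine one, which is the kind of saving that makes first-principles merger simulations tractable.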
“I don’t point to one big accomplishment in our work, because all these developments aim at getting a better understanding of the physics. This means that even simulations that move incrementally are still necessary, and might become even more important down the line. It is more like a marathon than a sprint,” Dietrich said.
Having successfully improved its code’s computational efficiency, the team is now focused on ways to include even more detail in its simulations. As part of her PhD research, Neuweiler has begun including first-principles magnetic field calculations in the team’s code, leading to a significant increase in computational demands. “Understanding the role that magnetic fields play is mostly important for simulating what happens after neutron stars merge so that we have a more accurate description of how matter flows,” she said. “We would like to gain a more accurate description, and in principle we have additional equations and variables that we can use, but it will be more computationally expensive than what we are currently doing.”
Dietrich also indicated that in the future the team would like to include accurate descriptions of turbulence at the smallest scales of their simulations, as well as details about neutrino physics that are becoming available as astrophysicists learn more about these mysterious particles. Researchers are also looking forward to the next gravitational wave observing run, expected to begin in May 2023, and another in 2026. With each new event the team will gain access to valuable observational data, enabling them to refine their simulations further. “There will be a lot of instances where we can use our simulations to interpret things better, and we need the resources to do the analysis of the computational data,” Dietrich said. “So, one thing is for sure — we will definitely not be asking for less computational time moving forward.”
Funding for Hawk was provided by the Baden-Württemberg Ministry for Science, Research, and the Arts and by the German Federal Ministry of Education and Research through the Gauss Centre for Supercomputing (GCS).
This article originally appeared on the High-Performance Computing Center Stuttgart website.