Our research highlights serve as a collection of feature articles detailing recent scientific achievements on GCS HPC resources.
In April 2019 an international team of researchers shared the first images ever created of a black hole.
This landmark accomplishment required hundreds of scientists collaborating over many years to collect and analyze huge datasets.
Among the coauthors of the first results, published in The Astrophysical Journal Letters, was Prof. Dr. Luciano Rezzolla of Goethe University Frankfurt, who also serves as a principal investigator on two Gauss Centre for Supercomputing (GCS) projects. Using GCS high-performance computing (HPC) resources at the Leibniz Supercomputing Centre (LRZ) and the High-Performance Computing Center Stuttgart (HLRS), his team created a model describing the plasma around the black hole at the centre of the galaxy Messier 87 (M87). The investigators also developed a database of synthetic black hole images under different conditions, which were compared with the telescope observations to test the models’ accuracy.
The overall effort, organized by the Event Horizon Telescope (EHT) consortium, started in 2009 and set out to create detailed, high-resolution images of the M87 black hole. However, because black holes emit no light of their own and M87 is nearly 55 million light years away from Earth, simply pointing a camera and snapping a photo is not an option. Instead, researchers in the EHT collaboration combined observation data from radio telescopes with HPC simulations of phenomena in the vicinity of the M87 black hole — such as the properties of plasma and other materials — to generate images of physical phenomena that are impossible to see with the naked eye.
Integrating this radio astronomy data was a major undertaking. Considering the distance to M87, the black hole's massive size, and the complex fluid and particle dynamics resulting from its gravitational field, high-performance computing was essential in turning the raw data into something that researchers—and in turn, the world—could better understand. By using GCS HPC resources, Rezzolla and his colleagues provided one important piece of the puzzle.
One of the Rezzolla team’s goals is to understand how the laws of gravity behave in the extreme gravitational fields surrounding a black hole. In addition to being strong enough to prevent light from escaping, these fields significantly distort the behaviour of astrophysical plasmas and other materials as they approach the centre. The team developed its own in-house codes: BHAC, which performs magneto-hydrodynamic (MHD) simulations to study the properties of plasma and other fluids; BHOSS, which models how radiation propagates near a black hole; and GENA, which constructs realistic images of black holes, ultimately enabling the team to extract information from observation data. When the three codes are combined, the team gets an accurate picture of how material orbits a black hole as it is pulled inward.
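The division of labour between the three codes can be pictured as a pipeline. The sketch below is a toy stand-in, not the actual BHAC/BHOSS/GENA interfaces: a crude one-dimensional "plasma" profile is evolved, integrated along rays into a brightness profile, and scored against an observation.

```python
# Toy sketch of the three-stage pipeline described above. All function
# names and physics are illustrative stand-ins for the real codes.

def simulate_plasma(n=32, steps=10):
    """Stand-in for the MHD step (BHAC's role): evolve a 1D density
    profile that falls off with distance from the hole at index 0,
    crudely drifting material inward at each step."""
    rho = [1.0 / (1.0 + x) for x in range(n)]
    for _ in range(steps):
        rho = [rho[min(i + 1, n - 1)] for i in range(n)]
    return rho

def ray_trace(rho):
    """Stand-in for radiative transfer (BHOSS's role): integrate
    emissivity along each 'ray' to get a brightness profile."""
    return [sum(rho[i:]) for i in range(len(rho))]

def compare(image, observed):
    """Stand-in for the image-fitting step (GENA's role): mean squared
    difference between a synthetic and an observed brightness profile."""
    return sum((a - b) ** 2 for a, b in zip(image, observed)) / len(image)

profile = simulate_plasma()
image = ray_trace(profile)
print(compare(image, image))  # → 0.0 (an image matches itself exactly)
```

The point of the structure, mirrored in the real workflow, is that each stage only consumes the previous stage's output, so the plasma physics, the radiation transport, and the image fitting can each be improved independently.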
In addition to being a principal investigator on two GCS allocations, Rezzolla also serves as a PI on BlackHoleCam, a European Research Council (ERC)-funded project that combines theory and observations to study the properties of supermassive black holes. The ERC awarded BlackHoleCam the largest-ever funding allocation for an astrophysics project.
The team knew it would need to reconstruct images by studying how light and plasma bend as they approach a black hole’s event horizon, the threshold where the gravitational pull is so strong that nothing can escape it. This creates a dark “shadow” set against the bright emission from the surrounding material.
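The scale of that shadow can be estimated from first principles. The back-of-the-envelope sketch below uses the standard Schwarzschild-radius formula together with the published EHT figures for M87's mass (about 6.5 billion solar masses) and distance; the √27 capture-cross-section factor assumes a non-rotating black hole, so the real value differs slightly.

```python
# Rough scale of the M87 black hole's shadow, using standard constants.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_SUN = 1.989e30       # solar mass, kg
M = 6.5e9 * M_SUN      # M87 black hole mass (EHT estimate)
D = 55e6 * 9.461e15    # distance: 55 million light years in metres

r_s = 2 * G * M / c**2                   # Schwarzschild radius (event horizon)
shadow = 27**0.5 * r_s                   # apparent shadow diameter
                                         # (non-rotating approximation)
theta_uas = shadow / D * 206265 * 1e6    # angular size in microarcseconds

print(f"r_s ~ {r_s:.2e} m, shadow ~ {theta_uas:.0f} microarcseconds")
```

This gives a horizon radius of roughly 2×10¹³ m and an apparent shadow of about 40 microarcseconds, which is why imaging it stretched radio interferometry to its limits.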
While the black hole’s shadow makes it difficult for researchers to “take a picture” in the traditional sense, it helps reveal the profound impact that black holes have on their surroundings, giving researchers the opportunity to learn more about black hole properties.
“Modeling plasma near a black hole is a highly nonlinear problem that includes a number of instabilities and turbulent flows,” Rezzolla said. “Such phenomena are difficult to model under normal circumstances, and these conditions are only amplified near a black hole. You are essentially studying motion happening close to the speed of light in an environment being distorted by an extreme gravitational pull.”
Using the Hazel Hen supercomputer at HLRS and the SuperMUC supercomputer at LRZ, as well as its own in-house cluster at Goethe University Frankfurt, Rezzolla's team successfully modelled the plasma dynamics near the centre of M87. About half of the simulations used by the EHT were computed by the group in Frankfurt using HPC resources.
As part of their GCS allocations, the researchers also used HPC to run MHD simulations that can accurately model electromagnetic properties in materials, such as plasma, to develop a large synthetic image database—that is, "imitation" black hole images based on their simulations. The approximately 60,000 images in the database represent what a black hole would look like under a wide variety of conditions. By comparing them to the relatively few available observation images, researchers are able to differentiate properties that are unique to the M87 black hole from more general black hole phenomena.
To this end, the team developed GENA, a code based on a genetic algorithm (a class of algorithms inspired by evolutionary processes). When comparing the synthetic images with observations, GENA identifies the candidates that best match the data and evolves them into a new “generation” containing only the best “genes,” steadily closing the gap between the simulated image and the observed one. Researchers repeat this process over several generations until they find the best match and isolate the synthetic images that are closest to the observations. “It is like entering a stadium with one blurred photo and needing to find that person in the crowd,” Rezzolla said. “We are improving our modelling of plasma, but also getting better at our ability to differentiate between stable and fluctuating features in these images.”
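The evolutionary idea can be illustrated with a minimal genetic algorithm. Everything below is a toy stand-in for GENA, not its actual interface: the “synthetic image” is a one-dimensional Gaussian brightness profile with two free parameters (its genes), and fitness is the squared mismatch against an “observed” profile.

```python
import math
import random

def synthetic_image(params, n=16):
    """Toy image model: a Gaussian brightness bump whose centre and
    width are the two 'genes' of an individual."""
    centre, width = params
    return [math.exp(-((i - centre) / width) ** 2) for i in range(n)]

def mismatch(params, observed):
    """Fitness (lower is better): squared difference between the
    synthetic profile and the observed one."""
    img = synthetic_image(params, len(observed))
    return sum((a - b) ** 2 for a, b in zip(img, observed))

def evolve(observed, pop_size=40, generations=60, seed=1):
    """Keep the best quarter of each generation, then breed children by
    averaging two parents (crossover) plus Gaussian noise (mutation)."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 15), rng.uniform(0.5, 8)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: mismatch(p, observed))
        parents = pop[: pop_size // 4]          # only the best "genes" survive
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            children.append(((a[0] + b[0]) / 2 + rng.gauss(0, 0.3),
                             max(0.1, (a[1] + b[1]) / 2 + rng.gauss(0, 0.3))))
        pop = parents + children
    return min(pop, key=lambda p: mismatch(p, observed))

# Toy usage: recover the parameters that generated an "observed" profile.
target = (9.0, 3.0)
observed = synthetic_image(target)
best = evolve(observed)
```

After a few dozen generations the best individual's genes sit close to the target parameters, which is the same logic GENA applies at vastly larger scale to full ray-traced black hole images.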
Beyond the horizon
The Rezzolla team is part of the early user program for LRZ’s next-generation supercomputer, SuperMUC-NG, and is active in AstroLab, one of LRZ’s several domain-specific community support application labs. The application labs allow users from specific scientific fields to exchange advice about best practices for using LRZ’s systems and facilitate contact with specific members of the LRZ user support team. These staff members not only help with technical aspects of porting or debugging a code, but also understand the team’s scientific goals and help the team find topic-specific solutions.
The team credits its important contribution to the EHT project in part to its successful collaborations with both HLRS and LRZ staff. “This level of staff support is important value added for HPC centres,” Rezzolla said, indicating that support from GCS staff helped the team optimize its code for the centre-specific architectures. “At HLRS, Björn Dick and Stefan Andersson were a great help in optimizing our software for Hazel Hen, and LRZ’s Nicolay Hammer and Luigi Iapichino have been extremely helpful in dedicating their time and perspective.”
Moving forward, the team is focused on making sure that its codes can make the most of next-generation supercomputers, no matter what system they are running on. As part of the early user program at LRZ, which grants select researchers early access to SuperMUC-NG as it is being installed, the Rezzolla team was able to begin porting its code to the new architecture right away, allowing it to make better use of the machine once it is fully online.
In the future, the team is looking forward to using next-generation supercomputers to better capture and understand images of other black holes, including the supermassive black hole at the core of our own Milky Way galaxy. The challenge in modelling our own galactic centre comes from its proximity and the speed at which phenomena there change.
“Looking at our galactic centre, there is an additional complication, because the timescale with respect to the image changing is shorter than the time scale that we can record the data,” Rezzolla said. “It is like trying to take a picture of something moving all over the place very quickly. All of our physical and technological expertise will be required to handle this new challenge and it is comforting to know that we can rely on the excellent supercomputing infrastructure that GCS makes available.”