Methane-oxygen diffusion flame at low (top) and high (bottom) Reynolds numbers. The increased turbulence allows faster conversion of the reactants into products, enhancing combustion efficiency. Image credit: Chair of Space Propulsion, TUM.
Supercomputers around the world have spent many millions of core hours on one of physics’ last great “grand challenges”: finding order in the seemingly random, chaotic motion of fluids, known as turbulence. Turbulence affects aircraft aerodynamics, fuel efficiency in combustion engines, and the safety of many industrial processes, among countless other applications.
Scientists and engineers have developed ways to work around turbulence, but despite its major implications for so many facets of our lives, researchers still have only a limited understanding of it on a fundamental level.
In recent years, researchers in the Department of Aerospace and Geodesy at the Technical University of Munich (TUM) have used high-performance computing (HPC) resources at the Leibniz Supercomputing Centre (LRZ) to better understand turbulence for space propulsion.
“You are more limited in your ability to measure information in experiments, because if you are in a high-temperature or high-pressure environment, you can’t just use a probe to measure what is happening during an experiment,” said Andrej Sternin, lead researcher on the project.
With access to world-class HPC resources, the team has been running large direct numerical simulations (DNS) to better understand turbulent fluids’ behaviour in these environments.
Forward propulsion
Over the last decade, computer simulation has become one of the primary methods for advancing knowledge of turbulent fluid flows. In many cases, experiments are too expensive or time-consuming to do in the frequent, iterative way necessary to advance scientific knowledge. In other cases, such as in fuel injectors for combustion engines or space propulsion systems, extreme heat, pressure, or radiation can make it difficult for researchers to get accurate measurements and observations.
Space propulsion in particular requires high-pressure environments, which help to efficiently convert the energy generated by burning fuel into thrust for the engine.
While several different methods exist for modelling turbulence, most essentially follow a “divide and conquer” strategy: researchers model a fluid flow on a fine-grained grid that breaks the system into many smaller cells for calculations, then advance the simulation in small time steps so they can observe how the smallest swirls, or eddies, influence the larger fluid flow.
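To illustrate the basic pattern of storing a field on a grid of cells and advancing it in many small time steps, here is a minimal Python sketch. It is a toy diffusion solver with arbitrarily chosen parameters, not the TUM team’s code or a full turbulence simulation.

```python
import numpy as np

# Toy grid-based solver: a scalar field is stored on a grid of small cells
# and advanced in many small time steps. Grid size, diffusivity, and the
# time step are arbitrary choices, for illustration only.
nx, ny = 128, 128          # number of grid cells in each direction
dx = 1.0 / nx              # cell spacing
nu = 1e-3                  # diffusivity (a stand-in for viscous effects)
dt = 0.2 * dx**2 / nu      # time step kept small for numerical stability

field = np.zeros((nx, ny))
field[nx // 2, ny // 2] = 1.0   # an initial disturbance in the centre

def step(f):
    """Advance the field by one small time step using a 5-point Laplacian."""
    lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
           np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f) / dx**2
    return f + dt * nu * lap

for _ in range(1000):      # many small steps; the disturbance spreads and decays
    field = step(field)
```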
DNS is the most accurate way to model these complex, subtle interactions. The method resolves every scale of motion directly from the governing equations, introducing no modelling assumptions about how the fluid will behave, but it also requires immense computing power: so much, in fact, that most DNS is limited to modelling small systems over short periods of time. To model larger systems, many researchers perform large-eddy simulations (LES), which make assumptions about how the smallest eddies behave and extrapolate their effect onto the rest of the system.
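For context, one widely used example of such an assumption (not necessarily the closure used in this project) is the Smagorinsky subgrid-scale model, which replaces the effect of eddies smaller than the filter width (Δ below) with an eddy viscosity computed from the resolved strain rate:

```latex
\[
  \nu_t = (C_s \Delta)^2 \,\lvert \bar{S} \rvert,
  \qquad
  \lvert \bar{S} \rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}},
  \qquad
  \bar{S}_{ij} = \tfrac{1}{2}\left(
      \frac{\partial \bar{u}_i}{\partial x_j}
    + \frac{\partial \bar{u}_j}{\partial x_i}\right)
\]
```

DNS avoids any such closure by resolving eddies all the way down to the dissipative scales, which is precisely what makes it so computationally expensive.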
In the interest of accuracy, though, Sternin and Daniel Martinez have been running “quasi-DNS” simulations to more fully understand these interactions for space propulsion systems. Quasi-DNS lets the researchers reduce the computational cost while still resolving the smallest eddies and their influence on the larger system.
With the help of LRZ staff, Martinez, Sternin, and their collaborators have developed a process to run suites of these simulations. “At this point, this looks like an industrial process, and for that it was necessary to start from ground principles and locate the bottleneck,” Martinez said. “We finally came to understand where we can advance in the state-of-the-art.”
By running high-accuracy DNS, the team is providing valuable insights into fluid behaviour and, in turn, generating valuable data that can be used to improve the inputs for simulations that must rely on more assumptions.
HPC centres provide added value for cutting-edge research
During the team’s allocation, it worked closely with LRZ user support specialist Martin Ohlerich. Being relative newcomers to computing at this extreme scale, the team needed support to improve its efficiency. Ohlerich more than delivered. “Martin was our guardian angel, and this is something you don’t encounter every day,” Sternin said. “He did a lot more than he had to, but I think he also had some fun and became part of our team. It wasn’t like a normal customer service experience, and it wasn’t possible to run our simulations at this scale without his help.”
Martinez also pointed out that the team’s advisor at TUM, Prof. Dr. Oskar Haidn, created a collaborative environment which motivated all members to devote themselves to their large, expansive research mission.
Moving forward, the team hopes to try an alternative computational method that has more recently been applied to turbulence modelling: smoothed-particle hydrodynamics (SPH). Unlike traditional grid-based turbulence modelling, SPH-based multi-physics simulations give researchers more geometric flexibility by treating the system as a collection of particles.
“SPH-based multi-physics gives us a better flow-structure interaction as well as the possibility to change mesh all the time without spending a lot of CPU power on that process,” Sternin said. “If we changed our mesh all the time with our current method, we would lose a lot of time, but this SPH method allows us to adapt our resolution more locally so we could afford to add even more aspects of chemistry, radiation, and the like into our simulations.”
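At the heart of SPH is the idea that fluid properties are carried by the particles themselves and reconstructed anywhere as a kernel-weighted sum over neighbours. The following minimal Python sketch estimates density in 2D with a standard cubic-spline smoothing kernel; it illustrates the general method only, and is not the team’s solver.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 2D cubic-spline smoothing kernel W(r, h)."""
    sigma = 10.0 / (7.0 * np.pi * h**2)   # 2D normalisation constant
    q = r / h
    return sigma * np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                   np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

def density(positions, masses, h):
    """SPH density estimate: rho_i = sum_j m_j * W(|r_i - r_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]  # pairwise offsets
    r = np.linalg.norm(diff, axis=-1)                     # pairwise distances
    return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

# Example: estimate density for 100 particles scattered in a unit square
rng = np.random.default_rng(0)
pos = rng.random((100, 2))                 # particle positions
m = np.full(100, 1.0 / 100)                # equal particle masses
rho = density(pos, m, h=0.1)               # smoothing length h is arbitrary here
```

Because the particles move with the flow, there is no fixed mesh to regenerate when the geometry changes, which is the flexibility Sternin describes above.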
-Eric Gedenk