Our research highlights serve as a collection of feature articles detailing recent scientific achievements on GCS HPC resources.
When you think of turbulence, you might think of a bumpy plane ride. Turbulence, however, touches far more of our lives than air travel: ocean waves, smoke from a fire, even the noise coming from jet engines and wind turbines all involve turbulence.
A team of researchers at RWTH Aachen University’s Institute of Aerodynamics (AIA) has long been interested in using computation to understand turbulence, one of the great unsolved problems of fluid dynamics, and how it relates to aircraft noise, fuel efficiency, and the transport of pollutants, among other research interests.
The team has been using the Cray XC40 Hazel Hen supercomputer at the High-Performance Computing Center Stuttgart to study turbulent multiphase flows—the movement of two materials in different states (such as solids and liquids) or materials in the same state that, for chemical reasons, cannot mix (such as oil and water). The team is also working to improve the accuracy of turbulence simulations on more modest computers.
Recently, the team published a paper in the Journal of Fluid Mechanics detailing its roadmap towards better modeling of turbulent multiphase flows. The work supports the team’s larger interdisciplinary goals. “This project is part of a bigger research unit where we research how to make coal power plants more environmentally friendly regarding their CO2 emissions,” said RWTH researcher Dr. Matthias Meinke.
During combustion, gases mix with tiny, solid particulates, meaning that realistic simulations can contain billions of these complex, multiphase interactions. To rein in the enormous computational cost of such calculations, many researchers rely on simplified models of particle motion in a flow. These simplifications, however, can hurt the accuracy and, in turn, the predictive power of the simulations.
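To give a sense of what such a simplified model looks like: one common approach treats each particle as a point mass relaxing toward the local fluid velocity under Stokes drag. The Python sketch below is a generic illustration of this idea, with assumed parameters; it is not the AIA team’s code or method.

```python
import numpy as np

# Minimal point-particle sketch: dv_p/dt = (v_fluid - v_p) / tau_p,
# where tau_p is the particle relaxation time (Stokes drag).
# Generic illustration with assumed values, not the team's model.

def step_particle(v_p, v_fluid, tau_p, dt):
    """Advance the particle velocity one explicit-Euler time step."""
    return v_p + dt * (v_fluid - v_p) / tau_p

v_p = np.zeros(3)                  # particle starts at rest [m/s]
v_f = np.array([1.0, 0.0, 0.0])    # assumed local fluid velocity [m/s]
tau_p, dt = 1e-3, 1e-4             # assumed relaxation time and time step [s]

for _ in range(100):
    v_p = step_particle(v_p, v_f, tau_p, dt)
print(v_p)  # after ~10 relaxation times, the particle nearly matches the fluid
```

Models like this are cheap precisely because the flow around each individual particle is never resolved; that unresolved detail is what the team’s more expensive approach restores.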
The RWTH Aachen team wants to improve its computational models to account for the small interactions that have a big impact on turbulent flows. “We wanted to figure out a more detailed method that is necessary for us to understand these particle-laden flows when the particles are extremely small,” said Prof. Dr. Wolfgang Schröder, AIA Director and collaborator on the team’s project. “These particles actually define the efficiency of the overall combustion process, and that is our overall objective because, from an engineering perspective, we want to make the models that describe these types of processes more accurate.”
Scaling up by scaling down
Essentially, turbulence happens when a flow gets too excited. Be they liquids or gases, all fluids have some form of viscosity, which helps corral the kinetic energy (energy of movement) in a flow. If the energy in a flow is high, and the fluid isn’t thick, or viscous, enough to dissipate the energy, the movement goes from very orderly (laminar flow) to chaotic (turbulent flow). This chaos is passed down from larger to smaller scales until the fluid’s viscosity once again gains control of the flow by turning the kinetic energy into heat.
The smallest scale—where kinetic energy is transformed into heat and viscosity once again takes control of the flow—is called the Kolmogorov scale.
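For readers who want a feel for the numbers: the Kolmogorov length scale is commonly estimated as η = (ν³/ε)^(1/4), where ν is the fluid’s kinematic viscosity and ε is the rate at which turbulent kinetic energy is dissipated. The short Python sketch below uses generic, illustrative values for air (not figures from the team’s work) to show just how small this scale is.

```python
# Illustrative estimate of the Kolmogorov length scale,
# eta = (nu**3 / epsilon)**(1/4).
# Generic textbook values for air; not parameters from the AIA study.

nu = 1.5e-5       # kinematic viscosity of air at room temperature [m^2/s]
epsilon = 1.0     # assumed turbulence dissipation rate [m^2/s^3]

eta = (nu**3 / epsilon) ** 0.25
print(f"Kolmogorov length scale: {eta * 1e6:.0f} micrometres")
# -> roughly 240 micrometres; in intense flows (larger epsilon) it shrinks further.
```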
The team wanted to compute the turbulent flow up to the Kolmogorov scale with the most accurate fluid dynamics method possible.
Many researchers studying fluid dynamics problems related to turbulence use Large-Eddy Simulations (LES), which lessen the computational cost by modeling, rather than resolving, what happens at the smallest scales. The most realistic way to calculate turbulent processes, however, is Direct Numerical Simulation (DNS). DNS resolves all scales of motion without modeling assumptions, improving accuracy at a much higher computational cost.
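A rough way to see why DNS is so much more expensive: a standard textbook estimate (not a result from this paper) says that resolving a turbulent flow down to the Kolmogorov scale requires a number of grid points that grows roughly as Re^(9/4), where Re is the Reynolds number. The sketch below illustrates that scaling.

```python
# Back-of-the-envelope DNS cost: grid points ~ Re**(9/4).
# Standard order-of-magnitude estimate for homogeneous turbulence;
# illustrative only, not a figure from the team's paper.

def dns_grid_points(reynolds_number: float) -> float:
    """Rough total grid-point count for a DNS resolving the Kolmogorov scale."""
    return reynolds_number ** (9 / 4)

for re in (1e3, 1e4, 1e5):
    print(f"Re = {re:.0e}: ~{dns_grid_points(re):.1e} grid points")
# Each tenfold increase in Re costs roughly 180x more grid points.
```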
Using Hazel Hen, the team ran a DNS of a system of 45,000 particles whose size is on the order of the Kolmogorov scale. To the team’s knowledge, this is the largest simulation of particles at this scale to date, and it serves as a benchmark for other researchers studying this process who want more realistic simulation results. Achieving this “best of both worlds,” DNS accuracy for particles at the Kolmogorov scale, required a world-class supercomputer and world-class support.
“Considering the final outcome, it would not have been possible to do this kind of research—to perform the calculations and do the analysis—without Hazel Hen. Without this machine, there would be no way to compete with other international research groups in this area,” Schröder said.
“It is tricky to get everything working as it should be, especially on such large-scale platforms,” Meinke said. “If we want to do post-processing, we need specialization. We are constantly testing new parallel file systems, because writing data back to the disk is a major bottleneck. For all these things, we are constantly in contact with and get valuable support from the HLRS staff.”
Accuracy for all
With the success of its large-scale DNS runs on one of the world’s fastest supercomputers, the team is now turning its attention to improving the accuracy of turbulence simulations for researchers who might not have access to supercomputers.
The team is beginning to develop methods for feeding the data from its DNS runs into simpler, less computationally intensive approaches. This will not only enable the team to run more simulations, but also much larger ones with a higher degree of accuracy.
This will benefit not only researchers but also industry. “We have to verify our simplified models so they are valid, and that is important for people designing coal power plants. They have to use such models, otherwise they can’t accurately predict the whole process,” Meinke said.
As the Gauss Centre for Supercomputing delivers its next-generation systems to HLRS and its partner centres, the Jülich Supercomputing Centre and the Leibniz Supercomputing Centre in Garching near Munich, Schröder and Meinke are excited to dive into even more complex simulations.
“In our paper, we only consider spherical particles,” Schröder said. “There are other particles with a more needle-like shape with thin filaments, and these are necessary to simulate. We need to come up with a better model and generalize our analysis in such a way we can provide a model that can be used by other groups.”
–Eric Gedenk, e.gedenk@gauss-centre.eu