Prof. Jörg Schumacher
Technische Universität Ilmenau (Germany)
Local Project ID: pr62se
HPC Platform used:
SuperMUC of LRZ
Turbulent convection flows that evolve in horizontally extended domains are often organized in prominent and regular patterns on scales that exceed the characteristic layer height. Furthermore, these patterns, which we term turbulent superstructures of convection, evolve gradually in time [1,2]. This large-scale organization challenges the classical picture of turbulence, in which a turbulent flow is considered a tangle of irregular and chaotically moving vortices and swirls. Examples of superstructures in nature are cloud streets in the atmosphere of our planet or the granulation at the surface of the Sun. In the latter astrophysical case, this structure formation is additionally affected by magnetic fields that are generated inside the Sun. Our understanding of the origin of turbulent superstructures, their mechanics, their role in the turbulent transport of heat and momentum, and the influence of magnetic fields on their structure is still incomplete. High-resolution direct numerical simulations of the equations of turbulent fluid motion in the simplest setting of a turbulent convection flow, Rayleigh-Bénard convection in a layer that is uniformly heated from below and cooled from above, aim at a detailed study of these pattern formation processes in several working fluids with very different kinematic viscosities and thermal conductivities. To fully resolve the flows in horizontally extended domains, we have to rely on massively parallel supercomputers for our numerical investigations [2,3].
Results and Methods
We numerically solve the equations of motion that couple the dynamics of the velocity and temperature fields: the three-dimensional Boussinesq equations of thermal convection. The external magnetic field is assumed to be strong such that we can apply the quasi-static limit of magnetohydrodynamics. The thermal driving by the applied outer temperature difference is quantified by the Rayleigh number Ra, the properties of the working fluid by the Prandtl number Pr, and the strength of the applied magnetic field by the Hartmann number Ha. Astrophysical flows are mostly found at very low Prandtl numbers, which cause a very vigorous fluid turbulence in the convection system. This makes our numerical simulations particularly challenging, since all vortices have to be resolved. Two numerical methods are applied: a spectral element method and a second-order finite difference method. The latter is used when convection with magnetic fields is considered. The simulation domains are closed square cells with no-slip boundary conditions at all walls; the sidewalls are thermally insulated. Typical production runs in domains with an aspect ratio of 25:25:1 required 16,384 SuperMUC cores for the non-magnetic cases. The simulations with magnetic field required 4,096 cores for a box of aspect ratio 4:4:1. All simulations are long-term runs that involved sequences of several 48-hour jobs in a row. Over the two project years, this amounted to 80 million consumed core hours.
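The three control parameters mentioned above follow from the standard textbook definitions. As a minimal illustrative sketch (the function names and all numerical values below are assumptions for illustration, not data from this project):

```python
def rayleigh(g, alpha, delta_T, H, nu, kappa):
    """Rayleigh number Ra = g * alpha * dT * H^3 / (nu * kappa):
    thermal driving across a layer of height H."""
    return g * alpha * delta_T * H**3 / (nu * kappa)

def prandtl(nu, kappa):
    """Prandtl number Pr = nu / kappa: ratio of kinematic viscosity
    to thermal diffusivity of the working fluid."""
    return nu / kappa

def hartmann(B, H, sigma, rho, nu):
    """Hartmann number Ha = B * H * sqrt(sigma / (rho * nu)):
    strength of the applied magnetic field B for a fluid of
    electrical conductivity sigma and density rho."""
    return B * H * (sigma / (rho * nu)) ** 0.5
```

Liquid metals, as used in the magnetic simulations, combine a small nu with a large kappa and therefore sit at very low Pr, which is what drives the vigorous turbulence noted in the text.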
Figure 1 displays an example of turbulent superstructures of convection for our Rayleigh-Bénard setup. Regular roll patterns, reminiscent of those known from the onset of thermal convection, become visible once the small-scale turbulent fluctuations are removed (see the right panels).
We have conducted these studies for turbulent flows at different Prandtl and Rayleigh numbers and analysed the typical spatial and temporal scales of the superstructures. Both scales depend on the Rayleigh and Prandtl numbers; in particular, the Prandtl number dependence turned out to be rather complex at a fixed Rayleigh number. We also connected the temperature superstructures in the midplane with the strongest thermal plume clusters, which form in the boundary layers close to the top and bottom walls. Our study thus provides a recipe for separating the fast small-scale turbulent fluctuations from the gradually evolving large-scale patterns. The analysis may therefore be of interest for the modeling of mesoscale convection in natural systems.
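The separation of slow patterns from fast fluctuations can be mimicked by a simple sliding time average over a stack of snapshots. This is only an illustrative stand-in; the averaging scheme and window length below are assumptions, not the authors' exact procedure:

```python
import numpy as np

def remove_fast_fluctuations(snapshots, window):
    """Sliding average along the time axis (axis 0) of a stack of
    field snapshots. Fluctuations faster than `window` frames are
    damped while the slowly evolving large-scale pattern survives."""
    kernel = np.ones(window) / window
    return np.apply_along_axis(
        lambda series: np.convolve(series, kernel, mode="valid"),
        0, snapshots)
```

Applied to midplane temperature fields, such a filter reveals the quasi-steady roll pattern of Figure 1 that is otherwise hidden beneath the turbulence.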
Since Chandrasekhar's linear stability analysis of an infinitely extended layer, it has been known that either a strong rotation about the vertical axis or a strong constant vertical magnetic field stabilizes the thermal convection flow. Laboratory experiments on rotating Rayleigh-Bénard convection in closed cells, however, demonstrated that so-called wall velocity modes are formed. These modes persist beyond Chandrasekhar's calculated linear stability threshold. One further aim of our supercomputing project was to study whether such wall-attached modes also exist in magnetoconvection with a strong vertical magnetic field, where they have not been observed experimentally so far. The influence of such a strong vertical magnetic field on the structure of an originally highly turbulent convection flow in a liquid metal is illustrated in Figure 2. The grid resolution of this box is 2048 × 2048 × 513 points. The very strong magnetic field expels convection rolls from the cell centre, where heat can then be carried only by diffusion. Turbulence is completely suppressed, and fluid motion can proceed only in the form of up- and downwelling jets attached to the sidewalls. These jets persist far beyond the Chandrasekhar threshold, so convective heat transport continues, although the amount of transported heat is small. The challenge of these simulations is to resolve the very thin boundary layers at the top and bottom plates as well as those formed at the electrically insulated sidewalls. We could demonstrate for the first time the existence of these wall modes in a magnetoconvection flow in the presence of a very strong vertical magnetic field.
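For orientation, Chandrasekhar's classical result for an infinite layer gives the asymptotic onset Rayleigh number Ra_c ~ π²Ha² for Ha ≫ 1 (the free-slip result; the prefactor differs slightly for other boundary conditions). A small sketch of this estimate:

```python
import math

def chandrasekhar_threshold(hartmann):
    """Asymptotic onset Rayleigh number of magnetoconvection in an
    infinitely extended layer: Ra_c ~ pi**2 * Ha**2 for Ha >> 1
    (free-slip asymptotics; illustrative, not the exact finite-Ha
    or no-slip value)."""
    return math.pi**2 * hartmann**2
```

At, say, Ha = 1000 the bulk flow is linearly stable below Ra_c on the order of 10^7, which makes the heat-carrying wall jets observed far beyond this threshold all the more remarkable.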
On-going Research / Outlook
The presented numerical investigations would not have been possible without the use of the most powerful supercomputers. In both discussed examples, we studied turbulent convection in horizontally extended domains. The numerical effort for such runs typically grows with the square of the aspect ratio at a given Rayleigh and Prandtl number. The very large aspect ratio of 25 in the first part was necessary to minimize sidewall effects in the pattern formation processes and to reliably extract the typical superstructure pattern scales. Furthermore, SuperMUC made long-term simulations possible that resolved the very slow dynamics of the turbulent superstructures for the first time. An important question for future work will be how these superstructures vary once the Rayleigh number is further increased. In view of astrophysical convection phenomena, a further decrease of the Prandtl number is a second challenge that we want to address in the near future. Along these lines, machine learning methods such as deep convolutional neural networks are currently applied to reduce the very large amount of data and to extract essential physical information on the turbulent transport properties of these flows.
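The quadratic growth of cost with aspect ratio follows directly from the grid-point count. As a back-of-the-envelope sketch (the fixed vertical resolution and isotropic horizontal spacing are assumptions for illustration):

```python
def grid_points(gamma, nz):
    """Grid-point count of a box of aspect ratio gamma:gamma:1 at fixed
    vertical resolution nz. Assuming isotropic horizontal spacing, the
    horizontal counts scale with gamma, so the total grows like gamma**2."""
    nx = ny = gamma * nz
    return nx * ny * nz
```

Comparing the two production geometries of this project, aspect ratios 25 and 4, the point count alone differs by a factor of (25/4)² ≈ 39, consistent with the very different core counts quoted above.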
References and Links
J. D. Scheel, M. S. Emran, and J. Schumacher, New J. Phys. 15, 113063 (2013).
W. Liu, D. Krasnov, and J. Schumacher, J. Fluid Mech. 849, R2 (2018).
A. Pandey, J. D. Scheel, and J. Schumacher, Nat. Commun. 9, 2118 (2018).
A. Frasson, M. Ender, S. Weiss, A. Pandey, J. Schumacher, and R. Westermann, Visual exploration of circulation rolls in convective heat flows, IEEE Pacific Visualization Symposium (PacificVis 2019), 202 (2019).
E. Fonda, A. Pandey, J. Schumacher, and K. R. Sreenivasan, Deep learning in turbulent convection networks, Proc. Natl. Acad. Sci. USA 116, 8667 (2019).
Dr. Najmeh Foroozani1, Dr. Dmitry Krasnov1, Dr. Ambrish Pandey1, Prof. Janet Scheel2, Prof. Jörg Schumacher1 (PI)
1Technische Universität Ilmenau (Germany)
2Occidental College, Los Angeles (USA)
Prof. Dr. Jörg Schumacher
Institut für Thermo- und Fluiddynamik
Technische Universität Ilmenau
Postfach 100565, D-98684 Ilmenau
NOTE: This report was first published in the book "High Performance Computing in Science and Engineering – Garching/Munich 2018".
LRZ project ID: pr62se