Towards Large-Eddy Simulation of Primary Atomization of Liquid Jets

Principal Investigator:
Markus Klein and Sebastian Ketterl

Institute of Mathematics and Applied Computing, Bundeswehr University Munich

Local Project ID:

HPC Platform used:
SuperMUC of LRZ

Date published:


Atomization describes the disintegration of a liquid core into a large number of droplets. Predictive computational methods are desired to improve the design of industrial devices. While models for secondary atomization (drops or small liquid structures collapsing into smaller drops) are well established, primary breakup remains the major deficiency in predicting atomization with numerical tools. Especially in the vicinity of the liquid core, where experimental access is limited, numerical simulations help to gain insight into the mechanisms of turbulent liquid jet breakup.

Progress in numerical methods allows computations of primary breakup by means of Direct Numerical Simulation (DNS), at least for academic cases at moderate Reynolds and Weber numbers. The wide range of time and length scales results in excessive computational costs, and DNS of industrial devices will remain out of reach in the near future. Due to its ability to resolve the large-scale structures, Large-Eddy Simulation (LES) provides a good compromise between accuracy and computational effort. However, LES of multiphase flows with a sharp phase interface remains to date a largely unexplored area. Because of the limited spatial resolution, not only turbulent but also interfacial structures remain at the subgrid scale. The complex coupling between turbulence and the phase interface at the unresolved scales needs to be modelled. The development of next-generation models for large-scale multiphase flows, using input from DNS results, is one of the most urgent challenges.

The project primarily aimed at the generation of a DNS database of multiphase primary atomization. The DNS data served as a starting point for the development of an LES framework. On the one hand, the fully resolved DNS flow fields were used for a-priori subgrid scale analysis; on the other hand, DNS flow statistics are needed for the a-posteriori evaluation of the developed LES code. To avoid closure models that are only valid for one specific configuration, the DNS database covers a parameter study. Finally, within this project a new method to generate turbulent inflow data has been developed in order to allow realistic DNS computations.

Results and Methods

The one-fluid formulation of the incompressible isothermal Navier-Stokes equations is solved with the open source code “PARIS Simulator” [1]. The phase interface is advected by a geometrical volume-of-fluid method. Numerical methods and algorithms are explained in Tryggvason et al. [2]. The computational grid typically consisted of approximately 1.3 billion cells, and the simulations were run on 9,216 cores. The highest grid resolution comprised 2.1 billion cells run on 16,384 cores. In total, the project consumed approximately 18 million core hours. All simulations were run on phase 1 of SuperMUC.
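For reference, the one-fluid formulation can be written in the standard form found in Tryggvason et al. [2] (notation assumed here): a single velocity field holds in both phases, surface tension enters as a singular force at the interface, and the liquid volume fraction f is advected with the flow:

```latex
\nabla \cdot \mathbf{u} = 0, \qquad
\rho \left( \frac{\partial \mathbf{u}}{\partial t}
  + \mathbf{u} \cdot \nabla \mathbf{u} \right)
  = - \nabla p
  + \nabla \cdot \left[ \mu \left( \nabla \mathbf{u} + \nabla \mathbf{u}^{T} \right) \right]
  + \sigma \kappa \, \mathbf{n} \, \delta_{S}, \qquad
\frac{\partial f}{\partial t} + \mathbf{u} \cdot \nabla f = 0,
```

where σ is the surface tension coefficient, κ the interface curvature, n the interface normal, δ_S a Dirac distribution concentrated on the interface, and the density ρ and viscosity μ take their liquid or gas values according to f.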

Round and plane jets were analyzed. The Reynolds and Weber numbers were varied in the ranges Re = 2,000–10,000 and We = 2,000–5,000, respectively, and the influence of the density and viscosity ratios between the gas and the liquid phase was investigated. Material parameters were chosen to represent a diesel injection. Figure 1 shows the primary breakup of a round jet. After turbulent injection, the jet immediately starts wrinkling; these corrugations grow, ligaments are stretched, and droplets are formed.
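As a point of reference, both non-dimensional groups are commonly built from the liquid properties, the injection velocity U and the nozzle diameter D (the exact convention of the study may differ):

```latex
\mathrm{Re} = \frac{\rho_l \, U \, D}{\mu_l}, \qquad
\mathrm{We} = \frac{\rho_l \, U^{2} \, D}{\sigma}.
```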

In order to collect a sufficient number of independent samples to compile flow statistics, 15 flow-through times based on the centerline velocity were computed. Figure 2 shows exemplary statistics of jet breakup: the axial evolution of the jet half width and the centerline velocity are plotted on the left, and velocity fluctuations in the lateral direction at different axial positions are shown on the right.
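For orientation, a flow-through time is simply the time the flow needs to traverse the computational domain; with L_x the streamwise domain length and U_c the centerline velocity (symbols assumed here),

```latex
t_{\mathrm{ft}} = \frac{L_x}{U_c},
```

so statistics were accumulated over 15 such intervals to obtain approximately independent samples.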

The DNS flow field has been used for the development of LES models by means of a-priori analysis. A-priori analysis of the fully resolved flow field allows the identification of the most influential subgrid scale terms and provides helpful knowledge for modeling small-scale effects. An order-of-magnitude study allowed a ranking of the subgrid scale terms by relative importance. Additionally, the impact of varying flow quantities, e.g. density ratio, Reynolds and Weber number, on the subgrid scale could be identified.

For the most important subgrid scale terms, closure models have been proposed. A-priori analysis allows the assessment of closure models with respect to explicitly filtered DNS data, and the accuracy of the closure models has been extensively studied [3]. Existing modeling ideas from single-phase flow, combustion and wall modeling have been transferred to multiphase flow. The detailed flow data from the DNS computations enabled the development of two new closure models, for the subgrid scale stress and for the scalar flux [3]. The new models performed as well as or better than a variety of existing models that were tested.
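In standard LES notation, with the overbar denoting the spatial filter, the two modelled terms take the generic form (the common single-phase-style decomposition, assumed here for illustration):

```latex
\tau_{ij} = \overline{u_i u_j} - \bar{u}_i \, \bar{u}_j
\quad \text{(subgrid stress)}, \qquad
q_i = \overline{u_i f} - \bar{u}_i \, \bar{f}
\quad \text{(subgrid flux of the volume fraction } f\text{)},
```

and a-priori testing evaluates these terms directly from explicitly filtered DNS fields and compares them with the model predictions.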

A crucial issue for the successful numerical prediction of primary breakup is the prescription of realistic turbulent inflow at the injection nozzle. DNS and LES of spatially inhomogeneous flows depend strongly on the turbulent inflow boundary conditions: realistic coherent structures need to be prescribed to avoid the immediate damping of random velocity fluctuations. A new turbulent inflow generation method based on an auxiliary simulation of forced turbulence in a box has been developed [4]. The new methodology combines the flexibility of synthetic turbulence generation with the accuracy of precursor simulation methods. In contrast to most auxiliary simulations, the new approach provides full control over the turbulence properties, while the computational costs remain reasonable. The lack of physical information and the artificiality associated with pseudo-turbulence methods are overcome, since the inflow data stems from a solution of the Navier-Stokes equations. The generated velocity fluctuations are divergence-free by construction and exhibit the well-known non-Gaussian characteristics of turbulence.
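The precursor idea behind such auxiliary-box methods can be illustrated with a minimal sketch: sweep through a periodic box of turbulence with a convection velocity and hand successive cross-sectional planes to the main simulation as inflow data (Taylor's frozen-turbulence hypothesis). The function and all parameter names below are assumptions for illustration only, not the band-width filtered forcing scheme of Ref. [4]:

```python
import numpy as np

def inflow_planes(u_box, u_conv, dx, dt, n_steps):
    """Yield successive y-z inflow planes from a periodic velocity field.

    u_box   : ndarray (3, nx, ny, nz), velocity components in a periodic box
    u_conv  : mean convection velocity used to sweep through the box
    dx, dt  : streamwise grid spacing and inflow time step
    n_steps : number of inflow planes to generate
    """
    nx = u_box.shape[1]
    for n in range(n_steps):
        # fractional streamwise position, with periodic wrap-around
        x = (u_conv * n * dt) / dx
        i0 = int(np.floor(x)) % nx
        i1 = (i0 + 1) % nx
        w = x - np.floor(x)
        # linear interpolation between the two neighbouring grid planes
        yield (1.0 - w) * u_box[:, i0] + w * u_box[:, i1]

# usage with a tiny synthetic "box" (random data standing in for turbulence)
rng = np.random.default_rng(0)
box = rng.standard_normal((3, 8, 4, 4))
planes = list(inflow_planes(box, u_conv=1.0, dx=1.0, dt=0.5, n_steps=4))
```

In the actual method, the box turbulence is sustained by a forcing term, so that arbitrarily long, statistically stationary inflow signals with prescribed turbulence properties can be produced.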

On-going Research and Outlook

The project aims at the establishment of an LES solver for multiphase flow in order to predict the breakup of a liquid jet. The closure models developed and assessed a priori are implemented in the LES code and are further analyzed by a-posteriori LES computations. The LES framework is evaluated by comparing first- and second-order statistics as well as droplet size distributions with the high-resolution DNS results; first results have been presented in [5].

Due to the lack of a Kolmogorov-scale equivalent for interfacial structures, the resolution demands for multiphase DNS are still a topic of current research. A grid convergence study of atomization showed no convergence of the droplet probability density distribution, even for grids refined far beyond the Kolmogorov scale. A follow-up project is planned to develop a mesh resolution criterion for multiphase DNS, equivalent to the Kolmogorov scale in single-phase flows.


[1] Y. Ling, S. Zaleski, R. Scardovelli. 2015. Multiscale simulation of atomization with small droplets represented by a lagrangian point-particle model. Int J Multiphase Flow. 76, 122–143.

[2] G. Tryggvason, R. Scardovelli, S. Zaleski. 2011. Direct numerical simulations of gas–liquid multiphase flows. Cambridge University Press.

[3] S. Ketterl, M. Klein. 2018. A-priori assessment of subgrid scale models for large-eddy simulation for multiphase primary breakup. Comput Fluids. 165, 64–77.

[4] S. Ketterl, M. Klein. 2018. A band-width filtered forcing based generation of turbulent inflow data and its application to primary breakup of liquid jets. Flow Turb Comb.

[5] S. Ketterl, M. Klein. 2017. A-priori and a-posteriori assessment of LES subgrid models for liquid jet atomization. Tenth International Symposium on Turbulence and Shear Flow Phenomena, Chicago

Scientific Contact:

Univ.-Prof. Dr.-Ing. (habil) Markus Klein
Fakultät für Luft- und Raumfahrttechnik
Institute of Mathematics and Applied Computing
Universität der Bundeswehr München
Werner-Heisenberg-Weg 39, D-85577 Neubiberg (Germany)
e-mail: markus.klein [@]

NOTE: This report was first published in the book "High Performance Computing in Science and Engineering – Garching/Munich 2018":

LRZ project ID: pr48no

January 2019

Tags: LRZ CSE Universität der Bundeswehr