Massively-Parallel Molecular Dynamics Simulation of Fluids at Interfaces
Gauss Centre for Supercomputing e.V.



Principal Investigator:
Martin Thomas Horsch, Maximilian Kohns

Laboratory of Engineering Thermodynamics, Technische Universität Kaiserslautern

Local Project ID:
pr48te

HPC Platform used:
SuperMUC of LRZ

Date published:
November 2019


Molecular modelling and simulation is an established method for describing and predicting the thermodynamic properties of fluids. It is well suited to investigating phenomena on small length and time scales; often, however, scale-bridging series of simulations are needed to permit a reliable extrapolation from the nanoscale to the technically relevant length and time scales. The supercomputing project SPARLAMPE ("Scalable, Performant And Resilient Large-scale Applications of Molecular Process Engineering") examines interfacial properties of fluids, their contact with solid materials, interfacial fluctuations and finite-size effects, linear transport coefficients in the bulk and at interfaces and surfaces, as well as transport processes near and far from equilibrium. These phenomena are investigated by massively-parallel molecular dynamics (MD) simulation based on quantitatively reliable classical-mechanical force fields. The simulation results are combined to obtain an understanding of the complex processes undergone by cutting fluids during machining, in particular in the region of contact between the tool and the workpiece.

With efficiently parallelized MD codes, scale-bridging simulation approaches for systems containing up to a trillion molecules have become feasible in recent years. Here, the in-house code ls1 mardyn, which is developed in collaboration with multiple academic partners [1], is used alongside LAMMPS, externally developed free software.

Results and Methods

Twenty publications have appeared so far on the basis of the computational resources allocated by LRZ within the SPARLAMPE supercomputing project. The representative results briefly illustrated here concern the quantitatively accurate modelling of the vapour-liquid surface tension of real fluids [2], cf. Figure 1, the wetting of structured surfaces [3], cf. Figure 2, and the molecular simulation of the processes experienced by cutting fluids in nano-machining [4], cf. Figure 3.
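For context, the vapour-liquid surface tension of a planar liquid film is commonly evaluated in such simulations via the mechanical route, from the normal and tangential components of the pressure tensor (whether [2] uses this route or a thermodynamic one is not detailed here):

```latex
\gamma = \frac{1}{2} \int_{0}^{L_z} \left[ p_{\mathrm{N}}(z) - p_{\mathrm{T}}(z) \right] \mathrm{d}z
```

where the factor 1/2 accounts for the two vapour-liquid interfaces present in a periodic slab geometry.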

The present MD simulations were carried out with ls1 mardyn [1-3] as well as LAMMPS [4]; the boundary conditions mainly correspond to the canonical ensemble, i.e., to constant particle number N, volume V, and temperature T. Concerning computational requirements, there are four major types of simulation runs:

(1) Test runs with small systems, or production runs for small single-phase systems; supercomputing resources were not needed for this purpose, except for a very few test runs concerning the SuperMUC environment itself. A limited number of such simulations is always required.

(2) Scenarios in which on the order of 30 to 300 simulations need to be carried out with different model parameters or boundary conditions; the simulated systems are heterogeneous (which makes them computationally less trivial and requires a greater number of simulation time steps) and typically contain on the order of 30,000 to 300,000 molecules. The vapour-liquid surface tension simulations [2] and the three-phase simulations of sessile droplets on structured solid substrates [3] are of this type.

(3) Scenarios in which a small series of computationally intensive production runs needs to be carried out. Large systems, particularly if they involve fluid-solid contact, and even more so if the simulated scenarios are inherently dynamic, also require a large number of simulation time steps. This is the case for the MD simulations of nano-machining processes [4], cf. Fig. 3, which included five million interaction sites; although the simulation parameters were varied to a lesser extent than in the other scenarios, the simulations needed to be repeated a few times to permit an assessment of the validity and the uncertainty of the outcome.

(4) Scaling tests in the narrow sense, in which simulations are conducted with the main purpose of analysing the strong and/or weak scaling of a code for a particular application scenario on a particular platform. By design, these simulations typically cover the whole range of available scales, up to the entire cluster. Nonetheless, the resource requirements are limited, since only a few time steps are needed. No such results are shown here; however, in one such test on SuperMUC, conducted with resources allocated to the present supercomputing project, ls1 mardyn recently set a new MD world record in terms of system size [5].
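Strong scaling as discussed in (4) is usually quantified by speedup and parallel efficiency relative to a baseline node count. The following is a minimal sketch of such an analysis; the node counts and wall-clock times are hypothetical, not measurements from SuperMUC or ls1 mardyn:

```python
# Illustrative strong-scaling analysis from hypothetical wall-clock timings.
# S(p) = T(base) / T(p); E(p) = S(p) / (p / base).

timings = {  # nodes -> wall-clock time per simulation step in seconds (made up)
    1: 120.0,
    2: 61.5,
    4: 32.0,
    8: 17.5,
    16: 10.2,
}

base_nodes = min(timings)
base_time = timings[base_nodes]

for nodes, t in sorted(timings.items()):
    speedup = base_time / t                      # speedup relative to the baseline
    efficiency = speedup / (nodes / base_nodes)  # fraction of ideal linear scaling
    print(f"{nodes:3d} nodes: speedup {speedup:5.2f}, efficiency {efficiency:6.1%}")
```

Efficiency below 100% reflects communication and load-imbalance overhead; weak scaling is analysed analogously, but with the system size grown proportionally to the node count.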

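The canonical-ensemble MD runs described above can be sketched in a few lines. The following is a deliberately simplified, illustrative Python script (velocity Verlet with a crude velocity-rescaling thermostat, in reduced Lennard-Jones units); it is not the algorithm of ls1 mardyn or LAMMPS, and all parameters are hypothetical:

```python
import numpy as np

# Minimal NVT molecular dynamics sketch: Lennard-Jones fluid, periodic box,
# velocity Verlet integration, naive velocity rescaling as thermostat.
# All quantities are in reduced LJ units; parameters are illustrative only.

rng = np.random.default_rng(42)
N, L, T_target, dt = 64, 8.0, 1.0, 0.002  # particles, box edge, temperature, time step

# Start from a simple cubic lattice to avoid initial overlaps.
n = int(np.ceil(N ** (1 / 3)))
grid = np.array([(i, j, k) for i in range(n) for j in range(n) for k in range(n)])[:N]
pos = (grid + 0.5) * (L / n)
vel = rng.normal(0.0, np.sqrt(T_target), (N, 3))
vel -= vel.mean(axis=0)  # remove centre-of-mass drift

def forces(pos):
    """Lennard-Jones forces with minimum-image periodic boundaries (O(N^2))."""
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)                 # minimum image convention
    r2 = (d ** 2).sum(-1)
    np.fill_diagonal(r2, np.inf)             # no self-interaction
    inv6 = r2 ** -3
    f_scalar = 24.0 * (2.0 * inv6 ** 2 - inv6) / r2
    return (f_scalar[..., None] * d).sum(axis=1)

f = forces(pos)
for step in range(200):
    vel += 0.5 * dt * f                      # velocity Verlet: first half kick
    pos = (pos + dt * vel) % L               # drift with periodic wrap-around
    f = forces(pos)
    vel += 0.5 * dt * f                      # second half kick
    T_inst = (vel ** 2).sum() / (3 * N)      # instantaneous temperature
    vel *= np.sqrt(T_target / T_inst)        # crude rescaling enforces T = T_target

print(f"final instantaneous temperature: {(vel ** 2).sum() / (3 * N):.3f}")
```

Production codes such as ls1 mardyn replace the O(N^2) force loop with linked cells and spatial domain decomposition, which is what makes the trillion-molecule system sizes mentioned above reachable.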

Project team

Stefan Becker, Debdip Bhandary, Edder García, Hans Hasse, Michaela Heier, Kai Langenbach, Martin P. Lautenschlaeger, Steffen Seckler, Simon Stephan, Katrin Stöbener, Nikola Tchipev, Jadran Vrabec, Stephan Werth

Project partners

Scientific Computing in Computer Science, Technische Universität München

Thermodynamics and Thermal Separation Processes, Technische Universität Berlin

References and Links


[2] S. Werth, K. Stöbener, M. Horsch, and H. Hasse. 2017. Simultaneous description of bulk and interfacial properties of fluids by the Mie potential. Mol. Phys. 115, 9-12, 1017-1030.

[3] S. Becker, M. Kohns, H. M. Urbassek, M. Horsch, and H. Hasse. 2017. Static and dynamic wetting behavior of drops on impregnated structured walls by molecular dynamics simulation. J. Phys. Chem. C 121, 23, 12669-12683.

[4] M. P. Lautenschlaeger, S. Stephan, M. T. Horsch, B. Kirsch, J. C. Aurich, and H. Hasse. 2018. Effects of lubrication on friction and heat transfer in machining processes on the nanoscale: A molecular dynamics approach. Procedia CIRP 67, 296-301.

[5] N. Tchipev, S. Seckler, M. Heinen, J. Vrabec, F. Gratl, M. T. Horsch, M. Bernreuther, C. W. Glass, C. Niethammer, N. Hammer, B. Krischok, M. Resch, D. Kranzlmüller, H. Hasse, H.-J. Bungartz, and P. Neumann. 2019. TweTriS: Twenty trillion-atom simulation. Int. J. High Perform. Comput. Appl. 33, 838-854.

Scientific Contact

Jun. Prof. Dr.-Ing. Maximilian Kohns
Lehrstuhl für Thermodynamik (LTD)
Technische Universität Kaiserslautern
Erwin-Schrödinger-Straße 44, D-67663 Kaiserslautern
Email: maximilian.kohns [@]

LRZ project ID: pr48te

November 2019

Tags: LRZ Computational Fluid Dynamics University of Kaiserslautern TUM Technische Universität Berlin