Multiscale Simulations of Fluid-Structure-Acoustic Interaction

Principal Investigator:
Sabine Roller

University of Siegen, Institute of Simulation Techniques and Scientific Computing (Germany)

Local Project ID:

HPC Platform used:
SuperMUC (LRZ)

Date published:

This project was part of the ExaFSA project, which investigates the possibility of exploiting high-performance computing systems for integrated simulations of all parts contributing to noise generation in flows around obstacles. Such computations are challenging, as they involve the interaction of various physical effects on different scales. In this context, the compute time granted on SuperMUC for this project was used in particular to investigate the coupling of the flow with a large acoustic domain, using individual discretization methods for each.


The ExaFSA project [1] aims at the integrated simulation of both the generation of acoustic waves in flows around obstacles and their propagation. The motivation behind this project is to include the interaction of structural movement with the fluid motion, in addition to simulating the sound waves generated by this configuration. A basic sketch of the overall problem is shown in Figure 1.

Including the various physics and different scales involved in this overall problem has become possible through large computing facilities like LRZ, which provides access to its SuperMUC HPC platform. Within this research, the current project focuses on computations and investigations of the coupling between the fluid domain and the acoustic wave propagation across larger distances. These two parts form an essential pair in the complete simulation setup, as they typically consume the largest part of the total compute time. The computational demands of the fluid domain stem from its need for high resolution and the nonlinearity of the problem, while the acoustic propagation demands compute power due to the large domain that needs to be covered by the wave propagation.

The structural part of the simulation imposes further restrictions on the fluid domain, especially with respect to the temporal resolution. As a result, even though the fluid-structure interaction increases the computational costs drastically, those additional computations are still mostly centered on the fluid computation. Therefore, the first project within the context of these broader research goals concentrates on the fluid motion and the acoustic wave propagation.

Results and Methods

The fluid and acoustic domains require different resolution scales. While we are dealing with high energies and small length scales in the fluid domain, we face large length scales and low-energy fluctuations in the acoustic waves. A monolithic solution, with a single resolution prescribed by the maximal requirements of either domain, is not feasible from a computational point of view. Instead, the two domains need to be separated to allow for a proper discretization with respect to each regime involved.

To enable a direct aeroacoustic simulation with the two-way interaction between the fluid and acoustic domains, the ExaFSA project uses a coupled approach, where the domains are spatially separated, and each part can be computed with different methods and discretization.
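The essence of such a partitioned coupling can be sketched with a toy example: two 1D advection solvers on adjacent subdomains that exchange the interface value once per time step. This is purely illustrative of the coupling idea, not the actual ExaFSA/preCICE setup; all names and parameters are made up.

```python
import numpy as np

def upwind_step(u, inflow, c, dt, dx):
    # One explicit first-order upwind step for u_t + c u_x = 0 (c > 0);
    # `inflow` is the value entering from the left neighbouring domain.
    unew = np.empty_like(u)
    unew[0] = u[0] - c * dt / dx * (u[0] - inflow)
    unew[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])
    return unew

x = np.linspace(0.0, 1.0, 100)
dx, c = x[1] - x[0], 1.0
dt = 0.5 * dx / c                          # CFL-stable time step

left = np.exp(-((x - 0.5) / 0.1) ** 2)     # pulse in the first subdomain
right = np.zeros_like(x)                   # second subdomain starts at rest

for _ in range(150):
    interface = left[-1]                   # exchange the interface value
    left = upwind_step(left, 0.0, c, dt, dx)
    right = upwind_step(right, interface, c, dt, dx)

print(f"peak in right domain: {right.max():.2f}")
```

After 150 steps the pulse has largely crossed from the first subdomain into the second, showing that the per-step interface exchange transports the solution across the domain boundary. A real coupled setup additionally involves differing meshes, time-step sizes, and data mapping at the interface, which is exactly what a coupling library handles.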

Figure 2 illustrates the generation of sound by a fluid in motion, here a jet from a nozzle. In the ExaFSA setting, the noise-generating fluid motion can usually be found in a comparably small area around the structure. Thus, a spatial separation arises naturally for these scenarios. In Figure 2, the jet with its turbulent fluid motion can be clearly separated from the area where pressure waves are merely transported. The idea in the coupled simulation is to simplify the governing equations step by step, as early along the propagation path as the physics permit. Starting with the full compressible Navier-Stokes equations, including friction, the first simplification is to neglect the viscous terms once there are no more shear flows. This results in the nonlinear, inviscid Euler equations. Finally, when the variation in the state becomes sufficiently small, the nonlinearities can be neglected and only the linearized Euler equations need to be solved.
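In symbols, this hierarchy of model simplifications can be sketched as follows, with $U$ the vector of conserved variables, $F$ the inviscid flux, and $F_v$ the viscous flux (the notation here is illustrative, not taken from the project's publications):

```latex
% Full compressible Navier-Stokes equations (with friction):
\partial_t U + \nabla \cdot F(U) = \nabla \cdot F_v(U, \nabla U)

% Neglecting the viscous terms once there are no more shear flows
% yields the nonlinear, inviscid Euler equations:
\partial_t U + \nabla \cdot F(U) = 0

% Linearizing about a constant background state \bar{U}, with
% U = \bar{U} + U' and small perturbations U', gives the
% linearized Euler equations:
\partial_t U' + A(\bar{U}) \cdot \nabla U' = 0
```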

For the surface coupling between the individual domains, we use the general library preCICE [2]. This tool allows the coupling of various solvers via a generic interface description, which lets us combine dedicated solvers for each physical domain. For the current investigation, we first utilize the same solver for both the fluid and the acoustic domain, albeit with different discretizations. Our solver, Ateles [3], is a Discontinuous Galerkin solver with a modal basis. It is especially well suited for linear problems like acoustic wave propagation: a high spatial order can be employed, which reduces the memory required to represent the solution accurately.
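Why a high spatial order pays off for smooth wave propagation can be illustrated by comparing a single high-order modal (Legendre) element against a piecewise-linear representation with the same number of degrees of freedom. This is a generic sketch, not Ateles code, and the parameters are made up:

```python
import numpy as np
from numpy.polynomial import legendre

f = lambda x: np.sin(2 * np.pi * x)        # smooth "acoustic" profile
x = np.linspace(-1.0, 1.0, 2001)           # fine evaluation grid

# (a) one element, modal Legendre basis of degree 12 -> 13 DOFs
coeffs = legendre.legfit(x, f(x), deg=12)  # least-squares modal projection
err_modal = np.max(np.abs(legendre.legval(x, coeffs) - f(x)))

# (b) piecewise-linear interpolation on 13 nodes -> also 13 DOFs
nodes = np.linspace(-1.0, 1.0, 13)
err_linear = np.max(np.abs(np.interp(x, nodes, f(nodes)) - f(x)))

print(f"modal (deg 12): {err_modal:.1e}, piecewise linear: {err_linear:.1e}")
```

With the same number of degrees of freedom, the high-order modal representation is orders of magnitude more accurate for this smooth profile, which is exactly the regime of acoustic wave propagation.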

For nonlinear equations, however, high orders drastically increase the computational effort in this scheme. Nevertheless, in this project we also employ the solver for the fluid domain, though with a low spatial order. Looking forward, we plan to replace the solver in the fluid domain.

During this project, we computed several jet configurations and improved the scalability of the approach. The communication algorithm in preCICE was changed, and load balancing was introduced to account for the increased computational effort of the interpolation in the high-order scheme.

A large part of the project was concerned with the generation of a reference solution on a large fluid domain without coupling. We then investigated the feasibility and accuracy of the coupled simulation, which showed good agreement with the produced reference and, after the mentioned improvements, also a good runtime. We used more than 20 million core hours on the various investigations, with around a third of them spent on the production of the reference for comparison.

In most runs, we used 16,384 cores per simulation, as this proved to be the most feasible queueing option. The resulting data is stored in one file per point in time and per domain. Typically, 100 points in time were used for each simulation, resulting in around 200 files; each contains the complete solution in space and is also suitable for restarting the simulation. Tracking specific probes in the domain with a higher temporal resolution produces a few additional, much smaller files. In the fluid domain, the solution requires around 10 GB of disk space for a mesh with around 30 million elements. In the acoustic domain, with its high scheme order, fewer degrees of freedom are required, and the solution data fits into files of less than half that size. In total, each simulation produced roughly 1.5 TB of data, and over the course of the project we utilized more than 10 TB of disk space for the various investigations.
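The reported snapshot size in the fluid domain is consistent with a simple back-of-envelope estimate. The per-element degree-of-freedom count below is an assumption made for illustration, not a value taken from the actual Ateles setup:

```python
# Back-of-envelope check of the reported ~10 GB per fluid-domain snapshot.
elements = 30e6       # mesh elements in the fluid domain (from the report)
variables = 5         # conserved variables of 3D compressible flow
modes = 8             # assumed modal DOFs per element and variable (low order)
bytes_per_dof = 8     # double precision

gigabytes = elements * variables * modes * bytes_per_dof / 1e9
print(f"{gigabytes:.1f} GB per snapshot")  # ~9.6 GB, close to the reported 10 GB
```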

On-going Research / Outlook

The just finalized phase 1 of the ExaFSA project focused on setting up the framework and gaining experience with the general quality of the coupling algorithms, especially the data mapping between the processes. Future work focuses on the simulation of real-world fluid-structure-acoustic interaction, to bring new insights into such applications and to enable computational optimization of, for example, the sound design of aircraft or wind turbines.

We can now build on the foundation and improvements established during this initial project. The detailed simulation of the problem, including also the fluid-structure interaction, further increases the computational costs. The follow-up work on these large-scale investigations has already started in a new project.

References and Links

[2] H.-J. Bungartz, F. Lindner, B. Gatzhammer, M. Mehl, K. Scheufele, A. Shukaev, and B. Uekermann. preCICE -- A Fully Parallel Library for Multi-Physics Surface Coupling. Computers and Fluids, Elsevier, 2015.
[3] J. Zudrop. Efficient Numerical Methods for Fluid- and Electro-dynamics on Massively Parallel Systems. Ph.D. dissertation, RWTH Aachen University, Germany, 2015.

Scientific Team:

Principal Investigator: Sabine Roller1
Researchers: Verena Krupp1, Harald Klimach2
Project partners: Benjamin Uekermann2

1 Institute of Simulation Techniques and Scientific Computing, University of Siegen
2 Department of Scientific Computing (Informatics), TU München

Scientific Contact:

Prof. Dr.-Ing. Sabine Roller
Simulationstechnik und Wissenschaftliches Rechnen (STS)
Universität Siegen
Adolf-Reichwein-Straße 2, D-57068 Siegen (Germany) 
e-mail: sabine.roller[at]

Tags: LRZ Universität Siegen CSE