I. Physikalisches Institut, Universität zu Köln
Local Project ID: pr62su
HPC Platform used:
SuperMUC of LRZ
Molecular clouds form out of the diffuse interstellar medium (ISM) within galactic disks and continuously accrete gas and interact with their surroundings as they evolve. Hence the evolution of turbulent, filamentary molecular clouds has to be modeled at the same time as the surrounding multiphase ISM. In the SILCC-ZOOM project, we simulate molecular cloud formation, the star formation within them, and their subsequent dispersal by stellar feedback on sub-parsec scales in 3D, AMR, MHD simulations with the FLASH code including self-gravity, radiative transfer, and a chemical network.
Stars form in molecular clouds (MCs), which are dense (mean number densities ≥ 100 cm⁻³) and cold (T ≈ 10 K) objects that form out of the warm, more diffuse interstellar medium in a galactic disk. MCs are short-lived, dynamically evolving, and highly turbulent. They are possibly assembled on time scales of a few to a few tens of millions of years and develop a filamentary substructure. These filaments fragment into dense cores, which become self-gravitating and form stars.
When averaged over a whole galactic disk, the star formation efficiency in molecular gas, i.e. the fraction of molecular gas that is converted into stars, is observed to be only of the order of ~1%, which implies a long depletion time scale for the molecular gas. This is in apparent disagreement with the rapid evolution of MCs towards star formation. The solution to this dilemma could be the effective dispersal of the parental MCs from within, by feedback from newly born stars. In particular, massive stars with masses larger than ~8 times the mass of the Sun provide highly energetic feedback to the surrounding medium in the form of ionizing radiation and the associated radiation pressure, stellar winds, and supernovae. This feedback has been proposed to quench star formation inside the MC and to efficiently limit the accretion of fresh gas onto the evolving MC.
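One common way to connect the two numbers is through the star formation efficiency per free-fall time: if only ~1% of the molecular gas turns into stars per free-fall time, the depletion time exceeds the cloud's dynamical time by a factor of ~100. A minimal back-of-the-envelope sketch (the density and efficiency values below are illustrative, not results of the project):

```python
import math

G = 6.674e-8    # gravitational constant [cm^3 g^-1 s^-2]
M_H = 1.67e-24  # hydrogen atom mass [g]
MYR = 3.156e13  # one megayear [s]

def free_fall_time(n, mu=2.8):
    """Free-fall time of gas with number density n [cm^-3] and mean
    molecular weight mu (mu = 2.8 for fully molecular gas)."""
    rho = mu * M_H * n
    return math.sqrt(3.0 * math.pi / (32.0 * G * rho))

n_mc = 100.0     # mean MC number density [cm^-3], as quoted in the text
eps_ff = 0.01    # ~1% of the gas converted into stars per free-fall time

t_ff = free_fall_time(n_mc) / MYR   # ~3 Myr
t_dep = t_ff / eps_ff               # ~300 Myr, i.e. ~100 dynamical times
```

With these numbers a cloud would survive for hundreds of free-fall times before running out of gas, whereas observed clouds evolve and disperse within a few dynamical times, which is the tension the feedback scenario resolves.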
Overall, the early evolution of an individual MC and its star formation properties are closely connected to the properties of the surrounding interstellar medium (ISM). Hence, MC formation, evolution, star formation, and feedback should be modeled simultaneously and within the galactic environment. Due to the high physical complexity and the large dynamic range in density and spatial scale involved, this is a computationally very challenging task.
In the SILCC-ZOOM Gauss project we modeled the formation of dense and cold molecular clouds from the multi-phase interstellar medium (ISM) and their subsequent dispersal by stellar feedback on sub-parsec scales. To this end, we carried out novel three-dimensional (3D), adaptive-mesh-refinement (AMR), magneto-hydrodynamical (MHD), galactic zoom-in simulations with the MPI-parallel, finite-volume code FLASH.
Results and Methods
The calculations were carried out with our version of FLASH 4.3, which includes a chemical network to treat heating by a background or a direct radiation field, radiative cooling, and molecule formation [4 and references therein; based on e.g. 5], a tree-based method for self-gravity and radiative transfer, and sink particles with a star formation sub-grid model which follows the major evolutionary phases of massive stars [2,7], in particular their wind and radiation output. The initial conditions are based on simulations carried out in the SILCC project [1,2,4,7,8] under Gauss Call No. 7, which have been shown to reproduce a realistic multi-phase ISM with reasonable MC properties.
In different galactic environments simulated in SILCC, we identify the regions where MCs are about to form and zoom in on them using the AMR technique. Thus, we locally allow for a high spatial resolution (< 0.1 parsec) within a region with a side length of ~100 parsec. Throughout the zoom-in calculation, we continue to follow the full galactic environment at lower resolution. The MCs may therefore accrete gas from the surrounding medium and can be heated and stirred by nearby supernova explosions [9, 10]. Typically, a zoom-in simulation took about 1 million CPU-hours.
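The zoom-in logic can be illustrated with a toy refinement rule. All names and numbers below are hypothetical and do not reproduce the actual FLASH/SILCC-ZOOM refinement criteria; they only show how a spatially restricted refinement cap yields sub-0.1 pc cells inside the zoom region while the rest of the disk stays coarse:

```python
def allowed_max_level(pos_pc, zoom_center_pc, zoom_half_size_pc=50.0,
                      base_level=5, zoom_level=13):
    """Maximum AMR refinement level permitted at a given position.

    Inside the zoom-in box (side length 2 * zoom_half_size_pc, i.e.
    ~100 pc) the grid may refine down to `zoom_level`; everywhere
    else it is capped at the coarse `base_level` used to follow the
    surrounding galactic disk."""
    inside = all(abs(p - c) <= zoom_half_size_pc
                 for p, c in zip(pos_pc, zoom_center_pc))
    return zoom_level if inside else base_level

def cell_size_pc(level, box_size_pc=500.0, cells_per_block=8):
    """Cell size at a given level for a block-structured grid that
    doubles its resolution with every refinement level."""
    return box_size_pc / (cells_per_block * 2 ** (level - 1))
```

With these illustrative numbers, level 13 corresponds to cell sizes of ~0.015 pc inside the zoom region, comfortably below the 0.1 pc target, while the ambient medium is followed at ~4 pc resolution.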
The simulations naturally develop the observed internal turbulent and filamentary MC substructure (see Fig. 1 for an example of a MC formed in one of the zoom-in runs with solar-neighborhood properties of the ambient multi-phase ISM). The simulations allow us to determine which physical processes, e.g. turbulence, gravity, thermal instability, and/or magnetic fields, imprint the filaments. For instance, we find clear striations off the formed filaments in MHD runs, which are in agreement with recent observations of the Taurus MC (Heyer et al. 2016) and stem from magnetosonic waves travelling through the cloud. These features are completely absent in purely hydrodynamical simulations.
On the resolution scale of ~0.1 parsec, sink particles are introduced. These include a sub-grid star cluster model and track the formation of individual massive stars and their associated feedback. By switching on each feedback process, i.e. ionizing radiation, radiation pressure, and stellar winds, individually and in combination, we carefully explore their relative impact on the ambient medium. The impact is quantified in terms of the energy and momentum deposited in the MC. We confirm that radiative feedback is dominant in the dense and cold ISM (as has been shown in many previous works), but we can clearly show that a warm ambient medium (T > a few 1,000 K) is instead dominated by stellar wind feedback, because the radiation fails to couple to the gas in this case.
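The gist of sink-particle creation can be sketched as a set of local checks on the gas. This is a simplified toy version in the spirit of widely used criteria; the actual FLASH sub-grid model applies further checks and then follows individual massive stars formed from the accreted mass:

```python
def can_form_sink(density, density_threshold, velocity_divergence,
                  e_grav, e_kin, e_therm):
    """Toy sink-formation check: the candidate gas must exceed a
    density threshold, sit in a converging flow, and be gravitationally
    bound (negative total energy).

    e_grav is negative by convention; e_kin and e_therm are positive.
    All values refer to the control volume around the candidate cell."""
    dense = density >= density_threshold
    converging = velocity_divergence < 0.0
    bound = (e_grav + e_kin + e_therm) < 0.0
    return dense and converging and bound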
In realistic MCs as modeled in the SILCC-ZOOM simulations, massive stars are embedded within cold gas in the young star-forming cloud for the first million years. Thereafter, the stars break out of their birthplaces and start to dissolve the MC. Moreover, some of the massive stars escape as runaway stars. Therefore, in realistic MC environments, the initial phases of star formation are dominated by radiative feedback, while the stellar winds become influential once they can leak out of the dense parts of the cloud. In Fig. 2 we show one MC at an evolutionary stage where massive stars have already formed and are dispersing the cloud from the inside.
In order to carry out these simulations, we developed a backward radiative transfer scheme (TreeRay), which is an extension of the tree solver for self-gravity and diffuse radiation presented in Wünsch et al. (2018). The novel method has the great advantage that the amount of computational work does not scale with the number of sources, as is typical for most radiative transfer schemes. It also parallelizes very well. However, we are somewhat limited by the required amount of memory. Therefore, the optimal choice for us was to run each simulation on up to 2,000 cores and to run several simulations in parallel. In total, we used 67 million CPU-hours for the SILCC-ZOOM project. The overall storage needed for the time-dependent 3D data was of the order of 100 TB.
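The reason the per-cell cost can be nearly independent of the number of sources is the same tree aggregation used in Barnes-Hut gravity solvers: distant sources are merged into tree nodes and treated as single emitters. A heavily simplified sketch of this idea (geometric dilution only; TreeRay additionally casts rays over a set of directions and treats absorption, which this toy omits):

```python
import math

class SourceNode:
    """Node of a pre-built source tree: a leaf holds one point source,
    an internal node aggregates the total luminosity of its children
    at their luminosity-weighted centre."""
    def __init__(self, pos, lum, size, children=()):
        self.pos = pos            # (x, y, z) centre of luminosity
        self.lum = lum            # total luminosity of the node
        self.size = size          # spatial extent of the node
        self.children = children  # empty tuple for a leaf

def flux_at(target, node, theta=0.5):
    """Optically thin flux at `target` via Barnes-Hut-style traversal:
    a node whose angular size (size / distance) falls below the opening
    angle `theta` contributes as a single aggregated source, so the work
    per target cell grows only logarithmically with the source count."""
    d = math.dist(target, node.pos)
    if not node.children or (d > 0.0 and node.size / d < theta):
        return node.lum / (4.0 * math.pi * d * d) if d > 0.0 else 0.0
    return sum(flux_at(target, child, theta) for child in node.children)
```

For a single leaf of luminosity 4π at the origin, the flux at distance 10 evaluates to the expected inverse-square value of 0.01; two nearby leaves seen from far away are served by their common parent node in a single evaluation.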
On-going Research and Outlook
The simulations we performed are currently world-leading and would not have been possible without the computational resources available to us on SuperMUC. Multiple research papers are currently in preparation.
We are working on a hybrid parallelization scheme for our simulation code to reduce the current memory requirement. We now aim to (1) simulate larger pieces of the galactic disk at high (sub-parsec) resolution and (2) zoom in even further, to scales of a few astronomical units within some clouds, in order to follow the star formation process, the fragmentation of the gas into individual stars, the formation of protostellar disks from self-consistent initial conditions, and protostellar feedback in the form of self-consistently driven jets and outflows, all of which deserve detailed studies.
Daniel Seifried, Sebastian Haid, Thorsten Naab
Max-Planck-Institut für Astrophysik, Garching b. München.
References and Links
 Gatto, Walch + SILCC (2017), MNRAS, 466, 1903
 Walch + SILCC (2015), MNRAS, 454, 238
 Glover & Mac Low (2011), MNRAS, 412, 337
 Wünsch, Walch, et al. (2018), MNRAS, 475, 3393
 Peters + SILCC (2017), MNRAS, 466, 3293
 Girichidis, Walch + SILCC (2016), MNRAS, 456, 3432
 Seifried, Walch, et al. (2017), MNRAS, 472, 4797
 Seifried, Walch, et al. (2018), ApJ, 855, 81
 Heyer et al. (2016), MNRAS, 461, 3918
 Haid, Walch, Seifried, et al. (2018a), submitted to MNRAS
 Haid, Walch, Seifried, et al. (2018b), submitted to MNRAS
Prof. Dr. Stefanie Walch-Gassner
I. Physikalisches Institut
Universität zu Köln
Zülpicher Straße 77, D-50937 Köln (Germany)
e-mail: walch@ph1.uni-koeln.de
NOTE: This report was first published in the book "High Performance Computing in Science and Engineering – Garching/Munich 2018".
LRZ project ID: pr62su