Project Vlasiator: Global Hybrid-Vlasov Simulation for Space Weather
Principal Investigator:
Minna Palmroth
Affiliation:
Finnish Meteorological Institute, Helsinki
Local Project ID:
PP12061111
HPC Platform used:
Hermit of HLRS
Date published:
The HPC resources of HLRS Stuttgart enabled the world's first global simulations of near-Earth space using a hybrid-Vlasov approach at the highest resolutions to date.
The domain in which space weather phenomena occur is enormous by the standards of computer simulation: an accurate, self-consistent simulation needs to cover part of the solar wind, the entire near-Earth region, and preferably also couple to the ionized upper atmosphere, the ionosphere. The space weather domain is also vastly complex: physical phenomena occur on spatial scales from a few kilometres to hundreds of thousands of kilometres, and on temporal scales from milliseconds to years. This places demanding requirements on computer simulations, which need to be accurate on both small and large scales and to cover at least a few hours of simulated time, preferably days or weeks.
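To make these demands concrete, consider a rough back-of-envelope estimate. In a hybrid-Vlasov model the ion distribution function lives in up to six dimensions (three spatial plus three velocity coordinates), so the cost grows far beyond that of ordinary fluid simulations. The short Python sketch below illustrates the scaling; the grid dimensions and byte counts are illustrative assumptions, not the parameters of the actual Vlasiator production runs.

# Rough, illustrative estimate of the phase-space size of a global
# hybrid-Vlasov run. All numbers are assumptions for illustration,
# not the actual Vlasiator run parameters.

n_spatial = 1000 * 1000  # e.g. a 1000 x 1000 cell spatial plane
n_velocity = 50 ** 3     # e.g. a 50^3 velocity grid in every spatial cell
bytes_per_value = 4      # one single-precision value of f per phase-space cell

total_cells = n_spatial * n_velocity
memory_tb = total_cells * bytes_per_value / 1e12

print(f"phase-space cells:  {total_cells:.2e}")  # ~1.3e+11 cells
print(f"memory for f alone: {memory_tb:.1f} TB") # ~0.5 TB, before any solver state

Doubling the spatial resolution in just two dimensions quadruples this figure, which is one reason memory, rather than floating-point speed, dominates the cost of such runs.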
Prior to this project, which was made possible through the Partnership for Advanced Computing in Europe (PRACE), this particular approach had been used to investigate space weather processes only in local simulations covering a small region of space. The main novelty of this project, carried out by a team of scientists from the Finnish Meteorological Institute, Helsinki, was therefore to apply the code to a large simulation volume, a task so massive that many thought it impossible. The HPC resources of HLRS Stuttgart, made available through PRACE, enabled the world's first global runs of near-Earth space using a hybrid-Vlasov approach at the highest resolutions so far. While the analysis of the results is still ongoing, the first results establish the code Vlasiator as the world's new benchmark among global kinetic hybrid-Vlasov codes. Vlasiator's results show rich plasma phenomena that are much more complicated than previously thought. Many phenomena visible in the runs had previously been observed only by local spacecraft measurements, and the mechanisms explaining them had not been placed in a larger context.
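For readers unfamiliar with the approach: a hybrid-Vlasov code advances the ion velocity distribution function f(r, v, t) directly on a grid, with electrons treated as a charge-neutralizing fluid. The core equation, given here in its standard textbook form (the specifics of Vlasiator's discretization and field solver are beyond the scope of this article), is the Vlasov equation

\[
\frac{\partial f}{\partial t}
+ \mathbf{v}\cdot\nabla_{\mathbf{r}} f
+ \frac{q}{m}\left(\mathbf{E} + \mathbf{v}\times\mathbf{B}\right)\cdot\nabla_{\mathbf{v}} f = 0,
\]

where q and m are the ion charge and mass and E and B are the electromagnetic fields. Because f is stored on a grid rather than sampled with particles, the resulting distribution functions are free of the statistical noise inherent to particle-in-cell methods.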
As the solar wind impinges on the magnetosphere, it forms a shock, much as a boat moving through water pushes a bow wave ahead of it (see Figure 1). Charged solar wind particles are energized by the shock and reflected back into the solar wind, generating the wave patterns of the foreshock region. The particles that are not reflected make up a region of hot plasma in front of the Earth's magnetosphere known as the magnetosheath. The researchers investigated the plasma processes within the foreshock and magnetosheath, and found that Vlasiator reproduces the key features of solar wind–magnetosphere interactions. The characteristics of the backstreaming ion populations and the associated electromagnetic waves agree with the ion velocity distribution functions and compressional magnetosonic waves typically observed in the Earth's ion foreshock region. The most striking difference with respect to other kinetic simulations is that the Vlasiator velocity distribution functions appear as noiseless, uniformly discretized functions similar to those seen in experimental data.
Figure 1: Global plasma density in an MHD simulation (left, representing the previous state of the art in global simulations) and in Vlasiator (center, the new benchmark in global simulations). Color-coding shows plasma density, with blue indicating sparse plasma and red dense, shocked plasma. The solar wind flows in from the right of the figure, and the Earth is the small blue dot within the black region. Earth's magnetic field deflects and shocks the solar wind flow much as a rock deflects the flow of a river.
Copyright: Finnish Meteorological Institute, Helsinki, Finland (FMI)

The current simulation represents the first use of a hybrid-Vlasov scheme for global magnetospheric simulations, and thus no attempt was made to reach the ion gyroscales or the ion inertial scales in ordinary space. Such attempts would also require the implementation of a generalized Ohm's law with Hall and electron pressure gradient terms of appropriate accuracy. However, the scientists show that a number of well-known features of the collisionless bow shock and the ion foreshock can be simulated using the ideal Ohm's law and without resolving all kinetic scales.
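For reference, the two forms of Ohm's law referred to above are, in standard textbook notation, the ideal form used in these runs,

\[ \mathbf{E} = -\mathbf{V}\times\mathbf{B}, \]

and the generalized form including the Hall and electron pressure gradient terms,

\[ \mathbf{E} = -\mathbf{V}\times\mathbf{B} + \frac{1}{ne}\,\mathbf{J}\times\mathbf{B} - \frac{1}{ne}\,\nabla\cdot\overline{\overline{P}}_e, \]

where n is the plasma number density, e the elementary charge, V the bulk velocity, J the current density and P_e the electron pressure tensor.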
One of the most intriguing findings from the first production-quality PRACE runs concerns a particular type of wave phenomenon within the Earth's foreshock. Under certain solar wind conditions the foreshock exhibits oblique, coherent, large-scale waves, although in theory these waves should propagate parallel to the solar wind magnetic field. This obliquity has been a puzzle, and it has been suggested that the incoming solar wind refracts the waves away from the parallel direction. However, Vlasiator shows that these waves are not all oblique in the same direction, which rules out refraction as an explanation, since refraction would bend all the waves the same way. The researchers are currently investigating this intriguing finding in more detail.
The development of Vlasiator has produced two open-source libraries: the parallel grid library DCCRG and phiprof, a library for monitoring the performance of large-scale codes. These are useful building blocks for a wide range of scientific and technical simulation software in which the problem can be solved on a Cartesian grid.
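As an illustration of the kind of bookkeeping such a grid library automates, the Python sketch below maps Cartesian cell coordinates to unique IDs and distributes the cells across processes in contiguous blocks. This is a minimal conceptual analogue only, not the DCCRG API (DCCRG itself is a C++ library).

# Minimal conceptual sketch of Cartesian-grid bookkeeping, of the kind
# a parallel grid library such as DCCRG automates. Illustrative only;
# this is NOT the DCCRG API.

def cell_id(i, j, k, nx, ny):
    """Unique linear ID for the cell at integer coordinates (i, j, k)."""
    return i + j * nx + k * nx * ny

def owner_rank(cid, n_cells, n_ranks):
    """Assign cells to ranks in contiguous blocks, the simplest decomposition."""
    cells_per_rank = -(-n_cells // n_ranks)  # ceiling division
    return cid // cells_per_rank

nx = ny = nz = 8
n_cells = nx * ny * nz
for i, j, k in [(0, 0, 0), (7, 7, 7)]:
    cid = cell_id(i, j, k, nx, ny)
    print(f"cell {(i, j, k)} -> id {cid}, rank {owner_rank(cid, n_cells, n_ranks=4)}")

A production library must additionally exchange ghost (neighbour) cells across process boundaries and rebalance the work as the plasma evolves, which is the kind of functionality DCCRG provides.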
In addition to Vlasiator, DCCRG was used to parallelize the GUMICS MHD simulation. While the physics captured by MHD is not as rich as that of Vlasiator, MHD is at present the only feasible approach for space weather forecasting. The development work on Vlasiator also enabled FMI's MHD code to run in real time, making it one of the few real-time MHD codes in the world.
The rich physics in the Vlasiator runs enables scientists to better identify which physics is missing from MHD-based space weather simulations. This topic will be explored further in the future and will guide the improvement of GUMICS.
The simulations performed were very large and consumed a great deal of memory. Because of the memory requirements, the researchers had to run most of the simulations on up to 30,000 cores, which was possible thanks to the petascale system Hermit of HLRS Stuttgart, made available through PRACE.
Scientific Contact:
PI: Prof Minna Palmroth – Finnish Meteorological Institute, Helsinki, Finland (FMI)
Co-PI: Prof Hannu Koskinen – University of Helsinki, Helsinki, Finland, also at FMI
Dr Sebastian von Alfthan – FMI
Dr Arto Sandroos – FMI
Dr Dimitry Pokhotelov – FMI
Dr Ilja Honkonen – FMI
MSc Yann Kempf – FMI
MSc Sanni Hoilijoki – FMI
Finnish Meteorological Institute
Earth Observation Unit
Erik Palménin aukio 1, FIN-00101 Helsinki, Finland
e-mail: Minna.Palmroth@fmi.fi