National Center for Scientific Research/CNRS & Aix-Marseille University (France)
Local Project ID:
HPC Platform used:
JUQUEEN of JSC
The Standard Model of particle physics is one of the great scientific achievements of the 20th century. After confirmation of the existence of the Higgs boson in 2013, physicists are now keen to see whether there is anything beyond the theory. Scientists of CNRS and Aix-Marseille University have been using lattice QCD to see whether a certain experimental measurement is indeed a glimpse of new fundamental physics.
The Standard Model of particle physics — the theory that classifies all known subatomic particles — has stood the test of time well. No experimental evidence has conclusively contradicted it since its current formulation was finalised in the mid 1970s. Despite this, there are a number of theoretical features of the model that have led many to believe that it does not tell the complete story. Scientists around the world are searching for clues to what lies beyond it, the most prominent example being the work taking place at the Large Hadron Collider, which famously confirmed the existence of the final piece of the theory – the elusive Higgs boson.
Problems arise when trying to probe the Standard Model because some processes are so complex theoretically that they cannot be solved by the more traditional “pen and paper” approaches. That makes it very difficult to determine whether phenomena incorporating these processes are consistent with the predictions of fundamental theory. One measurement that could potentially deviate from the theory's predictions is the anomalous magnetic moment of the muon, a close cousin of the electron.
What is this measurement? Fundamental particles that carry spin, such as the muon, behave like tiny magnets. This effect is characterized by the magnetic moment of the particle. However, for more accurate values of this property to be calculated, the relativistic quantum effects of particles spontaneously flitting in and out of existence in the vacuum must be taken into account. These quantum fluctuations lead to small corrections which are collectively known as the anomalous magnetic moment. Most of these corrections can be predicted from first principles, but the hadronic contributions — the contributions from the strong nuclear force that holds quarks together to form particles such as protons and neutrons — are only beginning to be computed directly from the fundamental theory due to the need for new methods and the limitations of supercomputing power.
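In standard textbook notation (not spelled out in the article), the muon's magnetic moment is expressed through the gyromagnetic factor g, and the anomalous part collects the quantum corrections described above:

```latex
\vec{\mu}_\mu = g_\mu \, \frac{e}{2 m_\mu} \, \vec{S},
\qquad
a_\mu \equiv \frac{g_\mu - 2}{2}
```

Dirac's relativistic theory predicts g = 2 exactly; the quantum fluctuations of the vacuum shift g slightly away from 2, and a_μ is precisely that shift.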
The most accurate experiment to date measured the anomalous magnetic moment of the muon with a relative precision of 5 × 10^-7. An experiment that will make a measurement four times as precise is currently being prepared. The most accurate prediction of the value had a precision of 4 × 10^-7. Interestingly, these values do not agree. “This is one of the best-known discrepancies between the Standard Model and experiments,” says Christian Hoelbling of the University of Wuppertal. “But, the current prediction of the value is based on models and interpretations of other experimental data. What we and other teams have been doing is using HPC resources to predict this value from first principles, directly from the underlying theory of the strong interaction. In this, our first attempt, we seek to compute the largest hadronic correction with sufficient accuracy to verify that the current determinations, based on other experimental data, are consistent with the fundamental theory. Eventually, the methods that are being developed will allow us to know for certain whether the experimental value actually deviates from the Standard Model.”
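To see how these precisions interact, here is some illustrative arithmetic based on the figures quoted above, under the simplifying assumption that the experimental and theoretical uncertainties are independent and therefore combine in quadrature:

```python
import math

# Relative precisions quoted in the article (dimensionless).
exp_precision = 5e-7      # best existing measurement
theory_precision = 4e-7   # best existing prediction

# If the two uncertainties are independent, they combine in quadrature.
combined = math.sqrt(exp_precision**2 + theory_precision**2)
print(f"combined relative precision now: {combined:.2e}")   # ~6.4e-7

# The planned experiment aims to be four times as precise, so the
# theoretical uncertainty would then dominate the comparison.
new_exp_precision = exp_precision / 4
new_combined = math.sqrt(new_exp_precision**2 + theory_precision**2)
print(f"with the planned experiment:     {new_combined:.2e}")  # ~4.2e-7
```

The point of the exercise: once the new experiment runs, the comparison will be limited almost entirely by the theory side, which is why first-principles lattice calculations of the hadronic contribution matter.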
A group led by Dr. Laurent Lellouch of CNRS and Aix-Marseille University has been using the JUQUEEN supercomputer in Germany, Fermi in Italy, as well as Turing, a 1.2-petaflop IBM BlueGene/Q system hosted by GENCI in France, to investigate this potential lead. The group’s method for predicting the value for the hadronic contribution to the anomalous magnetic moment of the muon is a two-stage process, the first of which is generating typical vacuum states of the theory. This involves simulating a small piece of space-time on the supercomputer. “We took a very small volume — no bigger than the volume of a few nuclei — and a very small period of time, and then solved the quantum chromodynamics equations used to describe the strong nuclear force.”
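The idea of “generating typical vacuum states” can be illustrated with a toy model far simpler than QCD. The sketch below applies Metropolis sampling to a one-dimensional lattice scalar field; it is only an analogy for the kind of stochastic sampling involved, not the collaboration's actual algorithm, and all parameters are made up for illustration:

```python
import math
import random

random.seed(1)

N = 32           # lattice sites in a toy 1D "space-time"
mass2 = 0.5      # mass-squared parameter of the toy action
phi = [0.0] * N  # field configuration, started "cold"

def action(phi):
    """Euclidean action of a free scalar field on a periodic 1D lattice."""
    S = 0.0
    for i in range(N):
        dphi = phi[(i + 1) % N] - phi[i]
        S += 0.5 * dphi**2 + 0.5 * mass2 * phi[i]**2
    return S

def metropolis_sweep(phi, step=0.5):
    """One Metropolis update per site: propose a local change and accept
    it with probability exp(-dS), so configurations are drawn with weight
    exp(-S) -- 'typical vacuum states' of the toy theory."""
    for i in range(N):
        old = phi[i]
        s_old = action(phi)
        phi[i] = old + random.uniform(-step, step)
        d_s = action(phi) - s_old
        if d_s > 0 and random.random() > math.exp(-d_s):
            phi[i] = old  # reject: restore the old value

# Thermalise, then measure a simple observable on one sampled state.
for _ in range(200):
    metropolis_sweep(phi)
print("mean phi^2 on a sampled configuration:",
      sum(x * x for x in phi) / N)
```

In real lattice QCD the field variables are SU(3) matrices rather than single numbers, the lattice is four-dimensional, and far more sophisticated update algorithms are used, but the principle — drawing configurations with probability weighted by the action — is the same.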
Quantum chromodynamics (QCD) is very hard to treat numerically: the fundamental degrees of freedom of the theory (quarks and gluons) are very different from the states that occur in nature. Usual perturbative techniques are insufficient. The problem is thus approached using a technique called lattice QCD. The QCD equations are discretised on a space-time lattice, and then the quantum mechanical path integration is performed stochastically. This approach is extremely demanding numerically. The integration has to be performed on a space with a dimension given by the number of lattice points times the internal degrees of freedom. Since the lattice has to be both fine enough to reproduce all essential details and big enough to fit the entire system being investigated, this dimension is typically of the order of several billion. Even for petaflop-class installations such as JUQUEEN, Fermi and Turing, it is a challenging task to perform this stochastic integration.
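The scale of that integration space can be made concrete with back-of-the-envelope arithmetic. The lattice size below is purely illustrative (the article does not state the project's actual parameters); the counting of degrees of freedom per site is the standard one for a gauge field:

```python
# Illustrative lattice dimensions (NOT the project's actual parameters).
L, T = 96, 192           # spatial and temporal extent in lattice sites
sites = L**3 * T

# Each site carries 4 SU(3) gauge links (one per space-time direction);
# an SU(3) matrix has 8 real parameters.
links_per_site = 4
params_per_link = 8

dimension = sites * links_per_site * params_per_link
print(f"{sites:,} sites -> integration space of {dimension:,} dimensions")
# -> 169,869,312 sites -> integration space of 5,435,817,984 dimensions
```

A few billion dimensions, in line with the figure quoted in the text — which is why the path integral can only be estimated stochastically, by sampling configurations rather than integrating exhaustively.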
This part of the process has now been completed, and the very complex and numerically challenging analysis of the results is now taking place. The results of this analysis are due to be published shortly. Hoelbling explains the possible outcomes: “If we can confirm the experimental number of this magnetic moment, it would mean that the previous estimates of the number, which were not obtained from fundamental theory, were erroneous due to something which had been overlooked. This would basically close the door on the matter and we would be able to say that this measurement is consistent with the Standard Model and does not require new physics to explain it. This is the most likely outcome, but from our point of view would also be the least exciting!
“If, on the other hand, the methods used eventually allow us to confirm that there was a real discrepancy, we would have found something very interesting. The first action would be to check whether there is something within the Standard Model, which has so far been overlooked, that can explain this. This seems unlikely, however. After that, we would have to conclude that there was a new particle that was causing this value to differ from what we expect.”
In time, any evidence for a new particle would come with a prediction for its mass, which could then be searched for directly at the Large Hadron Collider. “Our work would be providing indirect evidence of something new, which would then provide a strong incentive to search directly for this new particle via experiments. If evidence was found for the existence of a new particle, then we would be entering a new realm of physics which would require us to expand on the Standard Model.”
If (or when) the Standard Model is proven to be incomplete, it could well happen through a combination of the complementary fields of lattice QCD and particle accelerator physics. With the LHC already pushing the latter field beyond what has ever been done before, the highest level of supercomputing resources offered by PRACE could potentially do the same for lattice QCD.
This project was made possible through PRACE (Partnership for Advanced Computing in Europe) with HPC system JUQUEEN of the Jülich Supercomputing Centre (JSC) and HPC system Fermi at CINECA in Italy serving as computing platforms.
The Budapest-Marseille-Wuppertal Collaboration (P.I. Laurent Lellouch, CNRS & Aix-Marseille University, France)
This is a reprint of the article published in the PRACE Annual Report 2015.
Laurent Lellouch (CNRS Research Director)
Centre de Physique Théorique
Campus de Luminy, Case 907
163 Avenue de Luminy
F-13288 Marseille cedex 9 (France)
e-mail: lellouch [at] cpt.univ-mrs.fr