TMDs and Parallel Transport in QCD

Principal Investigator:
Andreas Schäfer

Institute for Theoretical Physics, Regensburg University

Local Project ID: pn56yo

HPC Platform used:
SuperMUC and SuperMUC-NG of LRZ

Date published: December 2020


In January 2020, the US Department of Energy announced plans to build a new Electron-Ion Collider (EIC) at Brookhaven National Laboratory. One of the primary goals of this accelerator will be the exploration of Transverse Momentum Dependent parton distributions (TMDs) in the proton. These functions have many fascinating properties. For example, they parameterize angular asymmetries for mesons produced in the scattering of electrons off transversely polarized protons, first observed by the HERMES collaboration in 2005 [1]. These asymmetries are counter-intuitive at first sight, as they seem to violate fundamental time-reversal symmetry, which is a property of Quantum Chromodynamics (QCD), the theory of quarks and gluons.

This asymmetry can be related to another counter-intuitive but well-understood quantum mechanical effect, the Aharonov-Bohm effect, see Figure 1: the interference pattern of a two-slit experiment with electrons shifts when a magnetic field is placed between the slits, even if the field is shielded such that no electron can possibly penetrate it.

While there are differences between these two phenomena, their fundamental origin is the same and applies to all gauge theories: the gauge degrees of freedom lead to non-trivial parallel transport, which introduces phase factors and interference effects that result in surprising behavior of reaction probabilities.

Gauge theories have a geometric interpretation which links these phases to so-called gauge links. The non-trivial parallel transport of gauge links is generated by the field-strength tensor of the gauge theory, in much the same way as non-trivial parallel transport in General Relativity is generated by the energy-momentum tensor.
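To make the notion of parallel transport concrete: on the lattice, a gauge link (Wilson line) is the path-ordered product of the link matrices along a path, and this product is unitary, so it acts as a pure rotation in color space. The following toy Python sketch illustrates this; it is not the production Chroma code, and all function names and the choice of random near-identity "links" are invented for this example.

```python
import numpy as np

def toy_link(n=3, eps=0.3, rng=None):
    """Toy gauge link: a unitary matrix close to the identity, built as
    exp(i * eps * H) with H Hermitian (illustrative only)."""
    rng = np.random.default_rng() if rng is None else rng
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    h = (a + a.conj().T) / 2                 # Hermitian generator
    w, v = np.linalg.eigh(h)
    return v @ np.diag(np.exp(1j * eps * w)) @ v.conj().T

def wilson_line(links):
    """Path-ordered product of link matrices along a lattice path."""
    u = np.eye(links[0].shape[0], dtype=complex)
    for link in links:
        u = u @ link                         # ordered multiplication = parallel transport
    return u

rng = np.random.default_rng(0)
path = [toy_link(rng=rng) for _ in range(10)]
u = wilson_line(path)
# Parallel transport is unitary: U U^dagger = 1, so it only rotates color phases.
print(np.allclose(u @ u.conj().T, np.eye(3)))
```

Because each link is unitary, the product stays unitary no matter how long the path is; observables are then sensitive only to the interference of such phases along different paths, which is the lattice analogue of the Aharonov-Bohm effect described above.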

General Relativity is substantially more complex than Special Relativity, and likewise the physics of TMDs is substantially more complex than that of parton distribution functions (PDFs) or purely collinear physics, which depends only on longitudinal coordinates and momenta. In fact, for a purely collinear process all gauge links can be set to one and thus simply disappear. In an analogous manner, any process which probes only a straight line cannot detect the warping of space-time in General Relativity.

TMDs depend on both longitudinal and transverse momenta and are therefore sensitive to all such effects in QCD. This makes them both very complicated and very interesting objects.

Of the many fascinating new properties of TMD physics, one of the most characteristic is the appearance of “soft factors”. Gauge links can be interpreted as a coherent superposition of arbitrarily many soft gluons, and the interactions caused by these gluons cannot be described by the standard rules of quantum field theory; this necessitated the introduction of new rules and a new dependence on a parameter called rapidity. Comparing measurements at different rapidities requires knowledge of the so-called “rapidity anomalous dimension”, which is a function of the transverse displacement, b, of the parallel transport. For values of b which are not very small, this crucial function could until very recently only be guessed at, not calculated. This has changed this year, as described in the next section.
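Schematically, and in one common convention (notation and signs vary between groups; this is a sketch, not the specific definitions used in this project), the rapidity anomalous dimension $\mathcal{D}(b,\mu)$ governs the evolution of a TMD $F$ in the rapidity-like scale $\zeta$:

```latex
\zeta \frac{d}{d\zeta} \ln F(x, b; \mu, \zeta) = -\mathcal{D}(b, \mu)
```

The Collins-Soper kernel $K(b,\mu)$ mentioned below is, in this convention, related to it by $K = -2\mathcal{D}$; knowing this single function of $b$ is what allows measurements taken at different rapidities (i.e., different $\zeta$) to be compared.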

Before discussing these results, a general remark on the status of Lattice QCD is in order. Because both the validity of QCD as such and the soundness of Lattice QCD are established beyond reasonable doubt, Lattice QCD has acquired a status similar to that of experiment, and consequently the control of all sources of error, especially systematic errors, has become equally important. Just as the determination of systematic uncertainties is usually the most demanding task of a high-energy experiment, so it is for Lattice QCD. The calculations we performed are especially well suited to controlling these uncertainties.

Results and Methods

Because TMDs have properties which differ significantly from those of other lattice observables, and because so many of their properties are still unknown, the emphasis of research for the different groups working on TMDs lies on the development of methods to extract the quantities of interest from a lattice simulation.

In our project we used a new method, published only recently [2], to analyze the TMD correlators we calculated and to extract the rapidity anomalous dimension, also called the Collins-Soper kernel. Among other advantages, this method allows one to extract the same information from, in principle, 16 independent correlations. While for many of them the signal-to-background ratio is probably too small for practical use, this should provide a valuable tool to estimate systematic uncertainties. So far we have analyzed only three of these correlations, and the results agree within errors.

We have also analyzed different lattice spacings, which allows us to study carefully the continuum limit, usually the least controlled source of systematic error. This analysis is still ongoing. Figure 2 shows one sample plot. It is a very busy plot because it contains results from five different approaches for the rapidity anomalous dimension as a function of the transverse distance b.

The main message of this figure is that all of these methods give compatible results. The blue-shaded band is the result of a fit to experimental data. Because these data do not really constrain its form for b larger than about 0.4 fm, the shape of the band there is determined primarily by the chosen parameterization. The result of a perturbative QCD calculation at next-to-next-to-next-to-leading order (NNNLO) plus resummation is likewise unreliable beyond 0.4 fm. The MIT results (labeled Bernstein and Hermite) are from a quenched simulation and carry hard-to-estimate systematic uncertainties in addition to the plotted ones.

The US-German-Chinese LPC collaboration, to which we also belong, uses a completely new approach to extract the rapidity anomalous dimension from CLS configurations; that work was not part of this project. LPC has attributed a generous systematic error to these results in view of several conceptual issues which are still under debate.

The pink, purple and brown dots are results of the present project. Ours were full QCD simulations (in contrast to quenched ones), and we also included so-called higher-twist effects in our analysis, for which we therefore also obtained quantitative estimates. The large systematic errors are primarily driven by these higher-twist effects. All lattice results appeared only this year. Taken together, they demonstrate that this crucial fundamental function of modern hadron physics should be known with high precision within very few years.

This project was a PRACE (Partnership for Advanced Computing in Europe) project. PRACE distributes computing resources contributed by European HPC centers, including LRZ. We were granted 44 million core hours (mch), starting on SuperMUC Phase 2 and finishing on SuperMUC-NG (about 33 mch). In addition, our project was supported with an extra allocation of computing time granted to frequent LRZ users in the start-up phase of SuperMUC-NG. Typically we used 24 nodes with 1152 cores.

The software we use is largely public domain within the lattice QCD community. It goes under the name of Chroma and is written primarily in C++, building on the QDP++ library. The main computational effort in lattice simulations goes into the inversion of huge sparse matrices, for which one can choose among a large variety of highly optimized codes using different algorithms (we used, e.g., the OpenQCD multigrid solver [4]). We did not optimize these solvers but used them as a black box. In this project we also used code we co-developed as members of the TMD Collaboration in the US several years ago for the specific purpose of analyzing TMDs, see [3]. The configurations analyzed were generated by the CLS collaboration (partly by us) in a large and long-term effort; these data are stored in Regensburg, Mainz and Berlin, and their generation was not part of this project. For data handling we used primarily HDF5.
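To give a flavor of the core numerical task, solving a huge sparse linear system with a Hermitian positive-definite operator can be sketched with a minimal conjugate-gradient iteration. This is a deliberately simple, dense-matrix stand-in for the highly optimized multigrid solvers actually used in production; the "Dirac-like" test matrix and all names here are illustrative.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
    """Minimal conjugate-gradient solver for A x = b with A Hermitian
    positive definite -- a toy stand-in for optimized lattice solvers."""
    x = np.zeros_like(b)
    r = b - A @ x                            # initial residual
    p = r.copy()                             # initial search direction
    rs = np.vdot(r, r).real
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / np.vdot(p, Ap).real     # step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = np.vdot(r, r).real
        if np.sqrt(rs_new) < tol:            # converged on residual norm
            break
        p = r + (rs_new / rs) * p            # new conjugate direction
        rs = rs_new
    return x

# Toy Hermitian positive-definite test matrix (illustrative only)
rng = np.random.default_rng(1)
n = 64
m = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = m @ m.conj().T + n * np.eye(n)           # shift ensures positive definiteness
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b, atol=1e-6))
```

Production lattice operators are sparse, badly conditioned, and far larger, which is why preconditioned multigrid methods such as the OpenQCD solver are used instead of plain conjugate gradient; the structure of the iteration, however, is the same.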

Research Team

Michael Engelhardt2, Piotr Korcyl3, Andreas Schäfer (PI)1, Maximilian Schlemmer1, Alexey Vladimirov1, Thomas Wurm1, Christian Zimmermann1

1Institute for Theoretical Physics, Regensburg University
2New Mexico State University, Las Cruces, USA
3Jagiellonian University, Krakow, Poland

Scientific Contact

Prof. Dr. Andreas Schäfer
Institute for Theoretical Physics, Regensburg University
D-93040 Regensburg/Germany
e-mail: andreas.schaefer [@] physik.uni-regensburg.de

NOTE: This simulation project was made possible by PRACE (Partnership for Advanced Computing in Europe) allocating a computing time grant on GCS HPC system SuperMUC (respectively SuperMUC-NG) of the Leibniz Supercomputing Centre (LRZ) in Garching/Munich.

LRZ project ID: pn56yo

December 2020

Tags: LRZ Universität Regensburg EPP PRACE