Where did the ‘No-go’ theorems go?

27 June 2000

Quark-gluon calculations are extremely difficult, so physicists have to use their ingenuity to get results. The most popular approach is to use powerful supercomputers to simulate a discrete space-time lattice. A recent workshop examined progress in the field.

At the smallest possible scales, physics calculations become extremely complicated. This is the challenge facing particle physicists.

Lattice field theories were originally proposed by 1982 Nobel laureate Ken Wilson as a means of tackling quantum chromodynamics (QCD) – the theory of strong interactions – at low energies, where calculations based on traditional perturbation theory fail.

The lattice formulation replaces the familiar continuous Minkowski space-time with a discrete Euclidean version, where space-time points are separated by a finite distance – the lattice spacing. In this way results can be obtained by numerical simulation, but the computing power needed is huge, often calling for special-purpose supercomputers.
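
As a minimal sketch in our own notation, the discretization trades the derivatives in a field's action for finite differences between neighbouring lattice sites:

\[ \partial_\mu \phi(x) \;\longrightarrow\; \frac{\phi(x + a\hat{\mu}) - \phi(x)}{a} , \]

where a is the lattice spacing and \hat{\mu} a unit vector along one of the four Euclidean directions. Continuum physics is then recovered by extrapolating the simulation results to a → 0.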

This methodology has been applied extensively to QCD: recent years have witnessed increasingly accurate calculations of many quantities, such as particle masses (including those of glueballs and hybrids) and form factors for weak decays, as well as quark masses and the strong (inter-quark) coupling constant. These results provide important pointers to future progress.

The romantic Ringberg Castle, with its panoramic view of the Bavarian Tegernsee, was the scene of a recent workshop entitled Current Theoretical Problems in Lattice Field Theory, where physicists from Europe, the US and Japan discussed and assessed recent progress in this increasingly important area of research.

Obstacles removed

Despite the many successes of lattice QCD, there are stubborn areas where little progress has been made. For instance, until recently it was thought that the lattice formulation was incompatible with the concept of a single left-handed fermion (such as the Standard Model neutrino). This notion of chirality plays a key role in both the strongly and weakly interacting sectors of the Standard Model. Furthermore, weak decays such as that of a kaon into two pions have been studied on the lattice with only limited success.

A non-perturbative treatment of such processes is highly desirable, because it is required for our theoretical understanding of direct CP violation and for the longstanding problem of explaining isospin selection rules (such as the ΔI = 1/2 rule) in weak decays. However, there have been impressive theoretical advances in both of these areas, which were discussed at the Ringberg workshop.

Gian Carlo Rossi (Rome II) gave a general introduction to lattice calculations of K→ππ decays. By the early 1990s, all attempts to study this process on the lattice had been abandoned, because it was realized that the necessary physical quantity cannot be obtained from the correlation functions computed on the lattice. This Maiani-Testa No-go theorem was analysed in great detail by Chris Sachrajda (Southampton). Laurent Lellouch (Annecy) then described how the theorem can be circumvented by treating the decay in a finite volume, where the energy spectrum of the two-pion final state is not continuous, thereby violating one of the conditions for the No-go theorem to apply.
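
The origin of this discreteness can be sketched for non-interacting pions (a simplification; the actual analysis treats the interacting case). In a periodic box of side L, each pion momentum is restricted to \vec{p} = (2\pi/L)\,\vec{n} with \vec{n} \in \mathbb{Z}^3, so the energies of a back-to-back pion pair form the discrete set

\[ E_{\vec{n}} = 2\sqrt{m_\pi^2 + \Big(\frac{2\pi}{L}\Big)^{2} |\vec{n}|^{2}} , \]

rather than the continuum available in infinite volume.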

Furthermore, the transition amplitude in finite volume can be related to the physical decay rate. An implementation of this method in a real computer simulation requires lattice sizes of about 5-7 fm. This stretches the capacities of current supercomputers to the limit, but a calculation will certainly be feasible with the next generation of machines.

Guido Martinelli (Rome I) approached the decay from a different angle by relating it to the conceptually simpler kaon-to-pion transition. This strategy has been known for some time, and recent work has concentrated on the final-state interactions between the two pions. The inclusion of these effects may influence theoretical predictions for measurements of direct CP violation. Given recent experimental progress in this sector, this is surely of great importance.

Many lattice theorists' hopes of being able to study the electroweak sector of the Standard Model had been frustrated by another famous No-go theorem, this time due to Nielsen and Ninomiya. It states that exact chiral symmetry cannot be realized on the lattice without unwanted side-effects such as fermion doubling, which, for instance, seemed to make it impossible to treat neutrinos in a lattice simulation.

Recently it has been shown how the Nielsen-Ninomiya theorem can be sidestepped: a chiral fermion (such as a neutrino) can be put on the lattice provided that its discretized Dirac operator satisfies the so-called Ginsparg-Wilson relation. Several solutions of this relation have been constructed, and the most widely used are known in the trade as "Domain Wall" and "Overlap" fermions.
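
In its standard form, the Ginsparg-Wilson relation demands that the lattice Dirac operator D satisfy

\[ D\gamma_5 + \gamma_5 D = a\, D\gamma_5 D , \]

where a is the lattice spacing. The right-hand side vanishes as a → 0, so ordinary chiral symmetry is recovered in the continuum limit, while at finite lattice spacing the relation still guarantees an exact lattice version of the symmetry.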

At Ringberg, Pilar Hernández (CERN) examined whether these solutions can be implemented efficiently in computer simulations. Clearly such technical aspects have to be settled before one can embark on more ambitious projects. Hernández concluded that the computational cost of the two formulations is comparable, but substantially higher than for conventional lattice fermions. In particular, her results indicate that the numerical effort needed to preserve chiral symmetry by simulating Domain Wall fermions is far greater than previously thought. This point was explored further during an open discussion session led by Karl Jansen (CERN) and Tassos Vladikas (Rome II). One conclusion was that conventional lattice fermions appear quite sufficient to address many – if not all – of the problems in applied lattice QCD.

As well as yielding hard numerical results, the preservation of chiral symmetry on the lattice has also been exploited in the study of more formal aspects of quantum field theories. Oliver Bär (DESY) presented recent work on global anomalies, which can now be analysed in a rigorous, non-perturbative way using the lattice framework. SU(2) gauge theory coupled to one massless, left-handed neutrino thereby leads to the lattice analogue of the famous Witten anomaly. Further work on anomalies was presented by Hiroshi Suzuki (Trieste), while Yigal Shamir (Tel Aviv) reviewed a different approach to lattice chiral gauge theories based on gauge fixing.

Among other topics discussed at Ringberg was the issue of non-perturbative renormalization, with contributions from Roberto Petronzio (Rome II), Steve Sharpe (Seattle) and Rainer Sommer (Zeuthen). The problem is to relate quantities (for example form factors and decay constants) computed on the lattice to their continuum counterparts via non-perturbatively defined renormalization factors. Such a procedure avoids the use of lattice perturbation theory, which is known to converge only very slowly.
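
Schematically, in notation of our own choosing, a bare operator measured on the lattice is converted into its renormalized continuum counterpart by a factor computed entirely within the simulation,

\[ O^{\mathrm{ren}}(\mu) = Z_O(\mu, a)\, O^{\mathrm{lat}}(a) , \]

where μ is the renormalization scale. Determining Z_O non-perturbatively sidesteps the slowly converging lattice perturbation series.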

The successful implementation of non-perturbative renormalization for a large class of operators removes a major uncertainty in lattice calculations. In addition, talks by Antonio Grassi, Roberto Frezzotti (both Milan) and Stefan Sint (Rome II) discussed recent work on QCD with an additional mass term, which is expected to protect against quark zero modes. It is hoped that this will help in the simulation of smaller quark masses.

Many other contributions – for example on two-dimensional models, Nahm dualities and the bosonization of lattice fermions – could also lead to further progress. Indeed, the variety of topics discussed at the workshop underlines that lattice field theory is a very active research area with many innovative ideas. Progress in understanding how nature works on the smallest possible scale depends on such theoretical and conceptual advances as well as sheer computer power.

The Ringberg meeting was organized by Martin Lüscher (CERN), Erhard Seiler and Peter Weisz (MPI Munich).

Directions for lattice computing

Quantum physics calculations are not easy. Most students, after having worked through the solutions of the Schrödinger equation for the hydrogen atom, take the rest of quantum mechanics on trust. Likewise, quantum electrodynamics is demonstrated with a few easy examples involving colliding electrons. This tradition of difficult calculation continues, and is even accentuated, in the physics of the quarks and gluons inside subnuclear particles.

Quantum chromodynamics – the candidate theory of quarks and gluons – can only be handled using powerful computers, and even then drastic assumptions must be made to render the calculations tractable. For example, a discrete lattice (a few femtometres across) has to replace the space-time continuum. Normally only the valence quarks, which give a particle its quantum numbers, can be taken into account (the quenched approximation), and the myriad accompanying virtual quarks and antiquarks have to be neglected.
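
Schematically, in our own notation, quenching amounts to dropping the fermion determinant from the QCD path integral, since it is this determinant that generates the virtual quark loops:

\[ \langle O \rangle = \frac{1}{Z}\int \mathcal{D}U \; \det\!\big(D[U] + m\big)\, O[U]\, e^{-S_g[U]} , \qquad \text{quenched:}\;\; \det\!\big(D[U] + m\big) \to 1 , \]

where U denotes the gluon field and S_g its action. Replacing the determinant by a constant makes the simulation far cheaper, at the price of neglecting sea-quark effects.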

The benchmark of lattice QCD is the calculation of particle masses, where encouraging results are being achieved, but physicists are still far from a complete account of the observed spectrum. Future progress in understanding subnuclear particles and their interactions advances in step with available computer power.

To point the way forward, the European Committee for Future Accelerators recently set up a panel (chaired by Chris Sachrajda of Southampton) to assess both the computing resources required for this work and the scientific opportunities that would be opened up. The panel’s main conclusions were:

* The future research programme using lattice simulations is a very rich one, investigating problems of central importance for the development of our understanding of particle physics. The programme includes detailed (unquenched) computations of non-perturbative QCD effects in hadronic weak decays, studies of hadronic structure, investigations of the quark-gluon plasma, exploratory studies of the non-perturbative structure of supersymmetric gauge theories, studies of subtle aspects of hadronic spectroscopy, and much more.

* The European lattice community is large and very strong, with experience and expertise in applying numerical methods to a wide range of physics problems. For more than 10 years it has organized itself into international collaborations when appropriate, and these will form the foundation for any future European project. Increased coordination is necessary in preparation for the 10 Tflops generation of machines.

* Future strategy must be driven by the requirements of the physics research programme. We conclude that it is both realistic and necessary to aim for machines of the order of 10 Tflops processing power by 2003. As a general guide, such machines will enable results to be obtained in unquenched simulations with similar precision to those currently found in quenched ones.

* It will be important to preserve the diversity and breadth of the physics programme, which will require a number of large machines as well as a range of smaller ones.

* The lattice community should remain alert to all technical possibilities in realizing its research programme. However, the panel concludes that it is unlikely to be possible to procure a 10 Tflops machine commercially at a reasonable price by 2003, and hence recognizes the central importance of the apeNEXT project to the future of European lattice physics.
