
Quantum entanglement in high-energy physics 2023

The use of quantum-information methods and tools in high-energy physics has attracted growing attention in our community in recent years, following the pioneering workshop held at Brookhaven National Laboratory in 2018. We are convinced that the time is now right for a follow-up meeting and a discussion of new achievements.

 

Registration:

Registration and the call for abstracts closed on 14 April.

Conference fee:

There is a conference fee of 200 EUR (900 PLN) which covers lunches and coffee breaks. Please make a bank transfer to one of the following bank accounts:

payment in PLN: PL07 1240 4722 1111 0000 4855 9692
payment in EUR: PL04 1240 2294 1978 0010 7072 2467

THE PRESENT AND FUTURE OF HEAVY FLAVOUR AND EXOTIC HADRON SPECTROSCOPY

A major goal in strong-interaction physics is to understand the nature of hadrons, which make up visible matter, and much research activity revolves around two fundamental questions: what are hadrons made up of and how does quantum chromodynamics (QCD), the strong-interaction component of the Standard Model, produce them? Although these questions are simple, the answers may not be. To address these questions, spectroscopy is a valuable and time-honoured tool, as it enables us to understand the structure of mesons, baryons and exotics and how they are produced. In this context, the recent discovery of many new hadronic states, in particular the plethora of observed X, Y, Z states, is exciting, as these objects challenge the commonplace view of hadrons as either quark-antiquark or three-quark colour-singlet states.

Experimental investigations of the hadron structure and spectrum are performed via hadron-hadron scattering processes, photo- and electro-production by nucleons or, more recently, by means of heavy-meson decays at world-wide accelerator facilities. In the last decade, these investigations have yielded an enormous amount of data, which have vastly improved our knowledge of the baryon and meson spectrum and enabled us to establish the existence of new states, together with an empirical determination of their angular momentum, content, and spin. Recent highlights are observations of multi-quark states outside our well-known hadronic pictures, which have been interpreted as the long sought-after penta- and tetraquark systems.

However, identifying new states and their quantum numbers requires complex analysis (so-called partial wave analysis), which sometimes relies on model assumptions. For many of the new states, we still do not know the quantum numbers. Different theoretical models for the structure of the new states give different predictions of their quantum numbers.  Therefore, the composition of many states remains controversial.  Indeed, some of these newly discovered hadrons seem to fit the picture of compact multi-quark states, while others may qualify as molecular states or both, i.e. the superposition of a constituent-quark core and a meson cloud, and one of the main goals of this workshop will be to discuss how to distinguish them.

Dark Matter and Stars: Multi-Messenger Probes of Dark Matter and Modified Gravity

The International Conference “Dark Matter and Stars: Multi-Messenger Probes of Dark Matter and Modified Gravity” aims to bring together scientists working across the different research fields of astrophysics, cosmology, and modified gravity. We want to look at the dark matter problem from different perspectives, considering it to be of particle nature, as well as modification of gravity. This meeting is intended to initiate cross-field discussions of dark matter searches, their current status, and future prospects.

CONFERENCE TOPICS

  • Dark matter in compact stars (neutron stars, white dwarfs, exotic stars)
  • Multi-messenger and gravitational wave probes of dark matter
  • Models of dark matter
  • Cosmology
  • Modified gravity

We seek to encourage dialogue between different research groups to enhance collaboration and help to improve our understanding of dark matter. The conference is also intended to serve as an introduction to the dark-matter research field, and we encourage attendance by young scientists, including PhD students.

The meeting will be held at the Centro de Congressos, Center for Astrophysics and Gravitation, Instituto Superior Técnico, University of Lisbon, Portugal.

PARTICIPANTS

Registration for the conference is free of charge. Maximum attendance is 120, to ensure all participants are comfortable and have ample opportunities to interact with one another. The selection of the final participant list is the responsibility of the organizing committee. Participants are chosen according to availability and conference goals, as explained above.

Majorana neutrinos remain at large

Majorana Demonstrator cryostat

Neutrinoless double-beta decay (0νββ) remains as elusive as ever, following publication of the final results from the Majorana Demonstrator experiment at SURF, South Dakota, in February. Based on six years’ monitoring of ultrapure ⁷⁶Ge crystals, corresponding to an exposure of 64.5 kg × yr, the collaboration has confirmed that the half-life of 0νββ in this isotope is greater than 8.3 × 10²⁵ years. This translates to an upper limit on the effective neutrino mass mββ of 113–269 meV, and complements a number of other 0νββ experiments that have recently concluded data-taking.
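
For context, the mass limit follows from the half-life limit via the standard light-Majorana-neutrino-exchange relation sketched below; the spread of the quoted 113–269 meV range comes mainly from the different nuclear-matrix-element calculations, which are not reproduced here.

```latex
% Standard 0nbb rate formula (light-neutrino exchange), a sketch of how a half-life
% limit translates into an effective-mass limit:
%   G^{0\nu}: phase-space factor,  M^{0\nu}: nuclear matrix element,  m_e: electron mass
\left[T_{1/2}^{0\nu}\right]^{-1} \;=\; G^{0\nu}\,\bigl|M^{0\nu}\bigr|^{2}\,
\frac{\langle m_{\beta\beta}\rangle^{2}}{m_e^{2}}
\qquad\Longrightarrow\qquad
\langle m_{\beta\beta}\rangle \;\propto\; \frac{1}{\sqrt{T_{1/2}^{0\nu}}}
```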

Whereas double-beta decay is known to occur in several nuclides, its neutrinoless counterpart is forbidden by the Standard Model. That’s because it involves the simultaneous decay of two neutrons into two protons with the emission of two electrons and no neutrinos, which is only possible if neutrinos and antineutrinos are identical “Majorana” particles such that the two neutrinos from the decay cancel each other out. Such a process would violate lepton-number conservation, possibly playing a role in the matter–antimatter asymmetry in the universe, and be a direct sign of new physics. The discovery that neutrinos have mass, which is a necessary condition for them to be Majorana particles, motivated experiments worldwide to search for 0νββ in a variety of candidate nuclei.

Germanium-based detectors have an excellent energy resolution, which is key to resolving the energy of the electrons emitted in potential 0νββ decays. The Majorana Demonstrator is also located 1.5 km underground, with low-noise electronics and ultrapure in-house-grown electroformed copper surrounding the detectors to shield them from background events. Despite a lower exposure, the collaboration was able to achieve similar limits to the GERDA experiment at Gran Sasso National Laboratory, which set a lower limit on the ⁷⁶Ge 0νββ half-life of 1.8 × 10²⁶ yr. Also among the projects of the collaboration is an ongoing search for the influence of dark-matter particles in the decay of metastable ¹⁸⁰ᵐTa – nature’s rarest isotope. Although no hints have been found so far, the search has already improved the sensitivity of dark-matter searches in nuclei significantly.


Other experiments, such as KamLAND-Zen and EXO-200, use ¹³⁶Xe to search for 0νββ. While the former recently set the most stringent limit of 2.3 × 10²⁶ yr and is ongoing, the latter arrived at a value of 3.5 × 10²⁵ yr with a total ¹³⁶Xe exposure of 234.1 kg × yr based on its full dataset. Searches at Gran Sasso with CUORE, using a 1 tonne × yr exposure of ¹³⁰Te, led to a half-life limit of 2.2 × 10²⁵ yr, while CUORE’s successor CUPID-0, which used ⁸²Se with a total exposure of 8.82 kg × yr, reached a limit of the order of 10²³ yr.

Having demonstrated the required sensitivity for 0νββ detection in ⁷⁶Ge, the designs of the Majorana Demonstrator and GERDA have been incorporated into the next-generation experiment LEGEND-200, which uses high-purity germanium detectors surrounded by liquid argon. The experiment, based at Gran Sasso, started operations last spring and could have initial results later this year, says co-spokesperson Steven Elliott (LANL): “Once all the detectors are installed, we plan to run for five years, while the next stage, LEGEND-1000, is proceeding through the DOE Critical Decision process. We hope to begin construction in summer 2026, with first data available early next decade.”

LHCb sees evidence for a new tetraquark state

LHCb figure 1

Half a century since its inception, quantum chromodynamics (QCD) continues to prove itself as the correct description of the strong interaction between quarks and gluons. At low energies, however, perturbative calculations in QCD are not possible. Therefore, understanding the properties of hadrons usually requires the development of phenomenological models.

The study of exotic hadrons made up of more than three quarks offers a powerful way to gain a deeper understanding of the non-perturbative behaviour of QCD. The LHC has so far discovered no fewer than 23 new exotic hadrons, most of which were first observed by the LHCb experiment. In March 2021 the LHCb collaboration reported the observation of two tetraquarks with cc̄us̄ quark content – named Tθψs1(4000)⁺ and Tθψs1(4220)⁺ – in the decay B⁺ → J/ψφK⁺. Now, based on a study of the isospin-symmetry-related decay B⁰ → J/ψφK⁰S using a sample of about 2000 candidate events, the collaboration has found evidence for a new tetraquark state, Tθψs1(4000)⁰, with a minimal quark content of cc̄ds̄. The name of the new state follows a convention introduced by LHCb in 2022 to help simplify the exotic-hadron vista.

The Tθψs1(4000)⁰ state was found as a resonance in the J/ψK⁰S mass spectrum through an amplitude analysis, and is characterised as a horizontal band in the Dalitz plot (figure 1). Imposing isospin symmetry for all intermediate states except for the Tθψs1(4000)⁺/⁰ in the two B-meson decays, the signal significance is measured to be 4.0σ. The mass and width are found to be equal to those of the Tθψs1(4000)⁺ within uncertainties, which is consistent with the new state being an isospin partner of the Tθψs1(4000)⁺. If isospin symmetry between the two states is further applied, the Tθψs1(4000)⁰ significance increases to 5.4σ.

The Tθψs1(4000)⁺ and Tθψs1(4000)⁰ states are not the only pair of isospin partners of hidden-charm tetraquark candidates with strangeness. Recently the BESIII collaboration reported signals of the Tψs(3985)⁺ and Tψs(3985)⁰ states with minimal quark contents of cc̄us̄ and cc̄ds̄, respectively. Although the tetraquark candidates seen by BESIII and LHCb have similar masses, the natural widths measured by each experiment are significantly different, indicating that they are distinct states.

Further studies and theoretical inputs are needed to determine the inner structure of such hidden-charm tetraquark candidates, for example whether they are compact tetraquarks, hadron molecules or produced due to kinematic effects. Despite continuous efforts, the detailed mechanisms responsible for binding multi-quark states have remained mysterious. With the start of LHC Run 3 and a new upgraded detector, the LHCb collaboration can look forward to finding further exotic states that shed light on the low-energy behaviour of QCD relevant to hadronic matter.

Strong coupling probed beyond the TeV scale

ATLAS figure 1

Due to asymptotic freedom, the coupling of quantum chromodynamics (QCD), the theory that describes the strong interaction acting on quarks and gluons, becomes weaker at short-distance (high-energy) scales. The value of the strong coupling constant αs at different scales can be determined using experimental observables that characterise the geometrical distribution of the outgoing hadrons from particle collisions. Among such observables are energy–energy correlations, which are obtained by computing the angular difference between all the possible pairs of final-state particles, weighted by the product of the normalised energies of the two particles involved.

Precision tests

Energy–energy correlations had a significant impact on early precision tests of QCD, such as those performed at the PETRA e⁺e⁻ collider at DESY, where the gluon was discovered, and on the determination of αs. At hadron colliders such as the LHC, where the longitudinal momentum of the colliding partons is unknown, longitudinally invariant quantities are defined instead. The transverse energy–energy correlation (TEEC) function is defined as the transverse-energy-weighted azimuthal-angular distribution of jet pairs, covering the full cos(φ) range, where φ is the azimuthal angular difference between the two jets. The TEEC distribution peaks at the edges of the cos(φ) range due to self-correlations, i.e. the angle between a jet and itself, and due to di-jet back-to-back configurations arising from momentum conservation between both jets in the transverse plane. Moreover, additional gluon radiation gives rise to a central plateau whose height is sensitive to the value of αs.
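
As a concrete illustration of the observable defined above, the short Python sketch below builds a TEEC histogram from lists of jets. The binning, the jet inputs and the pair convention (all ordered pairs, including a jet paired with itself) are illustrative assumptions, not the ATLAS analysis choices.

```python
# Minimal TEEC sketch: each pair of jets in an event contributes at cos(delta phi)
# with a weight given by the product of the two transverse energies, normalised to
# the squared sum of transverse energies in the event.
import numpy as np

def teec_histogram(events, n_bins=40):
    """events: list of jet lists; each jet is an (ET, phi) pair."""
    hist = np.zeros(n_bins)
    for jets in events:
        et = np.array([jet[0] for jet in jets])
        phi = np.array([jet[1] for jet in jets])
        sum_et2 = et.sum() ** 2
        for i in range(len(jets)):
            for j in range(len(jets)):          # all ordered pairs, incl. i == j
                cos_dphi = np.cos(phi[i] - phi[j])
                weight = et[i] * et[j] / sum_et2
                k = min(int((cos_dphi + 1.0) / 2.0 * n_bins), n_bins - 1)
                hist[k] += weight
    bin_width = 2.0 / n_bins
    return hist / (hist.sum() * bin_width)      # normalise the shape to unit area

# toy usage: two nearly back-to-back jets plus a softer third jet
teec = teec_histogram([[(120.0, 0.1), (110.0, 3.1), (40.0, 1.5)]])
```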

ATLAS has recently measured the TEEC distributions in multi-jet events and used them to determine αs. The analysis is performed using the full data sample (139 fb⁻¹) of proton–proton collisions at a centre-of-mass energy of 13 TeV recorded during LHC Run 2. The measurements are presented in bins of the scalar sum of the transverse momenta of the two leading jets in the collision event (HT2), and are corrected for detector effects. By fitting theoretical calculations to the experimental data, the value of αs is determined in different kinematic regions, thereby testing the running of the strong coupling strength at high energy scales.
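
The extraction step described here can be pictured, under strong simplifications, as a one-parameter chi-square scan of a theory prediction against the measured TEEC bins. The theory callable, the scan range and the toy inputs below are hypothetical placeholders, not the ATLAS NNLO machinery.

```python
# Schematic one-parameter chi-square scan for alpha_s, sketching the fit described
# in the text. "theory" maps a trial alpha_s(mZ) to predicted TEEC bin contents.
import numpy as np

def chi2(alpha_s, data, sigma, theory):
    pred = theory(alpha_s)
    return np.sum(((data - pred) / sigma) ** 2)

def fit_alpha_s(data, sigma, theory, scan=np.linspace(0.110, 0.125, 301)):
    chi2_values = np.array([chi2(a, data, sigma, theory) for a in scan])
    best = scan[np.argmin(chi2_values)]
    ok = scan[chi2_values <= chi2_values.min() + 1.0]   # delta(chi2) = 1 interval
    return best, ok.min(), ok.max()

# toy usage with a pseudo-theory that scales linearly with alpha_s (illustrative only)
toy_theory = lambda a: np.array([1.0, 2.0, 3.0]) * (a / 0.118)
data = np.array([1.0, 2.0, 3.0])
sigma = np.array([0.05, 0.05, 0.05])
print(fit_alpha_s(data, sigma, toy_theory))
```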

The results are 30% more precise than previous ATLAS measurements at 7 and 8 TeV, with a total systematic uncertainty of the order of 2.5% for the TEEC functions. For the extraction of αs, the precision of the theoretical predictions is as important as that of the data. Predictions as a function of the value of αs(mZ), where mZ is the Z-boson mass, are calculated at next-to-next-to-leading order (NNLO) in perturbative QCD for three-jet configurations and corrected for non-perturbative effects, reducing the theoretical uncertainties in the central plateau from 6% at next-to-leading order (NLO) down to 2% for the NNLO predictions.

ATLAS figure 2

The ratio of data to theoretical prediction for the TEEC functions in the various HT2 bins is shown in figure 1. The overall description of the shape at NNLO is found to be in agreement with the data, thus confirming our understanding of strong interactions over a large range of momentum transfers. Values of αs are determined from individual fits of the theoretical predictions to data in each HT2 bin. In addition, a global fit to the measured TEEC distributions results in αs(mZ) = 0.1175 ± 0.0001 (stat.) ± 0.0006 (syst.) +0.0034/−0.0017 (theo.), where the theoretical uncertainty is dominated by the scale uncertainties that estimate the contribution of higher-order corrections. All extracted values of αs agree with the recent world average.

This result represents the first αs determination with NNLO accuracy in three-jet production. Figure 2 compares the fitted values of αs with previous determinations from ATLAS and other experiments as a function of the momentum scale. The measurements clearly exhibit asymptotic freedom and show a significant improvement in precision compared with previous measurements.

Multi-strange production constrains hadronisation

ALICE figure 1

One of the fundamental questions in quantum chromodynamics is how hadrons are produced in high-energy collisions. More specifically: what determines the relative production rates of baryons, which contain three valence quarks, and mesons, which consist of a quark and an antiquark?

In electron–positron, electron–proton and proton–proton collisions, where a small number of particles is produced, the hadronisation process is expected to be universal: it does not depend on the type of colliding particles, but only on the parent quark or gluon. It has been found, however, that in proton–proton and proton–lead collisions at LHC energies, the baryon-to-meson ratios p/π, Λ/K⁰S and Λc/D⁰ are significantly larger than in e⁺e⁻ and ep collisions. This enhancement, at intermediate transverse momentum pT (1–5 GeV), in small systems is qualitatively similar to that observed in heavy-ion collisions, e.g. between lead ions. However, in heavy-ion collisions this behaviour has been related to the interplay between hadronisation and the creation and expansion of a hot and dense quark–gluon plasma (QGP) via so-called quark recombination involving partons from within the QGP.

To shed light on hadron-production mechanisms in collisions at the LHC, ALICE has performed a novel study of the production of strange and multi-strange hadrons inside and outside energetic jets in proton–proton and proton–lead collisions. The method separates particles produced in association with a hard scattering process (within jets) from those produced in soft processes with low momentum transfer that dominate the underlying event.
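
The in-jet versus underlying-event separation can be illustrated with a simple angular-distance criterion: particles within a cone around the reconstructed jet axis are classed as jet-associated, the rest as underlying event. The cone radius and the (η, φ) distance measure below are generic choices for illustration, not the ALICE analysis definitions.

```python
# Toy classification of particles as inside or outside a jet cone, illustrating the
# separation of jet-associated and underlying-event particles described in the text.
import math

def delta_r(eta1, phi1, eta2, phi2):
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi)
    return math.hypot(eta1 - eta2, dphi)

def split_by_jet_cone(particles, jet_axis, r_cone=0.4):
    """particles: list of (eta, phi) pairs; jet_axis: (eta, phi) of the jet."""
    in_jet, out_of_jet = [], []
    for eta, phi in particles:
        if delta_r(eta, phi, *jet_axis) < r_cone:
            in_jet.append((eta, phi))
        else:
            out_of_jet.append((eta, phi))
    return in_jet, out_of_jet

# toy usage: one particle close to the jet axis, one far away
inside, outside = split_by_jet_cone([(0.1, 0.2), (2.0, -1.0)], jet_axis=(0.0, 0.0))
```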

The results show that the strange baryon-to-meson Λ/K⁰S ratio enhancement seen in the inclusive measurement is absent within the jet cone and restricted to soft-particle production processes outside the jet cone (figure 1, left). On the other hand, the multi-strange to single-strange hyperon yield ratio Ω±/Λ (figure 1, right) shows a similar pT dependence in the interval 2 < pT < 5 GeV inside and outside the jet cone, while different behaviour is observed for the Ξ±/Λ ratio (not shown). This suggests that the baryon production mechanism also depends on the strangeness content of the baryons; the production of hyperons in jets becomes similar to that in the underlying event for baryons with large strangeness content.


These measurements provide new input to understand the relative contribution of soft and hard processes to multi-strange hadron production, and thus will help to improve the modelling of particle production at the LHC. To illustrate how, two calculations with different hadronisation models are shown in the figure: one model includes string formation beyond leading colour (CR-BLC, blue band), while the other includes colour ropes (red dashed line). While one of the models describes the Λ/K⁰S ratio inside jets, the other shows a better agreement with the Ω±/Λ ratio.

ALICE has also investigated the multiplicity dependence of strange baryon-to-meson and baryon-to-baryon ratios in jets in proton–lead collisions and compared it with those in proton–proton collisions. Within the current experimental precision, no difference between the two collision systems, nor a dependence on the event multiplicity, is observed. These studies will further benefit from the increased precision that will be achieved with the substantially larger data samples from ongoing LHC runs.

Innovation on show for future ep/eA colliders

Following the publication of an updated conceptual design report in 2021, CERN continues to support studies for the proposed electron–hadron colliders LHeC and FCC-eh as potential options for the future, and to provide input to the next update of the European strategy for particle physics, with emphasis on FCC. LHeC would require the LHC to be modified, while FCC-eh is a possible operational mode of the proposed Future Circular Collider at CERN. A key factor in studies for a possible future “ep/eA” collider is power consumption, for which researchers around the world are exploring the use of energy recovery linacs at the high-energy frontier.

The ep/eA programme finds itself at a crossroads between nuclear and particle physics, with synergies with astroparticle physics. It has the potential to empower the High-Luminosity LHC (HL-LHC) physics programme in a unique way, and allows for a deeper exploration of the electroweak and strong sectors of the Standard Model beyond what can be achieved with proton–proton collisions alone. In many cases, adding LHeC to HL-LHC data can significantly improve the precision of Higgs-boson measurements – similar to the improvements expected when moving from the LHC to HL-LHC.

The innovative spirit of the ep/eA community was demonstrated during the workshop “Electrons for the LHC – LHeC/FCCeh and PERLE” held at IJCLab from 26 to 28 October. As the ep/eA community moves from the former HERA facility and from the Electron-Ion Collider, currently under construction at Brookhaven, to higher energies at LHeC and FCC-eh, the energy threshold for studying electroweak, top and Higgs physics in deep-inelastic-scattering (DIS) processes will be crossed for the first time. In addition, these programmes enable the exploration of the low-Bjorken-x frontier orders of magnitude beyond current DIS results. At this stage, it is unclear what physics will be unlocked if hadronic matter is broken into even smaller pieces. In recent years, particle physicists have learned that the ultimate precision for Higgs-boson physics lies in the complementarity of e⁺e⁻, pp and ep collisions, as embedded in the FCC programme, for example. Exploiting this complementarity is key to exploring new territories via the Higgs and dark sectors, as well as zooming in on potential anomalies in our data.

The October workshop underlined the advantage of a joint ep/pp/eA/AA/pA interaction experiment, and the need to further document its added scientific value. For example, a precision of 1 MeV on the W-boson mass could be within reach. In short, the ep data allows constraints to be placed on the most important systematic uncertainty when measuring the W-boson mass with pp data.

Reduced power 

Participants also addressed how to reduce the power consumption of LHeC and FCC-eh. PERLE, an expanding international collaboration revolving around a multi-turn demonstrator facility being pursued at IJCLab in Orsay for energy recovery linacs (ERLs) at high beam currents, is ready to become Europe’s leading centre for developing and testing sustainable accelerating systems.  


As demonstrated at the workshop, with additional R&D on ERLs and ep colliders we might be able to further reduce the power consumption of the LHeC (and FCC-eh) to as low as 50 MW. These values are to be compared with the GW power consumption if there were no energy recovery, and therefore provide a power-economic avenue to extend the Higgs precision frontier beyond the HL-LHC. ERLs are not uniquely applicable to eA colliders, but have been discussed for future linear and circular e⁺e⁻ colliders too. With PERLE and other sustainable accelerating systems, the ep/eA programme has the ambition to deliver a demonstration of ERL technology at high beam current, potentially towards options for an ERL-based Higgs factory.

Workshop participants are committed to further developing an ep/eA programme able to significantly enrich this overall strategy, with a view to finding cracks in the Standard Model and/or new phenomena that further our understanding of nature at the smallest and largest scales.

Lost in the landscape

What is string theory?

I take a view that a lot of my colleagues will not be too happy with. String theory is a very precise mathematical structure, so precise that many mathematicians have won Fields medals by making contributions that were string-theory motivated. It’s supersymmetric. It exists in flat or anti-de Sitter space (that is, a space–time with a negative curvature in the absence of matter or energy). And although we may not understand it fully at present, there does appear to be an exact mathematical structure there. I call that string theory with a capital “S”, and I can tell you with 100% confidence that we don’t live in that world. And then there’s string theory with a small “s” – you might call it string-inspired theory, or think of it as expanding the boundaries of this very precise theory in ways that we don’t know how to at present. We don’t know with any precision how to expand the boundaries into non-supersymmetric string theory or de Sitter space, for example, so we make guesses. The string landscape is one such guess. It’s not based on absolutely precise capital-S string theory, but on some conjectures about what this expanded small-s string theory might be. I guess my prejudice is that some expanded version of string theory is probably the right theory to describe particle physics. But it’s an expanded version, it’s not supersymmetric. Everything we do in anti-de-Sitter-space string theory is based on the assumption of absolute perfect supersymmetry. Without that, the models we investigate are rather speculative. 

How has the lack of supersymmetric discoveries at the LHC impacted your thinking?

All of the string theories we know about with any precision are exactly supersymmetric. So if supersymmetry is broken at the weak scale or beyond, it doesn’t help because we’re still facing a world that is not exactly supersymmetric. This only gets worse as we find out that supersymmetry doesn’t seem to even govern the world at the weak scale. It doesn’t even seem to govern it at the TeV scale. But that, I think, is secondary. The primary fact is that the world is not exactly supersymmetric and string theory with a capital S is. So where are we? Who knows! But it’s exciting to be in a situation where there is confusion. Anything that can be said about how string theory can be precisely expanded beyond the supersymmetric bounds would be very interesting.

What led you to coin the string theory “landscape” in 2003? 

A variety of things, among them the work of other people, in particular Polchinski and Bousso, who conjectured that string theories have a huge number of solutions and possible behaviours. This was a consequence, later articulated in a 2003 paper abbreviated “KKLT” after its authors, of the innumerable (initial estimates put it at more than 10⁵⁰⁰) different ways the additional dimensions of string theory can be hidden or “compactified”. Each solution has different properties, coupling constants, particle spectra and so forth. And they describe different kinds of universes. This was something of a shock and a surprise; not that string theory has many solutions, but that the numbers of these possibilities could be so enormous, and that among those possibilities were worlds with parameters, in particular the cosmological constant, which formed a discretuum as opposed to a continuum. From one point of view that’s troubling because some of us, me less than others, had hoped there was some kind of uniqueness to the solutions of string theory. Maybe there was a small number of solutions and among them we would find the world that we live in, but instead we found this huge number of possibilities in which almost anything could be found. On the other hand, we knew that the parameters of our world are unusual, exceptional, fine-tuned – not generic, but very special. And if the string landscape could say that there would be solutions containing the peculiar numbers that we face in physics, that was interesting. Another motivation came from cosmology: we knew on the basis of cosmic-microwave-background experiments and other things that the portion of the universe we see is very flat, implying that it is only a small part of the total. Together with the peculiar fine-tunings of the numbers in physics, it all fitted a pattern: the spectrum of possibilities would not only be large, but the spectrum of things we could find in the much bigger universe that would be implied by inflation and the flatness of the universe might just include all of these various possibilities.

So that’s how anthropic reasoning entered the picture?

All this fits together well with the anthropic principle – the idea that the patterns of coupling constants and particle spectra were conditioned on our own existence. Weinberg was very influential in putting forward the idea that the anthropic principle might explain a lot of things. But at that time, and probably still now, many people hated the idea. It’s a speculation or conjecture that the world works this way. The one thing I learned over the course of my career is not to underestimate the potential for surprises. Surprises will happen, patterns that look like they fit together so nicely turn out to be just an illusion. This could happen here, but at the moment I would say the best explanation for the patterns we see in cosmology and particle physics is a very diverse landscape of possibilities and an extremely large universe – a multiverse, if you like – that somehow manifests all of these possibilities in different places. Is it possible that it’s wrong? Oh yes! We might just discover that this very logical, compelling set of arguments is not technically right and we have to go in some other direction. Witten, who had negative thoughts about the anthropic idea, eventually gave up and accepted that it seems to be the best possibility. And I think that’s probably true for a lot of other people. But it can’t have the ultimate influence that a real theory with quantitative predictions can have. At present it’s a set of ideas that fit together and are somewhat compelling, but unfortunately nobody really knows how to use this in a technical way to be able to precisely confirm it. That hasn’t changed in 20 years. In the meantime, theoretical physicists have gone off in the important direction of quantum gravity and holography. 

Possible string-theory solutions

What do you mean by holography in the string-theory context?

Holography predates the idea of the landscape. It was based on Bekenstein’s observation that the entropy of a black hole is proportional to the area of the horizon and not the volume of the black hole. It conjectures that the 3D world of ordinary experience is an image of reality coded on a distant 2D surface. A few years after the holographic principle was first conjectured, two precise versions of it were discovered; so called M(atrix) theory in 1996 and Maldacena’s “AdS/CFT” correspondence in 1997. The latter has been especially informative. It holds that there is a holographic duality between anti-de Sitter space formulated in terms of string theory, and quantum field theories that are similar to those that describe elementary particles. I don’t think string theory and holography are inconsistent with each other. String theory is a quantum theory that contains gravity, and all quantum mechanical gravity theories have to be holographic. String theory and holographic theory could well be the same thing. 


One of the things that troubles me about the standard model of cosmology, with inflation and a positive cosmological constant, is that the world, or at least the portion of it that we see, is de Sitter space. We do not have a good quantum understanding of de Sitter space. If we ultimately learn that de Sitter space is impossible, that would be very interesting. We are in a situation now that is similar to 20 years ago, where very little progress has been made in the quantum foundations of cosmology and in particular in the so-called measurement problem, where we don’t know how to use these ideas quantitatively to make predictions. 

What does the measurement problem have to do with it? 

The usual methodology of physics, in particular quantum mechanics, is to imagine systems that are outside the systems we are studying. We call these systems observers, apparatuses or measuring devices, and we sort of divide the world into those measuring devices and the things we’re interested in. But it’s quite clear that in the world of cosmology/de Sitter space/eternal inflation, we’re all part of the same thing. And I think that’s partly why we are having trouble understanding the quantum mechanics of these things. In AdS/CFT, it’s perfectly logical to think about observers outside the system or observers on the boundary. But in de Sitter space there is no boundary; there’s only everything that’s inside the de Sitter space. And we don’t really understand the foundations or the methodology of how to think about a quantum world from the inside. What we’re really lacking is the kind of precise examples we have in the context of anti-de Sitter space, which we can analyse. This is something I’ve been looking for, as have many others including Witten, without much success. So that’s the downside: we don’t know very much.

What about the upsides? 

The upside is that almost anything we learn will be a large fraction of what we know. So there’s potential for great developments by simply understanding a few things about the quantum mechanics of de Sitter space. When I talk about this to some of my young friends, they say that de Sitter space is too hard. They are afraid of it. People have been burned over the years by trying to understand inflation, eternal inflation, de Sitter space, etc, so it’s much safer to work on anti-de Sitter space. My answer to that is: yes, you’re right, but it’s also true that a huge amount is known about anti-de Sitter space and it’s hard to find new things that haven’t been said before, whereas in de Sitter space the opposite is true. We will see, or at least the young people will see. I am getting to the point where it is hard to absorb new ideas.

To what extent can the “swampland” programme constrain the landscape?

The swampland is a good idea. It’s the idea that you can write down all sorts of naive semi-classical theories with practically infinite options, but that the consistency with quantum mechanics constrains the things that are possible, and those that violate the constraints are called the swampland. For example, the idea that there can’t be exact global symmetries in a quantum theory of gravity, so any theory you write down that has gravity and has a global symmetry in it, without having a corresponding gauge symmetry, will be in the swampland. The weak-gravity conjecture, which enables you to say something about the relative strengths of gauge forces and gravity acting on certain particles, is another good idea. It’s good to try to separate those things you can write down from a semi-classical point of view and those that are constrained by whatever the principles of quantum gravity are. The detailed example of the cosmological constant I am much less impressed by. The argument seems to be: let’s put a constraint on parameters in cosmology so that we can put de Sitter space in the swampland. But the world looks very much like de Sitter space, so I don’t understand the argument and I suspect people are wrong here.

What have been the most important and/or surprising physics results in your career?

I had one big negative surprise, as did much of the community. This was a while ago when the idea of “technicolour” – a dynamical way to break electroweak symmetry via new gauge interactions – turned out to be wrong. Everybody I knew was absolutely convinced that technicolour was right, and it wasn’t. I was surprised and shocked. As for positive surprises, I think it’s the whole collection of ideas called “it from qubit”. This has shown us that quantum mechanics and gravity are much more closely entangled with each other than we ever thought, and that the apparent difficulty in unifying them was because they were already unified; so to separate and then try to put them back together using the quantisation technique was wrong. Quantum mechanics and gravity are so closely related that in some sense they’re almost the same thing. I think that’s the message from the past 20 – and in particular the past 10 – years of it–from-qubit physics, which has largely been dominated by people like Maldacena and a whole group of younger physicists. This intimate connection between entanglement and spatial structure – the whole holographic and “ER equals EPR” ideas – is very bold. It has given people the ability to understand Hawking radiation, among other things, which I find extremely exciting. But as I said, and this is not always stated, in order to have real confidence in the results, it all ultimately rests on the assumption of theories that have exact supersymmetry. 

What are the near-term prospects to empirically test these ideas?

One extremely interesting idea is “quantum gravity in the lab” – the idea that it is possible to construct systems, for example a large sphere of material engineered to support surface excitations that look like conformal field theory, and then to see if that system describes a bulk world with gravity. There are already signs that this is true. For example, the recent claim, involving Google, that two entangled quantum computers have been used to send information through the analogue of a wormhole shows how the methods of gravity can influence the way quantum communication is viewed. It’s a sign that quantum mechanics and gravity are not so different.

Do you have a view about which collider should follow the LHC? 

You know, I haven’t done real particle physics for a long time. Colliders fall into two categories: high-precision e+e colliders and high-energy proton–proton ones. So the question is: do we need a precision Higgs factory at the TeV scale or do we want to search for new phenomena at higher energies? My prejudice is the latter. I’ve always been a “slam ‘em together and see what comes out” sort of physicist. Analysing high-precision data is always more clouded. But I sure wouldn’t like anyone to take my advice on this too seriously.

τ-lepton polarisation measured in Z-boson decays

CMS figure 1

Precision electroweak measurements are a powerful way to probe new physics, through the indirect effects predicted by quantum field theory. The effective electroweak mixing angle θWeff is particularly sensitive to new phenomena related to electroweak symmetry breaking and the Brout–Englert–Higgs mechanism. It was measured at LEP in different processes, and at the LHC, thanks to the large number of collected events with Z-boson decays, the experiments can probe these effects with comparable sensitivity.

The CMS collaboration has reported a new measurement of the tau-lepton polarisation in the decay of Z bosons to a pair of tau leptons in proton–proton collisions at 13 TeV. The polarisation is defined as the asymmetry between the cross sections for the production of τ with positive and negative helicities, and is directly related to the electroweak mixing angle via the relation Pτ ≈ –2(1 – 4sin²θWeff). The polarisation of the tau lepton is determined from the angular distributions of the visible tau decay products, leptonic or hadronic, with respect to the τ flight direction or relative to each other. A so-called optimal polarisation observable is constructed using all of these angular properties of the tau decay products. Since the spin states in Z⁰ → τ⁻τ⁺ are almost 100% anti-correlated, the sensitivity is improved by combining the spin observables of both τ leptons of the pair.
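
As a quick numerical check of the quoted approximation, plugging the sin²θWeff value reported below (0.2319) into Pτ ≈ –2(1 – 4sin²θWeff) reproduces the measured polarisation; this is plain arithmetic on the stated relation, not the CMS template-fit extraction.

```python
# Check of P_tau ≈ -2(1 - 4 sin^2(theta_W_eff)) using sin^2(theta_W_eff) = 0.2319
# as reported later in the article; arithmetic only, not the CMS analysis.
def tau_polarisation(sin2_theta_w_eff):
    return -2.0 * (1.0 - 4.0 * sin2_theta_w_eff)

print(tau_polarisation(0.2319))   # ≈ -0.145, consistent with P_tau(Z0) = -0.144
```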

CMS figure 2

The average polarisation Pτ is obtained by a template fit to the observed optimal τ-polarisation observables, using tau-lepton pairs with an invariant mass in the range 75–120 GeV. As summarised in figure 1, the best sensitivity to Pτ is found in the channel where one tau decays to a muon and the other decays hadronically, thanks to the good selection efficiency and reconstruction of the spin observable in this channel. The fully hadronic final state suffers from higher trigger thresholds, which lead to fewer events and distortions of the templates.

The average τ polarisation is corrected to the value at the Z pole, Pτ(Z⁰) = –0.144 ± 0.006 (stat.) ± 0.014 (syst.), where the systematic uncertainty is dominated by the incorrect identification of the products of hadronically decaying tau leptons. The effective weak mixing angle is then determined as sin²θWeff = 0.2319 ± 0.0019, in agreement with the Standard Model (SM) prediction. Figure 2 compares the tau-lepton asymmetry parameter (the negative of the polarisation) with results from previous experiments, demonstrating that the CMS measurement is nearly as precise as those of single LEP experiments.

This measurement shows that LHC collision events, although much more complex than those collected at LEP, can provide precise determinations of the polarisation of the τ lepton, as well as of spin correlations between τ-lepton pairs. Such measurements are crucial to probe the CP properties of the Higgs boson’s Yukawa coupling to τ leptons, an important step on the path to understanding the Higgs sector of the SM.
