
Hypertriton lifetime puzzle nears resolution

Fig. 1.

Hypernuclei are bound states of nucleons and hyperons. Studying their properties is one of the best ways to investigate hyperon–nucleon interactions, and offers insights into the high-density inner cores of neutron stars, whose extreme densities favour the creation of such exotic nuclear states. Constraining such astrophysical models requires detailed knowledge of hyperon–nucleon and three-body hyperon–nucleon–nucleon interactions. The strengths of these interactions can be determined in collider experiments by precisely measuring the lifetimes of hypernuclei.

Hypernuclei are produced in significant quantities in heavy-ion collisions at LHC energies. The lightest, the hypertriton (³ΛH), is a bound state of a proton, a neutron and a Λ hyperon. With a Λ-separation energy of only ~130 keV, the average distance between the Λ and the deuteron core is 10.6 fm. This relatively large separation implies only a small perturbation to the Λ wavefunction inside the hypernucleus, and therefore a hypertriton lifetime close to that of a free Λ, 263.2 ± 2.0 ps. Most calculations predict the hypertriton lifetime to lie in the range 213 to 256 ps.

The measured lifetimes were systematically below theoretical predictions

The first measurements of the hypertriton lifetime were performed in the 1960s and 1970s with imaging techniques such as photographic emulsions and bubble chambers, and were based on very small event samples, leading to large statistical uncertainties. In the last decade, however, measurements have been performed using the larger data samples of heavy-ion collisions. Though compatible with theory, the measured lifetimes were systematically below theoretical predictions: thus the so-called “lifetime puzzle”.

The ALICE collaboration has recently reported a new measurement of the hypertriton lifetime using Pb–Pb collisions at √sNN = 5.02 TeV, collected in 2015. The lifetime of the (anti-)hypertriton is determined by reconstructing the two-body decay channel with a charged pion, namely ³ΛH → ³He + π⁻ (³Λ̅H̅ → ³H̅e + π⁺). The branching ratio of this decay channel, taken from theoretical calculations, is 25%. The measured lifetime is 242 +34/−38 (stat) ± 17 (syst) ps. This result has better statistical precision and a smaller systematic uncertainty than previous measurements, making it the most precise measurement to date. It is also in agreement with both theoretical predictions and the free-Λ lifetime, even within the statistical uncertainty alone. Combining this ALICE result with previous measurements gives a weighted average of 206 +15/−13 ps (figure 1).
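To see how such a combination works, the short sketch below forms an inverse-variance weighted average after symmetrising each result's uncertainties. The inputs are placeholders loosely patterned on published hypertriton numbers, not the exact set entering the average in figure 1, and a full combination would treat asymmetric and correlated uncertainties more carefully:

```python
import numpy as np

# Placeholder inputs: (lifetime [ps], symmetrised total uncertainty [ps]).
# The first entry mimics the ALICE result quoted above, with its asymmetric
# statistical error crudely symmetrised and added in quadrature with the
# systematic one; the others are hypothetical earlier measurements.
measurements = [
    (242.0, np.hypot(0.5 * (34 + 38), 17.0)),
    (183.0, 32.0),   # hypothetical earlier measurement
    (181.0, 44.0),   # hypothetical earlier measurement
]

values = np.array([v for v, _ in measurements])
errors = np.array([e for _, e in measurements])

weights = 1.0 / errors**2                  # inverse-variance weights
mean = np.sum(weights * values) / np.sum(weights)
sigma = 1.0 / np.sqrt(np.sum(weights))     # uncertainty of the weighted mean

print(f"weighted average: {mean:.0f} +/- {sigma:.0f} ps")
```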

This result represents an important step towards solving the long-standing hypertriton lifetime puzzle, since it is the first measurement based on a large data sample that is close to theoretical expectations. Larger and more precise data sets are expected during LHC Runs 3 and 4, following the ongoing major upgrade of ALICE. This will allow a significant improvement in the quality of the present lifetime measurement, as well as a high-precision determination of the Λ binding energy. The combination of these two measurements has the potential to constrain the branching ratio for this decay, which cannot be determined directly without access to the neutral and non-mesonic decay channels. This will be a crucial step towards establishing whether the theoretical description of the hypertriton, now partially confirmed, is correct and the puzzle finally resolved.

Flavour heavyweights converge on Ljubljana

The international conference devoted to b-hadron physics at frontier machines, Beauty 2019, was held in Ljubljana, Slovenia, from 30 September to 4 October. The aims of the conference series are to review the latest results in heavy-flavour physics and to discuss future directions. This year’s edition, the 18th in the series, attracted around 80 scientists and featured 65 invited talks, 13 of them theory-based.

The study of hadrons containing beauty quarks, and other heavy flavours, offers a powerful way to probe for physics beyond the Standard Model, as highlighted in the inspiring opening talk by Chris Quigg (Fermilab). In the last few years much attention has been focused on b-physics results that do not show perfect agreement with the predictions of the theory. In particular, studies by Belle, BaBar and LHCb of the processes B⁺→K⁺ℓ⁺ℓ⁻ and B⁰→K*⁰ℓ⁺ℓ⁻ (where ℓ indicates a lepton) in specific kinematic regions have yielded different decay rates for muon pairs and electron pairs, apparently violating lepton universality. For both processes the significance of the effect is around 2.5σ. Popular models to explain this and related effects include leptoquarks and new Z′ bosons; however, no firm conclusions can be drawn until more precise measurements are available, which should be the case by the time the next Beauty meeting takes place.

Indications that φs is nonzero are starting to emerge

The B system is an ideal laboratory for the study of CP violation, and recent results were presented by the LHC experiments for φs – the phase associated with time-dependent measurements of Bs meson decays to CP eigenstates. Indications that φs is nonzero are starting to emerge, which is remarkable given that its magnitude in the Standard Model is less than 0.1 radians. This is great encouragement for Run 3 of the LHC, and beyond.

Heavy-flavour experiments are also well suited to the study of hadron spectroscopy. Many very recent results were shown at the conference, including the discovery of the X(3842), a charmonium resonance above the open-charm threshold, and new excited resonances seen in the Λbππ final state, which help map out the relatively unexplored world of b-baryons. The ATLAS collaboration presented, for the first time, an analysis of Λb→J/ψpK⁻ decays in which a structure is observed that is compatible with the pentaquark states discovered by LHCb in 2015, providing the first confirmation by another experiment of these highly exotic states.

Beyond beauty

The Beauty conference welcomes reports on flavour studies beyond b-physics, and a highlight of the week was the first conference presentation of new results on the measurement of the branching ratio of the ultra-rare decay K⁺→π⁺νν̄, by the NA62 collaboration. The impressive background suppression that the experiment has achieved left the audience in no doubt as to the sensitivity of the result that can be expected when the full data set is accumulated and analysed. Comparing the measurement with the predicted branching fraction of ~10⁻¹⁰ will be a critical test of the Standard Model in the flavour domain.

Flavour physics has a bright future. Several talks presented the first signals and results from the early running of the Belle II experiment, and precise and exciting measurements can be expected by the time the next meeting in the Beauty series takes place. In parallel, studies with increasing sensitivity will continue to emerge from the LHC. The meeting was updated on progress on the LHCb upgrade, which is currently being installed in readiness for Run 3 and will allow an order-of-magnitude increase in b-hadron samples. The conference was summarised by Patrick Koppenburg (Nikhef), who emphasised the enormous potential of b-hadron studies for uncovering signs of new physics beyond the Standard Model.

The next edition of Beauty will take place in Japan, hosted by Kavli IPMU, University of Tokyo, in autumn 2020.

Debut for baryons in flavour puzzle

LHCb has launched a new offensive in the exploration of lepton-flavour universality – the principle that the weak interaction couples to electrons, muons and tau leptons equally. Following previous results that hinted that e⁺e⁻ pairs might be produced at a greater rate than μ⁺μ⁻ pairs in B-meson decays involving the b→sℓ⁺ℓ⁻ transition (ℓ = e, μ), the study brings b-baryon decays to bear on the subject for the first time.

“LHCb certainly deserves to be congratulated on this nontrivial measurement,” said Jure Zupan of the University of Cincinnati, in the US. “It is very important that LHCb is trying to measure the same quark-level transition b→sℓ⁺ℓ⁻ with as many hadronic probes as possible. Though baryon decays are more difficult to interpret, the Standard Model prediction of equal rates is very clean and any significant deviation would mean the discovery of new physics.”

We are living in exciting but somewhat confusing times

Jure Zupan

The current intrigue began in 2014, when LHCb observed the ratio of B⁺→K⁺μ⁺μ⁻ to B⁺→K⁺e⁺e⁻ decays to be 2.6σ below unity – the so-called RK anomaly. The measurement was updated this year to be closer to unity, but with reduced errors the significance of the deviation – either a muon deficit or an electron surplus – remains almost unchanged at 2.5σ. The puzzle deepened in 2017 when LHCb measured the rate of B⁰→K*⁰μ⁺μ⁻ relative to B⁰→K*⁰e⁺e⁻ to be more than 2σ below unity in two adjacent kinematic bins – the RK* anomaly. In the same period, measurements of decays to D mesons by LHCb and the B-factory experiments BaBar and Belle consistently hinted that the b→cℓν̄ transition might occur at a greater rate for tau leptons relative to electrons and muons than expected in the Standard Model.

Baryons enter the fray

Now, in a preprint published on 18 December, the LHCb collaboration reports a measurement of the ratio of branching fractions for the highly suppressed baryonic decays Λb⁰→pK⁻e⁺e⁻ and Λb⁰→pK⁻μ⁺μ⁻ to be R⁻¹pK = 1.17 +0.18/−0.16 (stat) ± 0.07 (syst). Defined as the reciprocal of the ratio reported for the B-meson decays, the measurement is consistent with previous LHCb measurements in that it errs on the side of fewer b→sμ⁺μ⁻ than b→se⁺e⁻ transitions, though with no statistical significance for that hypothesis at the present time. The blind analysis was performed for an invariant mass squared of the lepton pairs ranging from 0.1 to 6.0 GeV² – well below contributions from the resonant decay J/ψ→ℓ⁺ℓ⁻, with observations of the latter used to drive down systematics related to the different experimental treatment of muons and electrons. J/ψ meson decays to μ⁺μ⁻ and e⁺e⁻ pairs are known to respect lepton universality at the 0.4% level.
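As a rough illustration of why no significance is claimed, one can symmetrise the uncertainties and compute the pull of the measured ratio from the lepton-universality expectation of unity; this crude sketch is not the likelihood-based treatment used in the analysis:

```python
import math

# LHCb result quoted above: 1.17 +0.18/-0.16 (stat) +/- 0.07 (syst)
value = 1.17
stat = 0.5 * (0.18 + 0.16)    # crude symmetrisation of the asymmetric error
syst = 0.07
total = math.hypot(stat, syst)

pull = (value - 1.0) / total  # deviation from lepton universality (ratio = 1)
print(f"pull from unity: {pull:.1f} sigma")  # ~0.9 sigma: not significant
```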

“It’s very satisfying to have been able to make this lepton-flavour universality test with baryons – having access to the Run 2 data was key,” said analyst Yasmine Amhis of the Laboratoire de l’Accélérateur Linéaire in Orsay. The analysis, which also constitutes the first observation of the decay Λb⁰→pK⁻e⁺e⁻, exploits an integrated luminosity of 4.7 fb⁻¹ of data collected at 7, 8 and 13 TeV. “LHCb is also working on other tests of the flavour anomalies, such as an angular analysis of B⁰→K*⁰μ⁺μ⁻, and updates of the lepton-flavour universality tests of RK and RK* to the full Run 2 dataset,” continued Amhis. “We’re excited to find out whether the pattern of anomalies stays or fades away.”

We’re excited to find out whether the pattern of anomalies stays or fades away

Yasmine Amhis

An important verification of the B-meson anomalies will be performed by the recently launched Belle II experiment, though it is not expected to weigh in on Λb⁰ decays, says Zupan. “I think it is fair to say that it is only after both Belle II and LHCb are able to confirm the anomalies that new physics will be established,” he says. “Right now, we are living in exciting but somewhat confusing times: is the neutral-current b→sℓ⁺ℓ⁻ anomaly real? Is the charged-current b→cℓν̄ anomaly real? Are they connected? Only time will tell.”

Zooming in on top quarks

Fig. 1.

As the heaviest known particle, the top quark plays a unique role in the Standard Model (SM), making its presence felt in corrections to the masses of the W and Higgs bosons, and also, perhaps, in as-yet unseen physics beyond the SM. During Run 2 of the Large Hadron Collider (LHC), high-luminosity proton beams were collided at a centre-of-mass energy of 13 TeV. This allowed ATLAS to record and study an unprecedented number of collisions producing top–antitop pairs, providing ATLAS physicists with a unique opportunity to gain insights into the top quark’s properties.

ATLAS has measured the top–antitop production cross-section using events where one top quark decays to an electron, a neutrino and a bottom quark, and the other to a muon, a neutrino and a bottom quark. The striking eμ signature gives a clean and almost background-free sample, leading to a result with an uncertainty of only 2.4%, which is the most precise top-quark pair-production measurement to date. The measurement provides information on the top quark’s mass, and can be used to improve our knowledge of the parton distribution functions describing the internal structure of the proton. The kinematic distributions of the leptons produced in top-quark decays have also been precisely measured, providing a benchmark to test programs that model top-quark production and decay at the LHC (figure 1).

Fig. 2.

The mass of the top quark is a fundamental parameter of the SM, which impacts precision calculations of certain quantum corrections. It can be measured kinematically through the reconstruction of the top quark’s decay products. The top quark decays via the weak interaction as a free particle, but the resulting bottom quark interacts with other particles produced in the collision and eventually emerges as a collimated “b-jet” of hadrons. Modelling this process and calibrating the jet measurement in the detector limit the precision of many top-quark mass measurements. However, 20% of b-jets contain a muon that carries information relating to the parent bottom quark. By combining this muon with an isolated lepton from a W boson originating from the same top-quark decay, ATLAS has made a new measurement of the top-quark mass with a much-reduced dependence on jet modelling and calibration. The result is ATLAS’s most precise individual top-quark mass measurement to date: 174.48 ± 0.78 GeV.

Higher-order QCD diagrams translate this imbalance into the charge asymmetry

At the LHC, top and antitop quarks are not produced fully symmetrically with respect to the proton-beam direction: top antiquarks are produced slightly more often at large angles to the beam, while top quarks, which receive more momentum from the colliding proton, emerge closer to the axis. Higher-order QCD diagrams translate this imbalance into the so-called charge asymmetry (defined below), which the SM predicts to be small (~0.6%), but which could be enhanced, or even suppressed, by new physics processes interfering with the known production modes. Using its full Run-2 data sample, ATLAS finds evidence of charge asymmetry in top-quark pair events with a significance of four standard deviations, confidently showing that the asymmetry is indeed non-zero. The measured charge asymmetry of 0.0060 ± 0.0015 is compatible with the latest SM predictions. ATLAS also measured the charge asymmetry versus the mass of the top–antitop system, further probing the SM (figure 2).
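For reference, the charge asymmetry quoted here is conventionally defined from the difference between the absolute rapidities of the top quark and antiquark in each event:

```latex
A_C = \frac{N(\Delta|y| > 0) - N(\Delta|y| < 0)}
           {N(\Delta|y| > 0) + N(\Delta|y| < 0)},
\qquad \Delta|y| = |y_t| - |y_{\bar{t}}|
```

A positive A_C thus means that top quarks end up at larger absolute rapidities (closer to the beam axis) more often than top antiquarks, exactly the imbalance described above.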

ALICE probes extreme electromagnetic fields

When two lead nuclei collide in the LHC at an energy of a few TeV per nucleon, an extremely strong magnetic field, of the order of 10¹⁴–10¹⁵ T, is generated by the spectator protons – those that pass by the collision zone without breaking apart in inelastic collisions. The strongest yet probed by scientists, this magnetic field, and in particular the rate at which it decays, is interesting to study since it probes unexplored properties of the quark–gluon plasma (QGP), such as its electric conductivity. In addition, chiral phenomena such as the chiral magnetic effect are expected to be induced by the strong fields. The left–right asymmetry in the production of negatively and positively charged particles relative to the collision reaction plane is one of the observables directly sensitive to electromagnetic fields. This asymmetry, called directed flow (v1), is sensitive to two main competing effects: the Lorentz force experienced by charged particles (quarks) propagating in the magnetic field, and the Faraday effect – the quark current induced by the rapidly decreasing magnetic field. Charm quarks are produced in the early stages of heavy-ion collisions and are therefore more strongly affected by the electromagnetic fields than lighter quarks.

An extremely strong magnetic field of the order of 10¹⁴–10¹⁵ T is generated

The ALICE collaboration has recently probed this effect by measuring the directed flow, v1, for charged hadrons and D⁰/D̄⁰ mesons as a function of pseudorapidity (η) in mid-central lead–lead collisions at √sNN = 5.02 TeV. Head-on (most central) collisions were excluded from the analyses because in those collisions there are very few spectator nucleons (almost all nucleons interact inelastically), which leads to a weaker magnetic field.

Figure: ALICE measurement of directed flow in extreme electromagnetic fields (charged hadrons, left panels; D⁰/D̄⁰ mesons, right panels).

The top-left panel of the figure shows the η dependence of v1 for charged hadrons (centrality class 5–40%). The difference Δv1 between positively and negatively charged hadrons is shown in the bottom-left panel. The η slope is found to be dΔv1/dη = [1.68 ± 0.49 (stat) ± 0.41 (syst)] × 10⁻⁴ – positive at 2.6σ significance. This measurement has a similar order of magnitude to recent model calculations of the expected effect for charged pions, but with the opposite sign.

The right-hand panels show the same analysis for the neutral charmed mesons D⁰ (cū) and D̄⁰ (c̄u) (centrality class 10–40%). The measured directed flows are found to be about three orders of magnitude larger than for the charged hadrons, reflecting the stronger fields experienced immediately after the collision, when the charm quarks are created. The slopes, positive for D⁰ and negative for D̄⁰, are opposite in sign to, and larger than, those in the model calculations. The slope of the difference in the directed flows is dΔv1/dη = [4.9 ± 1.7 (stat) ± 0.6 (syst)] × 10⁻¹ – positive at 2.7σ significance (lower-right panel). In this case too, the sign of the observed slope is opposite to that of the model calculations, suggesting that the relative contributions of the Lorentz and Faraday effects in those calculations are not correct.
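The quoted slopes come from fitting a linear η dependence to the measured Δv1 points. Below is a minimal sketch of such a weighted least-squares fit, with invented data points purely for illustration (the real analysis fits the published ALICE measurements with their full uncertainties):

```python
import numpy as np

# Hypothetical (eta, delta_v1, uncertainty) points, for illustration only.
eta = np.array([-0.7, -0.4, -0.1, 0.1, 0.4, 0.7])
dv1 = np.array([-1.5e-4, -0.5e-4, -0.2e-4, 0.3e-4, 0.6e-4, 1.3e-4])
err = np.full_like(dv1, 0.5e-4)

# Weighted least-squares fit of dv1 = slope * eta + offset.
w = 1.0 / err**2
A = np.vstack([eta, np.ones_like(eta)]).T
cov = np.linalg.inv(A.T @ (w[:, None] * A))   # covariance of (slope, offset)
slope, offset = cov @ A.T @ (w * dv1)

slope_err = np.sqrt(cov[0, 0])
print(f"dDeltaV1/deta = {slope:.2e} +/- {slope_err:.2e} "
      f"({slope / slope_err:.1f} sigma from zero)")
```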

Together with recent observations at RHIC, these LHC measurements provide an intriguing first sign of the effect of the large magnetic fields experienced in heavy-ion collisions on final-state particles. Measurements with larger data samples in Run 3 will have a precision sufficient to allow the contributions of the Lorentz force and the Faraday effect to be separated.

CMS goes scouting for dark photons

Fig. 1.

One of the best strategies for searching for new physics in the TeV regime is to look for the decays of new particles. The CMS collaboration has searched in the dilepton channel for particles with masses above a few hundred GeV since the start of LHC data taking. Thanks to newly developed triggers, the searches are now being extended to the more difficult lower range of masses. A promising possible addition to the Standard Model (SM) that could exist in this mass range is the dark photon (ZD). Its coupling with SM particles and production rate depend on the value of a kinetic mixing coefficient ε, and the resulting strength of the interaction of the ZD with ordinary matter may be several orders of magnitude weaker than the electroweak interaction.
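In one common convention (shown for illustration; the CMS analysis may normalise differently), ε enters the Lagrangian through a kinetic term that mixes the dark-photon field strength F′μν with the photon field strength Fμν:

```latex
\mathcal{L} \supset -\frac{\epsilon}{2}\, F_{\mu\nu} F'^{\mu\nu}
```

Diagonalising away this term leaves the dark photon coupled to the electromagnetic current with strength proportional to εe, which is why both the production rate and the sensitivity of the search below are expressed in terms of ε².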

The CMS collaboration has recently presented results of a search for a narrow resonance decaying to a pair of muons in the mass range from 11.5 to 200 GeV. This search looks for a strikingly sharp peak on top of a smooth dimuon mass spectrum that arises mainly from the Drell–Yan process. At masses below approximately 40 GeV, conventional triggers are the main limitation for this analysis as the thresholds on the muon transverse momenta (pT), which are applied online to reduce the rate of events saved for offline analysis, introduce a significant kinematic acceptance loss, as evident from the red curve in figure 1.

Fig. 2.

A dedicated set of high-rate dimuon “scouting” triggers, with some additional kinematic constraints on the dimuon system and significantly lower muon pT thresholds, was deployed during Run 2 to overcome this limitation. Only a minimal amount of high-level information from the online reconstruction is stored for the selected events. The reduced event size allows significantly higher trigger rates, up to two orders of magnitude higher than those of the standard muon triggers. The green curve in figure 1 shows the dimuon invariant-mass distribution obtained from data collected with the scouting triggers. The resulting increase in kinematic acceptance at low masses is clearly visible.

The full data sets collected with the muon scouting and standard dimuon triggers during Run 2 are used to probe masses below 45 GeV, and between 45 and 200 GeV, respectively, excluding the mass range from 75 to 110 GeV where Z-boson production dominates. No significant resonant peaks are observed, and limits are set on ε² at 90% confidence as a function of the ZD mass (figure 2). These are among the world’s most stringent constraints on dark photons in this mass range.

Rarest strange decay shrinks from sight

Fig. 1.

For every trillion K⁰S mesons, only five are expected to decay to two muons. Like the better known Bs → μ⁺μ⁻ decay, which was first observed jointly by LHCb and CMS in 2013, the decay rate is very sensitive to possible contributions from yet-to-be-discovered particles that are too heavy to be observed directly at the LHC, such as leptoquarks or supersymmetric partners. These particles could significantly enhance the decay rate, up to existing experimental limits, but could also suppress it via quantum interference with the Standard Model (SM) amplitude.

Despite the unprecedented K⁰S production rate at the LHC, searching for K⁰S → μ⁺μ⁻ is challenging due to the low transverse momentum of the two muons, typically a few hundred MeV/c. Though primarily designed for the study of heavy-flavour particles, LHCb’s unique ability to select low-transverse-momentum muons in real time makes the search feasible. According to SM predictions, just two signal events are expected in the Run-2 data, potentially making this the rarest decay ever recorded.

The analysis uses two machine-learning tools: one to discriminate muons from pions, and another to discriminate signal candidates from the so-called combinatorial background that arises from coincidental decays. Additionally, a detailed, data-driven map of the detector material around the interaction point helps to reduce the “fixed-target” background caused by particles interacting with the detector material. A background of K⁰S → π⁺π⁻ decays dominates the selection and, in the absence of a compelling signal, an upper limit on the branching fraction of 2.1 × 10⁻¹⁰ has been set at 90% confidence. This is approximately four times more stringent than the previous world-best limit, set by LHCb with Run-1 data. This result has implications for physics models with leptoquarks and for some fine-tuned regions of the Minimal Supersymmetric SM.

The upgraded LHCb detector, scheduled to begin operating in 2021 after the present long shutdown of the LHC, will offer excellent opportunities to improve the precision of this search and eventually find a signal. In addition to the increased luminosity, the LHCb upgrade will have a full software trigger, which is expected to significantly improve the signal efficiency for K⁰S → μ⁺μ⁻ and other decays with very soft final-state particles.

MAGIC spots epic gamma-ray burst

Gamma-ray bursts (GRBs) are the brightest electromagnetic events in the universe since the Big Bang. First detected in 1967, GRBs have since been observed about once per day using a range of instruments, allowing astrophysicists to gain a deeper understanding of their origin. As often happens, 14 January 2019 saw the detection of three GRBs. While the first two were not of particular interest, the unprecedented energy of the photons emitted by the third – measured by the MAGIC telescopes – provides new insight into these mysterious phenomena.

The study of GRBs is unique, both because GRBs occur at random locations and times and because each GRB has different time characteristics and energy spectra. GRBs consist of two phases: a prompt phase, lasting from hundreds of milliseconds to hundreds of seconds, which consists of one or several bright bursts of hard X-rays and gamma-rays; followed by a significantly weaker “afterglow” phase, which can be observed at lower energies ranging from radio to X-rays and lasts for up to months.

The recent detection adds yet another messenger: TeV photons

Since the late 1990s, optical observations have confirmed both that GRBs happen in other galaxies and that longer duration GRBs tend to be associated with supernovae, strongly hinting that they result from the death of massive stars. Shorter GRBs, meanwhile, have recently been shown to be the result of neutron-star mergers, thanks to the first joint observation of a GRB with a gravitational-wave event in 2017. While this event is often regarded as the start of multi-messenger astrophysics, the recent detection of GRB 190114C, lying 4.5 billion light-years from Earth, adds yet another messenger to the field of GRB astrophysics: TeV photons.

Figure: Hubble Space Telescope (HST) image of GRB 190114C.

The MAGIC telescopes on the island of La Palma measure the Cherenkov radiation produced when TeV photons induce electromagnetic showers upon interacting with the Earth’s atmosphere. During the past 15 years, MAGIC has discovered a range of astrophysical sources via their emission at these extreme energies. Detecting such emission from GRBs, however, had remained elusive despite more than 100 attempts and theoretical predictions that it could exist.

On 14 January, based on an alert provided by space-based gamma-ray detectors, the MAGIC telescopes started repointing within a few tens of seconds of the onset of the GRB. Within the next half hour, the telescopes had observed around 1000 high-energy photons from the source. This emission, long predicted by theorists, is shown by the collaboration to be the result of the “synchrotron self-Compton” process, whereby high-energy electrons accelerated in the initial violent explosion interact with magnetic fields produced by the collision between the ejecta and interstellar matter. The synchrotron emission from this interaction produces the afterglow observed at X-ray, optical and radio energies. Some of these synchrotron photons, however, subsequently undergo inverse Compton scattering off the same electrons, boosting them to TeV energies. These measurements by MAGIC show for the first time that this mechanism does indeed occur. Given the many past observations in which no such emission was seen, it appears to be yet another feature that differs from burst to burst.

The MAGIC results were published in an issue of Nature that also reported the discovery of similar emission from a different GRB by another Cherenkov telescope: the High Energy Stereoscopic System (H.E.S.S.) in Namibia. While the measurements are consistent, it is interesting to note that the H.E.S.S. measurements were made ten hours after that particular GRB, showing that this type of emission can also occur on much longer time scales. With two new large-scale Cherenkov observatories – the Large High Altitude Air Shower Observatory in China and the global Cherenkov Telescope Array – about to commence data taking, the field of GRB astrophysics can now expect a range of new discoveries.

When twistors met loops

Loop Quantum Gravity and Twistor Theory have a lot in common. They both have quantum gravity as a main objective, they both discard conventional spacetime as the cornerstone of physics, and they have both taken major inspiration from the renowned British mathematician Roger Penrose. Interaction between the two communities has been minimal so far, however, due to their distinct research styles: mathematically oriented in Twistor Theory, but focused on empirical support in Loop Gravity. This separation was addressed in the first week of September at a conference held at the Centre for Mathematical Research (CIRM) on the Luminy campus in Marseille, where about a hundred researchers converged for lively debates designed to encourage cross-fertilisation between the two research lines.

Both Twistor Theory and Loop Gravity regard conventional smooth general-relativistic spacetime as an approximate and emergent notion. Twistor Theory was proposed by Roger Penrose as a general geometric framework for physics, with the long-term aim of unifying general relativity and quantum mechanics. The main idea of the theory is to work with null rays, namely the space of possible paths that light rays can follow in spacetime, instead of the manifold of points of physical spacetime. Spacetime points, or events, are then seen as derived objects: they are given by compact holomorphic curves in a complex three-fold, twistor space. It is remarkable how much the main equations of fundamental physics simplify when formulated in these terms. The mathematics of twistors has roots in the 19th-century Klein correspondence in projective geometry, and modern Twistor Theory has had a strong impact on pure mathematics, from differential geometry and representation theory to gauge theories and integrable systems.

Could allying twistors and loops be dangerous?

Loop Gravity, on the other hand, is a background-independent theory of quantum gravity. That is, it does not treat spacetime as the background on which physics happens, but rather as a dynamical entity itself, satisfying quantum theory. Conventional smooth general-relativistic spacetime emerges in the classical (ℏ→0) limit, in the same manner as a smooth electromagnetic field satisfying the Maxwell equations emerges from the Fock space of photons in the classical limit of quantum electrodynamics. Similarly, the full dynamics of classical general relativity is recovered from the quantum dynamics of Loop Gravity in a suitable limit. The transition amplitudes of the theory are finite in the ultraviolet and are expressed as multiple integrals over non-compact groups. The theory provides a compelling picture of quantum spacetime. A basis of the Hilbert space of the theory is described by the mathematics of spin networks: graphs with links labelled by SU(2) irreducible representations, independently introduced by Roger Penrose in the early 1970s in an attempt to build a fully discrete, combinatorial picture of quantum physical space. Current applications of Loop Gravity include early cosmology, where the possibility of a bounce replacing the Big Bang has been extensively studied using Loop Gravity methods, and black holes, where the theory’s amplitudes can be used to study the non-perturbative transition at the end of Hawking evaporation.

The communities working on Twistors and Loops share technical tools and conceptual pillars, but have evolved independently for many years, with different methods and different intermediate goals. But recent developments discussed at the Marseille conference saw twistors appearing in formulations of the Loop Gravity amplitudes, confirming the fertility and versatility of the twistor idea, and raising intriguing questions about possible deeper relations between the two theories.

The conference was a remarkable success. It is not easy to communicate across research programmes in contemporary fundamental physics, because a good part of the field is stalled in communities blocked by conflicting assumptions, ingrained prejudices and seldom-questioned judgments, making mutual understanding difficult. The vibrant atmosphere of the Marseille conference cut through this.

The best moment came during Roger Penrose’s talk. Towards the end of a long and dense presentation of new ideas for understanding the full space of solutions of Einstein’s theory using twistors, Roger announced, rather dramatically, that he was now going to present a big new idea that might lead to the twistor version of the full Einstein equations – but at that precise moment the slide projector exploded in a cloud of smoke, with sparks flying. We all thought for a moment that a secret power of the Universe, worried about being unmasked, had interfered. Could allying twistors and loops be dangerous?

Gauge–gravity duality opens new horizons

What, in a nutshell, did you uncover in your famous 1997 work, which became the most cited in high-energy physics?

Juan Maldacena

The paper conjectured a relation between certain quantum field theories and gravity theories. The idea was that a strongly coupled quantum system can generate complex quantum states that have an equivalent description in terms of a gravity theory (or a string theory) in a higher-dimensional space. The paper considered special theories that have lots of symmetries, including scale invariance, conformal invariance and supersymmetry, and the fact that those symmetries were present on both sides of the relationship was one of the pieces of evidence for the conjecture. The main argument relating the two descriptions involved objects that appear in string theory called D-branes, which are a type of soliton. Polchinski had previously given a very precise description of the dynamics of D-branes. At low energies a soliton can be described by its centre-of-mass position: if you have N solitons you will have N positions. With D-branes it is the same, except that when they coincide there is a non-Abelian SU(N) gauge symmetry relating these positions. So this low-energy theory resembles the theory of quantum chromodynamics, except with N colours and special matter content.

On the other hand, these D-brane solitons also have a gravitational description, found earlier by Horowitz and Strominger, in which they look like “black branes” – objects similar to black holes but extended along certain spatial directions. The conjecture was simply that these two descriptions should be equivalent. The gravitational description becomes simple when N and the effective coupling are very large.

Did you stumble across the duality, or had you set out to find it?

It was based on previous work on the connection between D-branes and black holes. The first major result in this direction was the computation of Strominger and Vafa, who considered an extremal black hole and compared it to a collection of D-branes. By computing the number of states into which these D-branes can be arranged, they found that it matched the Bekenstein–Hawking black-hole entropy given in terms of the area of the horizon. Such black holes have zero temperature. By slightly exciting these black holes some of us were attempting to extend such results to non-zero temperatures, which allowed us to probe the dynamics of those nearly extremal black holes. Some computations gave similar answers, sometimes exactly, sometimes up to coefficients. It was clear that there was a deep relation between the two, but it was unclear what the concrete relation was. The gravity–gauge (AdS/CFT) conjecture clarified the relationship.

Are you surprised by its lasting impact?

Yes. At the time I thought that it was going to be interesting for people thinking about quantum gravity and black holes. But the applications that people found to other areas of physics continue to surprise me. It is important for understanding quantum aspects of black holes. It was also useful for understanding very strongly coupled quantum theories. Most of our intuition for quantum field theory is for weakly coupled theories, but interesting new phenomena can arise at strong coupling. These examples of strongly coupled theories can be viewed as useful calculable toy models. The art lies in extracting the right lessons from them. Some of the lessons include possible bounds on transport, a bound on chaos, etc. These applications involved a great deal of ingenuity since one has to extract the right lessons from the examples we have in order to apply them to real-world systems.

What does the gravity–gauge duality tell us about nature, given that it relates two pictures (e.g. involving different dimensionalities of space) that have not yet been shown to correspond to the physical world?

It suggests that the quantum description of spacetime can be in terms of degrees of freedom that are not localised in space. It also says that black holes are consistent with quantum mechanics, when we look at them from the outside. More recently, it was understood that when we try to describe the black-hole interior, then we find surprises. What we encounter in the interior of a black hole seems to depend on what the black hole is entangled with. At first this looks inconsistent with quantum mechanics, since we cannot influence a system through entanglement. But it is not. Standard quantum mechanics applies to the black hole as seen from the outside. But to explore the interior you have to jump in, and you cannot tell the outside observer what you encountered inside.

One of the most interesting recent lessons is the important role that entanglement plays in constructing the geometry of spacetime. This is particularly important for the black-hole interior.

I suspect that with the advent of quantum computers, it will become increasingly possible to simulate these complex quantum systems that have some features similar to gravity. This will likely lead to more surprises.

In what sense does AdS/CFT allow us to discuss the interior of a black hole?

It gives us directly a view of a black hole from the outside, more precisely a view of the black hole from very far away. In principle, from this description we should be able to understand what goes on in the interior. While there has been some progress on understanding some aspects of the interior, a full understanding is still lacking. It is important to understand that there are lots of weird possibilities for black-hole interiors. Those we get from gravitational collapse are relatively simple, but there are solutions, such as the full two-sided Schwarzschild solution, where the interior is shared between two black holes that are very far away. The full Schwarzschild solution can therefore be viewed as two entangled black holes in a particular state called the thermofield double, a suggestion made by Werner Israel in the 1970s. The idea is that by entangling two black holes we can create a geometric connection through their interiors: the black holes can be very far away, but the distance through the interior could be very short. However, the geometry is time-dependent and signals cannot go from one side to the other. The geometry inside is like a collapsing wormhole that closes off before a signal can go through. In fact, this is a necessary condition for the interpretation of these geometries as entangled states, since we cannot send signals using entanglement. Susskind and myself have emphasised this connection via the “ER=EPR” slogan. This says that EPR correlations (or entanglement) should generally give rise to some sort of “geometric” connection, or Einstein–Rosen bridge, between the two systems. The Einstein–Rosen bridge is the geometric connection between two black holes present in the full Schwarzschild solution.
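For reference, the thermofield double referred to here is the standard entangled state built from the energy eigenstates |n⟩ of the two copies of the system at inverse temperature β:

```latex
|\mathrm{TFD}\rangle = \frac{1}{\sqrt{Z(\beta)}} \sum_n e^{-\beta E_n/2}\,
|n\rangle_L \otimes |n\rangle_R
```

Tracing out either side leaves an exactly thermal density matrix, which is why each of the two entangled black holes looks thermal on its own.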

Are there potential implications of this relationship for intergalactic travel?

Gao, Jafferis and Wall have shown that an interesting new feature appears when one brings two entangled black holes close to each other. Now there can be a direct interaction between the two black holes and the thermofield double state can be close to the ground state of the combined system. In this case, the geometry changes and the wormhole becomes traversable.

One can find solutions of the Standard Model plus gravity that look like two microscopic magnetically charged black holes joined by a wormhole

In fact, as shown by Milekhin, Popov and myself, one can find solutions of the Standard Model plus gravity that look like two microscopic magnetically charged black holes joined by a wormhole. We could construct a controllable solution only for small black holes because we needed to approximate the fermions as being massless.

If one wanted a big macroscopic wormhole where a human could travel, then it would be possible with suitable assumptions about the dark sector. We’d need a dark U(1) gauge field and a very large number of massless fermions charged under U(1). In that case, a pair of magnetically charged black holes would enable one to travel between distant places. There is one catch: the time it would take to travel, as seen by somebody who stays outside the system, would be longer than the time it takes light to go between the two mouths of the wormhole. This is good, since we expect that causality should be respected. On the other hand, due to the large warping of the spacetime in the wormhole, the time the traveller experiences could be much shorter. So it seems similar to what would be experienced by an observer that accelerates to a very high velocity and then decelerates. Here, however, the force of gravity within the wormhole is doing the acceleration and deceleration. So, in theory, you can travel with no energy cost.

How does AdS/CFT relate to broader ideas in quantum information theory and holography?

Quantum information has been playing an important role in understanding how holography (or AdS/CFT) works. One important development is a formula, due to Ryu and Takayanagi, for the fine-grained entropy of gravitational systems, such as a black hole. It is well known that the area of the horizon gives the coarse-grained, or thermodynamic, entropy of a black hole. The fine-grained entropy, by contrast, is the actual entropy of the full quantum density matrix describing the system. Surprisingly, this entropy can also be computed in terms of the area of a surface. But it is not the horizon: it is typically a surface that lies in the interior and has minimal area.
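For reference, the Ryu–Takayanagi prescription alluded to here can be written, in units with ℏ = c = 1, as a minimisation over bulk surfaces γ anchored on (and homologous to) the boundary region A:

```latex
S(A) = \min_{\gamma} \frac{\mathrm{Area}(\gamma)}{4 G_N}
```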

If you could pick any experiment to be funded and built, what would it be?

Well, I would build a higher energy collider, of say 100 TeV, to understand better the nature of the Higgs potential and look for hints of new physics. As for smaller scale experiments, I am excited about the current prospects to manipulate quantum matter and create highly entangled states that would have some of the properties that black holes are supposed to have, such as being maximally chaotic and allowing the kind of traversable wormholes described earlier.

How close are we to a unified theory of nature’s interactions?

String theory gives us a framework that can describe all the known interactions. It does not give a unique prediction, and the accommodation of a small cosmological constant is possible thanks to the large number of configurations that the internal dimensions can acquire. This whole framework is based on Kaluza–Klein compactifications of 10D string theories. It is possible that a deeper understanding of quantum gravity for cosmological solutions will give rise to a probability measure on this large set of solutions that will allow us to make more concrete predictions.

Copyright © 2020 by CERN