Moriond’s electroweak delights

Moriond 2024

Packed sessions, more than 100 talks and lively discussions at Rencontres de Moriond electroweak, held from 24 to 31 March in La Thuile, Italy, captured the latest thinking in the field. The Standard Model (SM) emerged intact, while new paths of enquiry were illuminated.

Twelve years after the discovery of the Higgs boson, H, a wide variety of analyses by ATLAS and CMS are bringing the new scalar into sharper focus. This includes its mass, for which CMS has reported the most precise single measurement using the H → ZZ → 4ℓ channel: 125.04 ± 0.11 (stat) ± 0.05 (syst) GeV. A Run 2 legacy mass measurement combining ATLAS and CMS results is under way, while projections for the HL-LHC indicate that an uncertainty at the 10–20 MeV level is attainable. For the H width, which is potentially highly sensitive to new physics but notoriously difficult to measure at a hadron collider, the experiments constrain its value to be less than three times the SM width at 95% confidence level using an indirect method with reasonable assumptions. A precision of about 20% is expected from the full HL-LHC dataset.
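As a quick consistency check of the numbers quoted above, the statistical and systematic components combine in quadrature (assuming, as is standard, that they are independent); a back-of-the-envelope sketch in Python:

```python
import math

# CMS H -> ZZ -> 4l mass measurement quoted above (GeV)
stat, syst = 0.11, 0.05

# Independent uncertainty components add in quadrature
total = math.sqrt(stat**2 + syst**2)
print(f"total uncertainty = {total * 1000:.0f} MeV")  # ~120 MeV
# The HL-LHC projection of 10-20 MeV would be roughly an order of magnitude better.
```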

New generation

The measured H cross sections in all channels continue to support the simplest incarnation of the SM H sector, with a new result from CMS testing the bbH production mode in the ττ and WW channels. Now that the H couplings to the most massive particles are well established, the focus is moving to the second-generation fermions. Directly probing the shape of the Brout–Englert–Higgs potential, and sensitive to new-physics contributions, the H self-coupling is another key target. HH production has yet to be observed at the LHC due to its very low cross section (the combined ATLAS and CMS limit is currently 2.5–3 times the SM value), but an extensive measurement programme utilising multiple channels is under way and Moriond saw new results presented based on HH → bbbb and HH → γγττ decays (see “Homing in on the Higgs self-interaction“).

Searches for exotic H decays, or for additional low-mass scalar bosons as predicted by two-Higgs-doublet extensions to the SM, were a Moriond highlight. A wide range of searches for additional Higgs bosons (a, A) have been released by ATLAS and CMS, including a new search for H → aa → muons by CMS in the mass range 0.2–60 GeV and, on the higher-mass side, new limits on H/A → tt by ATLAS and A → ZH → ℓℓtt by CMS. Although none show significant deviations from the SM, most of the searches are statistically limited and a large amount of phase space remains available for extended H sectors. Generating much conversation in the corridors was a new-physics interpretation of ATLAS and CMS data in terms of a Higgs-triplet model, based on results in the HH → γγ channel and top-quark differential distributions.

The LHC experiments are making stunning progress in precision electroweak measurements, as exemplified by a new measurement by CMS of the effective leptonic electroweak mixing angle sin²θeff = 0.23157 ± 0.00031, the first LHC measurement of the W-boson width by ATLAS, and precise measurements of the W and Z cross sections at 13.6 TeV. ATLAS announced at Moriond the most precise single-experiment test of lepton-flavour universality in comparisons between W-boson decays to muons and electrons. A wide-ranging presentation of electroweak results based on two-photon collisions at the LHC described recent attempts by CMS to extract the anomalous magnetic moment of the tau lepton. And LHCb showcased its capabilities in providing an independent measurement of the W-boson mass and the Z-boson cross section. Participants heard about the increasing relevance of lattice QCD in precision electroweak measurements, for example in determining the running of α and the weak mixing angle. A tension exists between the predictions from lattice QCD and those from more traditional dispersive approaches, with a similar origin to that for the anomalous magnetic moment of the muon.

Following the recent observation of entanglement in top-quark pairs by ATLAS and CMS, a presentation addressing the intriguing ability of colliders to carry out fundamental tests of quantum mechanics generated much discussion. Offering full access to spin information, collider experiments can study quantum correlations, wavefunction collapse and decoherence at unprecedented energies, possibly enabling a Bell measurement at the HL-LHC and the first observation of toponium.

Seeking signals from beyond

Searches for long-lived particles by ATLAS, CMS and LHCb – including the first at LHC Run 3 by CMS – were high on the Moriond agenda. Heavy gauge and scalar bosons, left–right gauge-boson masses and heavy neutral leptons are among the other new-physics scenarios being constrained. Casting the net as wide as possible, the LHC experiments are developing AI anomaly-detection algorithms, while the power of effective field theory (EFT) in parameterising the effect of heavy new particles on LHC measurements continues to grow via a diverse range of analyses. Even at dimension six in the SMEFT, no fewer than 59 Wilson coefficients, each related to different underlying physics processes, need to be measured.

Neutrinoless double-beta decay, which would be an unambiguous sign of new physics, continues to be hunted by a host of experiments

Tensions between theory and experiment remain in some processes involving b → s or b → c quark transitions. Moriond saw much discussion on such processes, including new results from Belle II on the branching ratio of the highly suppressed decay B → Kνν. Participants heard about the need for theory progress, as has been the case recently with impressive calculations of b → sγ. Predictions for b → sμμ – which show a tension with experiment and are independent of the R(K) parameters clocking the relative rates of B → Kμ+μ− and B → Ke+e− – are excellent ways to probe new physics. Concerning b → c transitions, updates on R(D*) from Belle II and on R(D*) and R(D) from LHCb based on the muonic decay of the tau lepton take the world-average tension to 3.17σ. The stability of the SM prediction of R(D*) was also questioned.

New flavours

The flavour sector is awash with new results. LHCb presented fresh analyses exploring mixing and CP violation in the charm sector – a unique gateway to the flavour structure of up-type quarks – while CMS presented a new measurement of CP violation in Bs → J/ψ K+K− decays. In ultra-rare kaon decays, KOTO presented a new upper limit on the branching ratio of K0L → π0νν (< 2 × 10⁻⁹ at 90% confidence level) and projects a sensitivity below 10⁻¹³ with the proposed KOTO II upgrade. NA62 presented a preliminary measurement of the branching ratio of the very rare decay π0 → e+e− ((5.86 ± 0.37) × 10⁻⁸), in agreement with the SM, and results for K+ → π+γγ, the latter offering the first evidence that second-order terms must be included in chiral perturbation theory. Belle and Belle II showed new radiative and electroweak penguin results concerning processes such as B0 → γγ, and BESIII presented a precise measurement of the CKM matrix element Vcs. A sweeping theory perspective on the mysterious flavour structure of the SM introduced participants to “flavour modular symmetries” – a promising new game in town for a potential theory of flavour based on modular forms, which are well known in mathematics and were used in the proof of Fermat’s last theorem.

The final sessions of Moriond electro­weak turned to neutrinos, dark matter and astroparticle physics. KATRIN is soon to release an update on the neutrino mass limit based on six times more data, with an expected sensitivity of mν < 0.5 eV, and is undertaking R&D towards a proposed upgrade (KATRIN++) that would use new technology to push the mass limit down further. The collaboration is also stepping up its search for new physics via high-precision spectroscopy and is working towards an upgrade called TRISTAN that will soon zero in on the sterile-neutrino hypothesis.

Rencontre at Moriond

In Japan, the T2K facility has undergone an extensive renewal period, including the first operation of the upgraded near detector ND280 in August 2023, which increased the acceptance. Designed to explore the neutrino mass ordering and leptonic CP violation, T2K data so far show a slight preference for the “normal” mass ordering while admitting a CP-conserving phase at the level of 2σ. However, a joint analysis between T2K and NOvA, a neutrino-oscillation experiment in the US with a longer baseline and complementary sensitivity, prefers a more degenerate parameter space in which either CP conservation or the inverted ordering is an acceptable solution. The combined data place a strong constraint on Δm²₃₂.

Neutrinoless double-beta decay (NDBD), which would reveal the neutrino to be a Majorana particle and be an unambiguous sign of new physics, continues to be hunted by a host of experiments. LEGEND-200’s first physics data were shown, a step towards the ultimate goal of placing a lower limit on the NDBD half-life of 10²⁸ years for ⁷⁶Ge. Also located at Gran Sasso, CUORE, which has been collecting data since 2019, will operate for one more year before a planned upgrade. In parallel, designs for a next-generation tonne-scale upgrade, CUPID, are being finalised. Neutrino aficionados were also treated to scotogenic three-loop models, in which neutrinos gain a Dirac mass term from radiative corrections, and to the latest results from FASER at the LHC, including the first emulsion-detector measurements of the νe and νμ cross sections at TeV energies, and a search for axion-like particles.

IceCube, which studies resonant disappearance of antineutrinos due to matter effects, showed intriguing results that delve into new-physics territory. Adding sterile neutrinos improves global fits by 7σ, participants heard, but brings inconsistencies too. Generating much interest, the global p-value for the null hypothesis of the sterile neutrino in the muon disappearance channel is 3.1%, in tension with MINOS. The DeepCore upgrade will increase the number of strings in the observatory, while the more significant IceCube-Gen2 upgrade will expand its overall area. A theory overview of the status of sterile neutrinos, taking into account recent results from MiniBooNE, MicroBooNE, PROSPECT, STEREO, GALLEX, SAGE, BEST and others, concluded that experimental evidence for such a fourth neutrino state is fading but not excluded. The so-called reactor anomaly is probably explained by a smaller uranium contribution than previously accounted for, while the upgraded Neutrino-4 experiment will shed light on tensions with PROSPECT and STEREO.

Cosmological constraints

The status of dark photons was also reviewed. Constraints come from many sources, including colliders, astrophysical and cosmological bounds, haloscopes and, most recently, radio telescopes, the James Webb Space Telescope and beam-dump experiments. PandaX-4T, which seeks to constrain WIMP dark matter and NDBD, is about to restart data-taking. LZ, another large liquid-xenon detector, has placed record limits on dark matter based on its first 60 days of data-taking. Results from the first observing run of LIDA, a novel kind of laser-interferometric detector built to observe axion-like particles in the galactic halo, are promising.

No particle-physics conference would be complete without the anomalous magnetic moment of the muon

The latest supersymmetry and dark-matter searches at ATLAS and CMS were also presented, including a new result on R-parity-violating supersymmetry and fresh limits on the chargino mass. BESIII reported on exotic searches for massive dark photons, muon-philic particles, glueballs and the QCD axion. Searches for axion-like particles are multiplying in many shapes and forms; in terms of flavour probes of axions, the strongest bounds come from NA62. Less conventionally, probing ultralight dark matter by searching for oscillatory behaviour in gravitational waves is gaining traction, though recent NANOGrav data show no signs of such a signal.

All eyes on the muon

No contemporary particle-physics conference would be complete without the anomalous magnetic moment of the muon – a powerful quantity that takes into account all known and unknown particles, for which the measured value is in significant tension with the SM prediction. As the Fermilab Muon g-2 experiment continues to improve the experimental precision (currently 0.2 ppm), all eyes are on how the SM calculation is performed – specifically the systematic uncertainty associated with a process called hadronic vacuum polarisation. A huge amount of work is going into understanding this quantity, both in terms of the calculational machinery and underlying data used. When computed using lattice QCD, the tension between experiment and theory is significantly reduced. However, the calculations are so complex that few groups have been able to execute them. That is set to change this year, Moriond participants heard, as new lattice calculations are unblinded ahead of the Lattice 2024 meeting in August, followed by a decision on whether to include such results in the official SM prediction at the seventh plenary workshop of the Muon g-2 Theory Initiative at KEK in September.

Experimentally and theoretically, all tools are being thrown at the SM in an attempt to find an explanation for dark matter, the cosmological baryon asymmetry, neutrino masses and other outstanding mysteries. The many high-quality talks at this year’s Moriond electroweak session, including an impressive batch of flash talks in dedicated young-researcher sessions, covered all aspects of the adventure and set the standard for future analyses. An incredible interplay between astrophysical, cosmological, collider and other experimental measurements is rapidly eating into the available parameter space for new physics. Ten years ago, the Moriond theory-summary speaker remarked “new physics must be around the corner, but we see no corner”. While the same could be said today, physicists have a much clearer view of the road ahead.

High time for holographic cosmology

On the Origin of Time is an intellectually thrilling book and a worthy sequel to Stephen Hawking’s bestsellers. Thomas Hertog, who was a student and collaborator of Hawking, suggests that it may be viewed as the next book the famous scientist would have written if he were still alive. While addressing fundamental questions about the origin of the cosmos, Hertog sprinkles the text with anecdotes from his interactions with Hawking, easing up on the otherwise intense barrage of ideas and concepts. But despite its relaxed and popular style, the book will be most useful for physicists with a basic education in relativity and quantum theory.

Expanding universes

The book starts with an exhaustive journey through the history of cosmology. It reviews the ancient idea of an eternal mathematical universe, passes through the ages of Copernicus and Newton, and then enters the modern era of Einstein’s universe. Hertog thoroughly explores static and expanding universes, Hoyle’s steady-state cosmos, Hartle and Hawking’s no-boundary universe, Guth’s inflationary universe and Linde’s multiverse with eternal inflation. Everything culminates in the proposal for holographic quantum cosmology that the author developed together with the late Hawking.

What makes the book especially interesting is its philosophical reflections on the historical evolution of various underlying scientific paradigms. For example, the ancient Greeks developed the Platonic view that the workings of the world should be governed by eternal mathematical laws. This laid the groundwork for the reductionist worldview that many scientists – especially particle physicists – subscribe to today.

Hertog argues that this way of thinking is flawed, especially when confronted with a Big Bang followed by a burst of inflation. Given the supremely fine-tuned structure of our universe, as is necessitated by the existence of atoms, galaxies and ultimately us, how could the universe “know” back at the time of the Big Bang that this fine-tuned world would emerge after inflation and phase transitions?

On the Origin of Time: Stephen Hawking’s Final Theory

The quest to scientifically understand this apparent intelligent design has led to physical scenarios such as eternal inflation, which produces an infinite collection of pocket universes with their own laws. These ideas blend the anthropic principle – that only a life-friendly universe can be observed – into the narrative of a multiverse.

However, for anthropic reasoning to make sense, one needs to specify what a typical observer would be, observes Hertog, because otherwise the statement is circular. Instead, he argues that one should interpret the history of the universe as an evolutionary process. Not only would physical objects continuously evolve, but also the laws that govern them, thereby building up an enormous chain of frozen accidents analogous to the evolutionary tree of biological species on Earth.

This represents a major paradigm shift as it introduces a retrospective element: one can only understand evolution by looking at it backwards in time. Deterministic and causal explanations apply only at a crude, coarse-grained level, while the precise way that structures and laws play out is governed by accumulated accidents. Essentially the question “how did everything start?” is superseded by the question “how did our universe become as it is today?” This may be seen as adopting a top-down view (into the past) instead of a bottom-up view (from the past).

Hawking criticised traditional cosmology for hiding certain assumptions, in particular the separation of the fundamental laws from initial boundary conditions and from the role of the observer. Instead, one should view the universe, at its most fundamental level, as a quantum superposition of many possible spacetimes, of which the observer is an intrinsic part.

From this Everettian viewpoint, wavefunctions behave like separate branches of reality. A measurement is like a fork in the road, where history divides into different outcomes. This line of thought has significant consequences. The author presents an illuminating analogy with the delayed-choice double-slit experiment, first conceived by John Archibald Wheeler, in which the measurement that determines whether an electron behaves as particle or wave is delayed until after the electron has already passed the slit. This demonstrates that the process of observation introduces a retroactive component which, in a sense, creates the past history of the electron.

The fifth dimension 

Further ingredients are needed to transform this collection of ideas into a concrete proposal, argues Hertog. In short, these are quantum entanglement and holography. Holography has been recognised as a key property of quantum gravity, following Maldacena’s work on quantum black holes. It posits that all the information about the interior of a black hole is encoded at its horizon, which acts like a holographic screen. Inside, a fictitious fifth dimension emerges that plays the role of an energy scale.

A holographic universe would be the polar opposite of a Platonic universe with eternal laws

In Hawking and Hertog’s holographic quantum universe, one considers a Euclidean universe where the role of the holographic screen is played by the surface of our observations. The main idea is that the emergent dimension is time itself! In essence, the observed universe, with all its complexity, is like a holographic screen whose quantum bits encode its past history. Moving from the screen to the interior is equivalent to going back in time, from a highly entangled complex universe to a gradually less structured universe with fading physical laws and less entangled qubits. Eventually no entangled qubits remain. This is the origin of time as well as of the physical laws. Such a holographic universe would be the polar opposite of a Platonic universe with eternal laws.

Could these ideas be tested? Hertog argues that an observable imprint in the spectrum of primordial gravitational waves could be discovered in the future. For now, On the Origin of Time is delightful food for thought.

Muons cooled and accelerated in Japan

In a world first, a research group working at the J-PARC laboratory in Tokai, Japan, has cooled and accelerated a beam of antimatter muons (µ+). Though muon cooling was first demonstrated by the Muon Ionisation Cooling Experiment in the UK in 2020 (CERN Courier March/April 2020 p7), this is the first time that the short-lived cousins of the electron have been accelerated after cooling – an essential step for applications in particle physics.

The cooling method is ingenious – and completely different to ionisation cooling, where muons are focused in absorbers to reduce their transverse momentum. Instead, µ+ are slowed to 0.002% of the speed of light in a thin silica-aerogel target, capturing atomic electrons to form muonium, an atom-like compound of an antimatter muon and an electron. Experimenters then ionise the muonium using a laser to create a near-monochromatic beam that is reaccelerated in radiofrequency (RF) cavities. The work builds on the acceleration of negative muonium ions – an antimatter muon bonded to two electrons – which the team demonstrated in 2017 (CERN Courier July/August 2018 p8).

Though the analysis is still to be finalised, with results due to be published soon, the cooling and acceleration effect is unmistakable. In accelerator physics, cooling is traditionally quantified by a reduction in beam emittance – an otherwise conserved quantity that reflects the volume occupied by the beam in the abstract space of orthogonal displacements and momenta. Estimates indicate a beam cooling effect of more than an order of magnitude, with the beam then accelerated from 25 meV to 100 keV. The main challenge is transmission: at present, one antimatter muon emerges from the RF cavities for every 10 million that impact the aerogel. Muon decay is also a challenge given that the muonium is nearly stationary in the laboratory frame, with time dilation barely extending the muon’s 2.2 μs lifetime. Roughly a third of the µ+ decay before exiting the J-PARC apparatus.
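The quoted decay losses are consistent with simple exponential decay at rest. A rough cross-check (assuming, purely for illustration, that the muons spend the relevant time at γ ≈ 1 before reacceleration):

```python
import math

TAU_MU = 2.2e-6  # muon lifetime at rest, seconds

def surviving_fraction(dwell_time_s: float, gamma: float = 1.0) -> float:
    """Fraction of muons not yet decayed after dwell_time_s.
    gamma ~ 1 for near-stationary muonium, so time dilation barely helps."""
    return math.exp(-dwell_time_s / (gamma * TAU_MU))

# If roughly a third decay before exiting, the implied dwell time is:
dwell = -TAU_MU * math.log(2 / 3)   # ~0.9 microseconds
print(f"implied dwell time: {dwell * 1e6:.2f} us")
print(f"survivors: {surviving_fraction(dwell):.1%}")  # ~66.7%
```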

The first application of this technology will be the muon g-2/EDM experiment at J-PARC, where data taking is due to start in 2028. The experiment will add valuable data points to measurements thought to have exceptional sensitivity to new physics (CERN Courier May/June 2021 p25). In the case of the anomalous magnetic moment (g-2) of the muon, theoretical showdowns later this year may either dissipate or reinforce intriguing hints of beyond-the-Standard-Model physics from the Muon g-2 experiment at Fermilab, potentially adding strong motivation to an independent test.

We are very impressed with the progress of our colleagues at J-PARC and congratulate them on their success

“Although our current focus is the muon g-2/EDM experiment, we are open to any possible applications of this technology in the future,” says spokesperson Tsutomu Mibe of KEK. “We are communicating with experts to understand if our technology is of any use in a muon collider, but note that our method cannot be adapted for negative muons.”

While proposals for a µ+µ+ or µ+e− collider exist, a µ+µ− collider remains the most strongly motivated machine. “Much of the physics interest in e+e− and µ+µ− colliders comes from the annihilations of the initial particles into a photon and/or a Z boson, or a Higgs boson in the case of µ+µ−,” says John Ellis of CERN/KCL. “These possibilities are absent for a µ+e− or µ+µ+ collider, making them less interesting in my opinion.” From an accelerator-physics perspective, it remains to be demonstrated that the technique can deliver the beam intensity needed for an energy-frontier collider – not least while keeping the emittance low.

“We are very impressed with the progress of our colleagues at J-PARC and congratulate them on their success,” says International Muon Collider study leader Daniel Schulte of CERN. “This will profit the development of muon-beam technology and use. We are in contact to understand how we can collaborate.”

The next 10 years in astroparticle theory

Pulsar timing arrays

Astroparticle physics connects the extremely small with the extremely large. At the interface of particle physics, cosmology and astronomy, the field ties particles and interactions to the hot Big Bang cosmological model. This synergy allows us to go far beyond the limitations of terrestrial probes in our quest to understand nature at its most fundamental level. A typical example is neutrino masses, where bounds from cosmological observations of large-scale structure formation far exceed those from terrestrial experiments. Astroparticle theory has accelerated quickly in the past 10 years. And this looks certain to continue in the next 10.

Today, neutrino masses, dark matter and the baryon asymmetry of the universe are the only evidence we have of physics beyond the Standard Model (BSM) of particle physics. Astroparticle theorists study how to extend the theory towards a new Standard Model – and the cosmological consequences of doing so.

New insights

For a long time, work on dark matter focused on TeV-scale models, in parallel with searches at the LHC and in ultra-low-noise detectors. The scope has now broadened to a much larger range of masses and models, from ultralight dark matter and axions to sub-GeV dark matter and WIMPs. Theoretical developments have gone hand-in-hand with new experimental opportunities. In the next 10 years, much larger detectors are planned for WIMP searches aiming towards the neutrino floor. Pioneering experimental efforts, some borrowing techniques from atomic and condensed-matter physics, probe dark matter at much lower masses, providing new insights into what dark matter may be made of.

I strongly welcome efforts to broaden the reach in mass scales to efficiently hunt for any hint of what the new physics BSM may be

Neutrinos provide a complementary window on BSM physics. It is just over 25 years since the discovery of neutrino oscillation provided evidence that neutrinos have mass – a fact that cannot be accounted for in the SM (CERN Courier May/June 2024 p29). But the origin of neutrino masses remains a mystery. In the coming decade, neutrinoless double-beta decay experiments and new large experiments, such as JUNO, DUNE (see “A gold mine for neutrino physics“) and Hyper-Kamiokande, will provide a much clearer picture, determining the mass ordering and potentially discovering the neutrino’s nature and whether it violates CP symmetry. These results may, via leptogenesis, be related to the origin of the matter–antimatter asymmetry of the universe.

Recently, there has been renewed interest in models with scales accessible to current particle-physics experiments. These will exploit the powerful beams and capable detectors of the current and future experimental neutrino programme, and collider-based searches for heavy neutral leptons with MeV-to-TeV masses.

Overall, while the multi-TeV scale should continue to be a key focus for both particle and astroparticle physics experiments, I strongly welcome the theoretical and experimental efforts to broaden the reach in mass scales to efficiently hunt for any hint of what the new physics BSM may be.

Silvia Pascoli

Astroparticle physics also studies the particles that arrive on Earth from all around our universe. They come from extreme astrophysical environments, such as supernovae and active galactic nuclei, where they may be generated and accelerated to the highest energies. Thanks to their detection we can study the processes that fuel these astrophysical objects and gain an insight into their evolution (see “In defiance of cosmic-ray power laws“).

The discovery of gravitational waves (GWs) just a few years ago has shed new light on this field. Together with gamma rays, cosmic rays and the high-energy neutrinos detected at IceCube, the field of multi-messenger astronomy is in full bloom. In the coming years it will get a boost from the results of new, large experiments such as KM3NeT, the Einstein Telescope, LISA and the Cherenkov Telescope Array – as well as from many new theoretical developments, such as advanced particle-theory techniques for GW predictions.

Last year’s results from pulsar timing arrays indicate the presence of a stochastic background of GWs. What is its origin? Is it of astrophysical nature, or does it come from some dramatic event in the early universe, such as a strong first-order phase transition? In the latter case, we would be getting a glimpse of the universe when it was just born, opening up a new perspective on fundamental particles and interactions. Could it be that we have seen a new GeV-scale dark sector at work? It is too early to tell. But this is very exciting.

LHC physicists spill the beans in Boston

Dedicated solely to LHC physics, the LHCP conference is a vital gathering for experts in the field. The 12th edition was no exception, attracting 450 physicists to Northeastern University in Boston from 3 to 7 June. Participants discussed recent results, data taking at a significantly increased instantaneous luminosity in Run 3, and progress on detector upgrades planned for the high-luminosity LHC (HL-LHC).

The study of the Higgs boson remains central to the LHC programme. ATLAS reported a new result on Standard Model (SM) Higgs-boson production with decays to tau leptons, achieving the most precise single-channel measurement of the vector-boson-fusion production mode to date. Determining the production modes of the Higgs boson precisely may shed light on the existence of new physics that would be observed as deviations from the SM predictions.

Beyond single-Higgs production, the di-Higgs (HH) search is one of the most exciting and fundamental topics for LHC physics in the coming years, as it directly probes the Higgs potential (see “Homing in on the Higgs self-interaction”). ATLAS has combined results for HH production in multiple final states, providing the best expected sensitivity to the HH production cross-section and the Higgs-boson self-coupling, and constraining κλ (the Higgs self-coupling relative to its SM value) to the range –1.2 < κλ < 7.2.

The search for beyond-the-SM (BSM) physics to explain the many unresolved questions about our universe is being conducted with innovative ideas and methods. CMS has presented new searches involving signatures with two tau leptons, examining the hypotheses of an excited tau lepton and a heavy neutral spin-1 gauge boson (Z′) produced via Drell–Yan and, for the first time, via vector-boson fusion. These results set stringent constraints on BSM models with enhanced couplings to third-generation fermions.

Other new-physics theoretical models propose additional BSM Higgs bosons. ATLAS presented a search for such particles produced in association with top quarks, setting limits on their cross-section that significantly improve upon previous ATLAS results. Additional BSM Higgs bosons could explain puzzles such as dark matter, neutrino oscillations and the observed matter–antimatter asymmetry in the universe.

The dark side

Some BSM models imply that dark-matter particles could arise as composite mesons or baryons of a new strongly-coupled theory that is an extension of the SM. ATLAS investigated this dark sector through searches for high-multiplicity hadronic final states, providing the first direct collider constraints on this model to complement direct dark-matter-detection experimental results.

CMS has used low-pileup inelastic proton–proton collisions to measure event-shape variables related to the overall distribution of charged particles. These measurements showed the particle distribution to be more isotropic than predicted by theoretical models.

LHCP conference talk

The LHC experiments also presented multiple analyses of proton–lead (p–Pb) and pp collisions, exploring the potential production of quark–gluon plasma (QGP) – a hot and dense phase of deconfined quarks and gluons that filled the early universe and is routinely studied in heavy-ion Pb–Pb collisions at the LHC. Whether it can be created in smaller collision systems remains an open question.

ALICE reported a high-precision measurement of the elliptic flow of anti-helium-3 in QGP using data from the first Pb–Pb run of Run 3. The much larger data sample compared to the previous Run 2 measurement allowed ALICE to distinguish production models for these rarely produced particles for the first time. ALICE also reported the first measurement of an impact-parameter-dependent angular anisotropy in the decay of coherently photo-produced ρ0 mesons in ultra-peripheral Pb–Pb collisions. In these collisions, quantum-interference effects cause a decay asymmetry that is inversely proportional to the impact parameter.

CMS reported its first measurement of the complete set of optimised CP-averaged observables from the process B0 → K*0μ+μ−. These measurements are significant because they could reveal indirect signs of new physics or subtle effects induced by low-energy strong interactions. By matching the current best experimental precision, CMS contributes to the ongoing investigation of this process.

LHCb presented measurements of the local and non-local contributions across the full invariant-mass spectrum of B0 → K*0μ+μ−, tests of lepton-flavour universality in semileptonic b decays, and mixing and CP violation in D0 → Kπ decays.

The future of the field was discussed in a well-attended panel session, which emphasised exploring the full potential of the HL-LHC and engaging younger generations

From a theoretical perspective, progress in precision calculations has exceeded expectations. Many processes are now known to next-to-next-to-leading order or even next-to-next-to-next-to-leading order (N3LO) accuracy. The first parton distribution functions approximating N3LO accuracy have been released and reported at LHCP, and modern parton showers have set new standards in perturbative accuracy.

In addition to these advances, several new ideas and observables are being proposed. Jet substructure, for instance, is becoming a precision science and valuable tool due to its excellent theoretical properties. Effective field theory (EFT) methods are continuously refined and automated, serving as crucial bridges to new theories as many ultraviolet theories share the same EFT operators. Synergies between flavour physics, electroweak effects and high-transverse-momentum processes at colliders are particularly evident within this framework. The use of the LHC as a photon collider showcases the extraordinary versatility of LHC experiments and their synergy with theoretical advancements.

Discovery machine

The HL-LHC upgrade was thoroughly discussed, with several speakers highlighting the importance and uniqueness of its physics programme. This includes fundamental insights into the Higgs potential, vector-boson scattering, and precise measurements of the Higgs boson and other SM parameters. Thanks to the tireless efforts of the four collaborations to improve their detectors’ performance, the LHC already rivals historic lepton colliders for electroweak precision in many channels, despite the cleaner signatures of lepton collisions. The HL-LHC will be capable of providing extraordinarily precise measurements while also serving as a discovery machine for many years to come.

The future of the field was discussed in a well-attended panel session, which emphasised exploring the full potential of the HL-LHC and engaging younger generations. Preserving the unique expertise and knowledge cultivated within the CERN community is imperative. Next year’s LHCP conference will be held at National Taiwan University in Taipei from 5 to 10 June.

Sustainable accelerator project underway

Particle accelerators have become essential instruments for improving our health, the environment, our safety and our high-tech capabilities, as well as for unlocking new, fundamental insights into physics, chemistry and biology – enabling scientific breakthroughs that improve our lives. Accelerating particles to higher energies will always require a large amount of energy. In a society where energy sustainability is critical, keeping energy consumption as low as is reasonably possible is an unavoidable challenge for both research infrastructures (RIs) and industry, which collectively operate more than 40,000 accelerators.

Going green

Based on state-of-the-art technology, the portfolio of current and future accelerator-driven RIs in Europe could develop to consume up to 1% of Germany’s annual electricity demand. With the ambition to maintain the attractiveness and competitiveness of European RIs, and enable Europe’s Green Deal, the Innovate for Sustainable Accelerating Systems (iSAS) project has been approved by Horizon Europe. Its aim is to establish an enhanced collaboration in the field to broaden, expedite and amplify the development and impact of novel energy-saving technologies to accelerate particles.

In general terms, a particle accelerator has a system to create the particles to be accelerated, a system that prepares beams of these particles, an accelerating system that imparts energy to the beams, a magnet system to steer the beams, an experimental facility that uses the particles and, finally, a beam dump. In linear accelerating structures, most of the electrical power taken from the grid to operate the accelerator is used by the accelerating system itself.

The core of an accelerating system is a series of cavities that deliver a high-gradient electric field. For many modern accelerators, these cavities are superconducting and therefore cryogenically cooled to about 2 K. They are driven by radio-frequency (RF) power generators that deliver the field at a specific frequency and thereby transfer energy to the particle beams as they traverse the cavities. These superconducting RF (SRF) systems are the enabling technology for frontier accelerators, but they are energy-intensive devices in which only a fraction of the power drawn from the grid is transmitted to the accelerated particles. In addition, energy is radiated away by recirculating beams and is ultimately dumped and lost. As an example, the European XFEL’s superconducting RF system uses 5–6 MW for 0.1 MW of average beam power, a power conversion of less than 3%.

The objective of iSAS is to innovate those technologies that have been identified as being a common core of SRF accelerating systems and that have the largest leverage for energy savings with a view to minimising the intrinsic energy consumption in all phases of operation. In the landscape of accelerator-driven RIs, solutions are being developed to reuse the waste heat produced, develop energy-efficient magnets and RF power generators, and operate facilities on opportunistic schedules when energy is available on the grid. The iSAS project has a complementary focus on the energy efficiency of the SRF accelerating technologies themselves. This will contribute to the vital transition to sustain the tremendous 20th-century applications of accelerator technology in an energy-conscious 21st century.

Interconnected technologies

Based on a recently established European R&D roadmap for accelerator technology and based on a collaboration between leading European research institutions and industry, several interconnected technologies will be developed, prototyped and tested, each enabling significant energy savings on their own in accelerating particles. The collection of energy-saving technologies will be developed with a portfolio of forthcoming applications in mind, and to explore energy-saving improvements in accelerator-driven RIs. Considering the developments realised, the new technologies will be coherently integrated into the parametric design of a new accelerating system, a linac SRF cryomodule, optimised to achieve high beam-power in accelerators with an energy consumption that is as low as reasonably possible. This new cryomodule design will enable Europe to develop and build future energy-sustainable accelerators and particle colliders.

iSAS has been approved by Horizon Europe to help develop novel energy-saving technologies to accelerate particles

On 15 and 16 April, the iSAS kick-off meeting was held at IJCLab (Orsay, France) with around 100 participants. Each of the working groups enthusiastically presented their R&D plans and, in all cases, concrete work has begun. To save energy in the RF power systems, novel fast-reacting tuners are being developed to compensate rapidly for detuning of the cavity’s frequency caused by mechanical vibrations, together with methods to integrate them into smart digital control systems. To save energy in the cryogenics, and building on the ongoing Horizon Europe I.FAST project, superconducting cavities with thin films of Nb3Sn are being further developed to operate with high performance at 4.2 K instead of 2 K, thereby reducing the grid power needed to run the cryogenic system: a cryogenic plant requires roughly three times less power to remove heat dissipated in a 4.2 K bath than in a 2 K bath. Finally, to save energy carried by the accelerated particle beam itself, the technology of energy-recovery linacs (ERLs) is being improved to operate efficiently with high-current beams by developing novel higher-order-mode dampers that significantly reduce heat loads in the cavities.
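The thermodynamic origin of this saving can be illustrated with the ideal (Carnot) cost of pumping heat from the bath temperature up to room temperature. The sketch below is a simplification: it assumes ideal Carnot performance and 300 K heat rejection, whereas real plants fall further below the Carnot limit at 2 K (sub-atmospheric pumping), which is how the practical factor of about three arises.

```python
T_ROOM = 300.0  # K, heat-rejection temperature

def carnot_watts_per_watt(t_cold: float) -> float:
    """Ideal (Carnot) electrical power needed to remove 1 W of heat at t_cold."""
    return (T_ROOM - t_cold) / t_cold

for t in (2.0, 4.2):
    print(f"{t} K bath: >= {carnot_watts_per_watt(t):.0f} W of grid power per W of heat")

# Ideal ratio is ~2.1; the lower plant efficiency achievable at 2 K pushes
# the practical figure to roughly the factor of three quoted in the text.
print(f"ideal ratio: {carnot_watts_per_watt(2.0) / carnot_watts_per_watt(4.2):.1f}")
```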

iSAS logo

To address the engineering challenges related to the integration of the new energy-saving technologies, an existing ESS cryovessel will be equipped with new cavities and novel dampers, and the resulting linac SRF cryomodule will be tested in operation in the PERLE accelerator at IJCLab (Orsay, France). PERLE is a growing international collaboration to demonstrate the performance of ERLs with high-power beams that would enable applications in future particle colliders. Its first phase is being implemented at IJCLab with the objective to have initial beams in 2028.

The timescale to innovate, prototype and test new accelerator technologies is inherently long, in some cases longer than the typical duration of R&D projects. It is therefore essential to continue to collaborate and enhance the R&D process so that energy-sustainable technologies can be implemented without delay, to avoid hampering the scientific and industrial progress enabled by accelerators. Accordingly, iSAS plans co-development with industrial partners to jointly achieve a technology readiness level that will be sufficient to enter the large-scale production phase of these new technologies.

Empowering industry

While the readiness of several energy-saving technologies will be prepared towards industrialisation with impact on current RIs, iSAS is also a pathfinder for sustainable future SRF particle accelerators and colliders. Through inter- and multidisciplinary research that delivers and combines various technologies, it is the long-term ambition of iSAS to reduce the energy footprint of SRF accelerators in future RIs by half, and even more when the systems are integrated in ERLs. Accordingly, iSAS will help maintain Europe’s leadership for breakthroughs in fundamental sciences and enable high-energy collider technology to go beyond the current frontiers of energy and intensity in an energy-sustainable way. In parallel, the new sustainable technologies will empower and stimulate European industry to conceive a portfolio of new applications and take a leading role in, for example, the semiconductor, particle therapy, security and environmental sectors.

Intrigue in charm hadronisation

ALICE figure 1

Quantum chromodynamics (QCD) is one of the pillars of the Standard Model of particle physics, but much remains to be understood about its emergent behaviours, and theoretical calculations often disagree. A new result from the ALICE collaboration has now added fresh intrigue to interpretations of hadronisation – the process by which quarks and gluons become confined inside colour-neutral groupings such as baryons and mesons.

The production of heavy charm and beauty quarks in proton–proton collisions at the LHC is a rather fast process (~7 × 10⁻²⁴ s) that is amenable to perturbative QCD calculations. On the other hand, the transformation of heavy quarks into hadrons takes substantially longer (~3 × 10⁻²³ s). This separation of time scales has motivated the idea that the hadronisation of heavy quarks is independent of the colliding system and collision energy. However, the production of baryons carrying a heavy quark in proton–proton collisions at the LHC has been found to be enhanced compared to more elementary e+e− collisions. This surprising finding seems to invalidate the concept of universal heavy-quark hadronisation, which is an important basis for calculations of particle production in QCD.

A new dimension

Heavy-flavour baryons carrying charm and strange quarks add a new dimension to these measurements. Such measurements are challenging because production rates are low. Due to the short lifetime of charm baryons (typically a fraction of a picosecond), they are usually observed through the detection of their decay products. The probability that they decay into a particular set of daughter particles, known as the branching ratio (BR), is poorly known for many of the strange-charm baryons, and precise knowledge of it is crucial for interpreting the production measurements of these baryons.

Recently, the ALICE collaboration has measured the production of Ωc0 (css) baryons via the semileptonic decay channel Ωc0 → Ω−e+νe (and its charge-conjugate mode) as a function of transverse momentum (pT) in proton–proton collisions at 13 TeV at midrapidity (|y| < 0.8). The Ωc0 candidates are built by pairing an electron or positron candidate track with an Ω− baryon candidate using a Kalman-filter vertexing algorithm. The Ω− candidates are reconstructed via the cascading decay chain Ω− → ΛK−, followed by the decay Λ → pπ−. The momentum carried away by the undetected neutrino is corrected for using an unfolding technique. Figure 1 shows the invariant-mass distribution of the Ωc0 candidates.

ALICE figure 2

Figure 2 compiles measurements of the decay by CLEO, Belle and now ALICE. In the absence of an absolute BR, results are quoted relative to the BR of Ωc0 → Ω−π+. Combined with the earlier measurement of Ωc0 → Ω−π+, the relative probability of the two decay modes is obtained: BR(Ωc0 → Ω−e+νe)/BR(Ωc0 → Ω−π+) = 1.12 ± 0.22 (stat.) ± 0.27 (syst.). The Belle and CLEO collaborations have measured this ratio to be 1.98 ± 0.13 (stat.) ± 0.08 (syst.) and 2.4 ± 1.1 (stat.) ± 0.2 (syst.), respectively. Model predictions using the light-front approach and light-cone sum rules give 1.1 ± 0.2 and 0.71, respectively. Another approach calculates the decay modes and probabilities of charmed-baryon decays based on SU(3)f flavour symmetry in the quark model, yielding a branching-fraction ratio of 1.35.

The ALICE result is consistent with theory calculations and is 2.3σ lower than the more precise value reported by the Belle collaboration. The present measurement provides constraints on the decay probabilities of the Ωc0 baryon. It demonstrates that such measurements are now possible at the LHC with a precision similar to that achieved at e+e− colliders.
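The quoted 2.3σ can be reproduced from the numbers above by combining the statistical and systematic uncertainties of the ALICE and Belle results in quadrature – a naive check that assumes Gaussian, uncorrelated uncertainties:

```python
import math

# BR(Omega_c0 -> Omega- e+ nu_e) / BR(Omega_c0 -> Omega- pi+), as quoted above
alice, alice_stat, alice_syst = 1.12, 0.22, 0.27
belle, belle_stat, belle_syst = 1.98, 0.13, 0.08

# Combine all four uncertainty components in quadrature
sigma = math.sqrt(alice_stat**2 + alice_syst**2 + belle_stat**2 + belle_syst**2)
print(f"difference: {(belle - alice) / sigma:.1f} sigma")  # ~2.3
```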

With the ongoing Run 3 at the LHC and thanks to the recent upgrades, ALICE is on the way to collecting a data sample that is about a thousand times larger for these types of analyses, which will enable more precise measurements of other decay modes. Thanks to these data, we expect to resolve the question of universal hadronisation in the near future.

Zooming in on leptonic W decays

ATLAS figure 1

In the Standard Model of particle physics, the three charged lepton flavours couple to the electroweak gauge bosons W and Z with the same strength – an idea known as lepton flavour universality (LFU). This implies that differences in the rates of processes involving W or Z bosons together with electrons, muons and tau leptons should arise only from differences in the leptons’ masses. Experimental results agree with LFU at the 0.1–0.2% level in the decays of tau leptons, kaons and pions, but hints of deviations have been seen in B-meson decays, for example in the combination of measurements of B → D(*)τν and B → D(*)μν decays at the BaBar, Belle and LHCb experiments.

The W and Z bosons are so heavy that the probabilities for them to decay to electrons, muons and tau leptons are expected to be equal to very high precision, if LFU holds. This implies that the ratios of these probabilities such as R(μ/e), which compares W → μν and W → eν, and R(τ/μ), which compares W → τν and W → μν, should be unity. Experiments at the LEP electron–positron collider measured a surprisingly large value of R(τ/μ) = 1.070 ± 0.026, but a more precise measurement from the ATLAS collaboration at the LHC found R(τ/μ) = 0.992 ± 0.013, in agreement with LFU. This measurement made use of the large sample of top-quark pair events produced at ATLAS during Run 2 of the LHC from 2015 to 2018. These top-quark events can be cleanly selected, with each event containing two W bosons and two b-quarks produced from the decays of the top quarks.

In a new measurement, ATLAS has turned its attention to the comparison of W decays to muons and electrons, via the ratio R(μ/e). The collaboration again used top-quark pair events as a clean and copious source of W bosons. Counting the number of events with one electron from W → eν, one muon from W → μν, and one or two b-tagged jets provides the cleanest way to measure the rate of top-quark pair production. But this rate can also be measured from the number of top-quark pair events with two electrons or two muons. If R(μ/e) = 1, meaning that W → eν and W → μν decays occur with equal probability, the rates of such ee and μμ events should be the same, after correcting for detector efficiencies. Any difference would suggest a violation of LFU.

Some measurement uncertainties have similar effects on the ee and μμ final states, so they largely cancel in the ratio R(μ/e). However, electrons and muons behave in very different ways in the ATLAS detector, giving different detection efficiencies with differing and uncorrelated uncertainties that do not cancel in the ratio. To reduce the sensitivity of the measured R(μ/e) to these effects, the double ratio R(μ/e)/√R(μμ/ee) was measured first, where R(μμ/ee) corresponds to the comparison of Z → μμ and Z → ee decay probabilities, determined from the same dataset. The final R(μ/e) was then obtained by making use of the very precise measurement of R(μμ/ee) from the LEP experiments and the SLD experiment at SLAC, which has an uncertainty of only 0.0028. This latter ratio acts as a calibration of the relative detection efficiencies of electrons and muons in ATLAS, reducing the associated uncertainties in R(μ/e).
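The reason the double ratio works is that the unknown single-lepton efficiencies cancel exactly. A toy numerical sketch (all efficiencies, yields and normalisations below are invented for illustration; this shows the logic of the method, not the ATLAS analysis code):

```python
import math

# Invented inputs, purely illustrative
eps_e, eps_mu = 0.70, 0.85     # per-lepton detection efficiencies (unknown in practice)
R_true = 1.0                   # assumed true R(mu/e)
R_Z_true = 1.0                 # true Z->mumu / Z->ee ratio
n_tt, n_Z = 1e5, 1e7           # arbitrary event normalisations

# Dilepton yields: each channel carries two lepton efficiency factors,
# and each W->mu leg carries one factor of R(mu/e) relative to W->e.
N_ee_tt = n_tt * eps_e**2
N_mm_tt = n_tt * (R_true * eps_mu)**2
N_ee_Z  = n_Z * eps_e**2
N_mm_Z  = n_Z * R_Z_true * eps_mu**2

# Double ratio: the efficiency ratio eps_mu/eps_e cancels exactly,
# leaving R(mu/e)/sqrt(R(mumu/ee))
D = math.sqrt(N_mm_tt / N_ee_tt) / math.sqrt(N_mm_Z / N_ee_Z)

R_Z_external = 1.0  # precise external input (LEP/SLD in the real analysis)
print(f"R(mu/e) = {D * math.sqrt(R_Z_external):.4f}")  # recovers 1.0000
```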

The final result from this new ATLAS analysis is R(μ/e) = 0.9995 ± 0.0045, perfectly compatible with unity. The measurement is compared to previous results from LHC and LEP experiments (see figure 1). Thanks to the large data sample and careful control of all systematic uncertainties, it improves on the uncertainty of 0.006 from all previous measurements combined. At least in W decays, LFU survives intact.

CMS studies single-top production

CMS figure 1

Being the most massive known elementary particle, the top quark is a focus for precision measurements and searches for new phenomena. At the LHC, top quarks are copiously produced in pairs via the quantum chromodynamic (QCD) interaction and, to a much lesser extent, singly through the electroweak force. Precisely measuring the single-top cross section provides a stringent test of the electroweak sector of the Standard Model (SM) of particle physics.

In September 2022, only four months after the start of Run 3, the CMS collaboration released the first measurement using data at the new collision energy of 13.6 TeV: the production cross section of a top quark together with its antiparticle (tt). The collaboration can now also report a measurement of the production of a single top quark in association with a W boson (tW), based on the full dataset recorded in 2022. As well as testing the electroweak sector, constraining tW allows it to be better disentangled from the dominant tt process – a channel whose precise measurement improves our knowledge of higher-order corrections in perturbative QCD.

CMS figure 2

tW is a challenging measurement as it is 10 times less likely than tt production but has almost the same detection signature. This analysis selects events where both the top quark and the W boson ultimately decay to leptons. The signal therefore consists of two leptons (electrons or muons), a jet initiated from a bottom quark, and possibly extra jets coming from additional radiation. No single observable can discriminate the signal from the background, so a random forest (RF) is employed in events that contain either one or two jets, one of which comes from a bottom quark. The RF is a collection of decision trees collaborating to distinguish the tW signal from the tt background. The output of the RF, for events with one jet identified as coming from a bottom quark, is shown in figure 1. The higher the RF discriminant, the higher the relative proportion of signal events.
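To give a flavour of the technique – this is a generic toy, not the CMS training, which uses dedicated kinematic observables and full simulation – a random forest separating two overlapping event populations can be set up in a few lines with scikit-learn:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy stand-ins for discriminating observables (e.g. lepton and jet kinematics):
# the two classes overlap, so no single feature separates them on its own.
n = 20_000
bkg = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(n, 3))  # "tt-like"
sig = rng.normal(loc=[0.5, 0.3, 0.4], scale=1.0, size=(n, 3))  # "tW-like"

X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = signal, 0 = background

# A forest of decision trees votes on each event
clf = RandomForestClassifier(n_estimators=100, max_depth=6, random_state=0)
clf.fit(X, y)

# The discriminant is the mean vote: higher values are more signal-like,
# playing the role of the RF output distribution shown in figure 1.
scores = clf.predict_proba(X)[:, 1]
print(f"mean score, signal events:     {scores[:n].mean():.2f}")
print(f"mean score, background events: {scores[n:].mean():.2f}")
```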

To achieve a higher precision, an extra handle is used to control the tt background: information from events with two b-quark jets. Such events are more likely to come from the decay of a tt pair. The measurement yields a precise value for the tW cross section. Figure 2 shows tW cross-section measurements by CMS at different centre-of-mass energies, including the new measurement in proton–proton collisions of 13.6 TeV. All measurements are consistent with state-of-the-art theory calculations. The first tW measurement at the new LHC energy frontier uses only part of the data but is already as precise as the earlier measurement, which used the entire Run 2 sample at 13 TeV. Exploiting the full Run 3 data sample will push the precision frontier forward and provide an even more stringent SM probe in the top quark sector.

LHCb squeezes D-meson mixing

LHCb figure 1

The weak force, unlike the other fundamental forces, has a distinctive feature: its interactions with quarks differ slightly from those with antiquarks. This phenomenon, known as CP violation, allows for an asymmetry in the likelihood of a process occurring with matter compared to its antimatter counterpart – an essential requirement to explain the large dominance of matter in the universe. However, the size of CP violation predicted by the Standard Model (SM), and in accordance with experimental measurements so far, is not large enough to explain this cosmological imbalance. This is why physicists are actively searching for new sources of CP violation and striving to improve our understanding of the known ones. The phenomenology offered by the quantum-mechanical oscillations of neutral mesons into their antimatter counterparts, the antimesons, provides a particularly rich experimental ground for such studies.

The LHCb collaboration recently measured, with unprecedented precision, a set of parameters that determine the matter–antimatter oscillation of the neutral D0 meson into its antiparticle, the D̄0. This sharpens the search for CP violation in this oscillation, which is predicted but has yet to be observed.

D0 mesons are composed of a charm quark and an up antiquark. Their oscillations are extremely slow, with an oscillation period over a thousand times longer than their lifetimes. As a result, only a very few D0 mesons transform before they decay. Oscillations are therefore identified as extremely small changes in the flavour mixture – matter or antimatter – as a function of the time at which the D0 or the D0 decays.

In LHCb’s analysis, the initial matter–antimatter flavour of the neutral meson is experimentally inferred from the charge of the accompanying pion in the CP-conserving decay chains D*(2010)+ → D0π+ and D*(2010)− → D̄0π−. The mixing effect (or oscillation) then appears as a decay-time dependence of the ratio, R, of the number of “suppressed” and “favoured” decay processes of the neutral meson. The suppressed decays can occur with or without a net oscillation of the D0 meson, while the favoured decays are largely dominated by the direct process. In the absence of mixing, this ratio is predicted to be constant as a function of the D0 decay time while, in the case of mixing, it approximately follows a parabolic behaviour, increasing with time. Figure 1 shows the ratio R, including data for both matter (R+ for D0 → K+π−) and antimatter (R− for D̄0 → K−π+) processes, and corresponding model predictions. The variation depends not only on the oscillation parameters but also on the various observables of CP violation, which differentiate between matter and antimatter.
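In the standard phenomenological description (here assuming CP conservation), the suppressed-to-favoured ratio grows quadratically with decay time. A sketch with parameter values that are merely of the right order of magnitude, not LHCb’s fitted results:

```python
import numpy as np

# Illustrative mixing parameters, roughly the known order of magnitude
R_D = 3.4e-3       # doubly Cabibbo-suppressed to favoured rate at t = 0
y_prime = 5e-3     # rotated mixing parameter y'
x_prime2 = 2e-5    # x'^2

def ws_rs_ratio(t_over_tau: np.ndarray) -> np.ndarray:
    """Approximate suppressed/favoured ratio vs decay time in D0 lifetimes
    (valid for slow mixing, CP conservation assumed)."""
    return (R_D
            + np.sqrt(R_D) * y_prime * t_over_tau
            + (x_prime2 + y_prime**2) / 4 * t_over_tau**2)

for t in np.linspace(0, 8, 5):
    print(f"t = {t:.0f} lifetimes: R = {ws_rs_ratio(t) * 1e3:.2f} x 10^-3")
# With mixing off (x' = y' = 0) the ratio stays flat at R_D;
# the gentle upward parabola is the mixing signal in figure 1.
```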

This analysis is the most precise measurement of these parameters to date, improving the uncertainty on both mixing and CP-violating observables by a factor of 1.6 compared to the previous best result, also by LHCb. This improvement is largely due to an unprecedentedly large sample of about 1.6 million suppressed decays and 421 million favoured decays collected during Run 2, making LHCb unique in probing up-type quark transitions. The results confirm the matter–antimatter oscillation of the D0 meson and show no evidence of CP violation in the oscillation.

These findings call for future analyses of this and other decays of the D0 meson using data from the third and fourth run of the LHC, exploiting the potential of the currently operating detector upgrade (Upgrade I). The detector upgrade proposed for the fifth and sixth runs of the LHC (Upgrade II) would provide a six-times-bigger sample, yielding the precision needed to definitively test the predictions of the SM.
