How to surf to high energies

A laser ionises rubidium vapour, turning it into plasma. A proton bunch plunges inside, evolving into a train of millimetre-long microbunches. The microbunches pull the plasma’s electrons, forming wakes in the plasma, like a speedboat displacing water. Crests and troughs of the plasma’s electric field trail the proton microbunches at almost the speed of light. If injected at just the right moment, relativistic electrons surf on the accelerating phase of the field over a distance of metres, gaining energy up to 1000 times faster than can be achieved in conventional accelerators.

Plasma wakefield acceleration is a cutting-edge technology that promises to revolutionise the field of particle acceleration by paving the way for smaller and more cost-effective linear accelerators. The technique traces back to a seminal paper published in 1979 by Toshiki Tajima and John Dawson which laid the foundations for subsequent breakthroughs. At its core, the principle involves using a driver to generate wakefields in a plasma, upon which a witness beam surfs to undergo acceleration. Since the publication of the first paper, the field has demonstrated remarkable success in achieving large accelerating gradients.

Traditionally, only laser pulses and electron bunches have been used as drive beams. However, since 2016 the Advanced Wakefield Experiment (AWAKE) at CERN has used proton bunches from the Super Proton Synchrotron (SPS) as drive beams – an innovative approach with profound implications. Thanks to their high stored energy, proton bunches enable AWAKE to accelerate an electron bunch to energies relevant for high-energy physics in a single plasma, circumventing the need for the multiple accelerating stages that are required when using lasers or electron bunches.

Bridging the divide

Relevant to any accelerator concept based on plasma wakefields, AWAKE technology promises to bridge the gap between global developments at small scales and possible future electron–positron colliders. The experiment is therefore an integral component of the plasma roadmap of the European strategy for particle physics, which aims to advance the concept to a level of technological maturity that would allow its application to particle-physics experiments. An international collaboration of approximately 100 people across 22 institutes worldwide, AWAKE has already published more than 90 papers, many in high-impact journals, alongside significant efforts to train the next generation, culminating in the completion of over 28 doctoral theses to date.

A proton bunch train

In the experiment, a 400 GeV proton bunch from the SPS is sent into a 10 m-long plasma source containing rubidium vapour at a temperature of around 200 °C (see “Rubidium source” figure). A laser pulse accompanies the proton bunch, ionising the vapour and transforming it into a plasma.

To induce the necessary wakefields, the drive bunch length must be of the order of the plasma wavelength, which corresponds to the natural oscillation period of the plasma. However, the length of the SPS proton bunch is around 6 cm, significantly longer than the 1 mm plasma wavelength in AWAKE, and short wavelengths are required to reach large accelerating gradients.
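
The mismatch can be made concrete with a short back-of-the-envelope calculation: the plasma wavelength follows from the plasma electron density via the plasma frequency, λ_p = 2πc/ω_p with ω_p = √(n e²/ε₀mₑ). The sketch below assumes a density of 7 × 10¹⁴ cm⁻³ (a representative AWAKE working point, not a number quoted above) and reproduces the roughly millimetre-scale wavelength to which the 6 cm SPS bunch must be matched.

```python
import math

# Back-of-the-envelope plasma parameters for a rubidium plasma.  The density
# value is an assumption (a representative working point), not a figure quoted
# in the article.
n_e  = 7e20         # plasma electron density [m^-3] (assumed, = 7e14 cm^-3)
e    = 1.602e-19    # elementary charge [C]
eps0 = 8.854e-12    # vacuum permittivity [F/m]
m_e  = 9.109e-31    # electron mass [kg]
c    = 2.998e8      # speed of light [m/s]

omega_p  = math.sqrt(n_e * e**2 / (eps0 * m_e))  # plasma angular frequency [rad/s]
f_p      = omega_p / (2 * math.pi)               # plasma frequency [Hz]
lambda_p = c / f_p                               # plasma wavelength [m]

print(f"plasma frequency  ~ {f_p / 1e9:.0f} GHz")      # ~240 GHz
print(f"plasma wavelength ~ {lambda_p * 1e3:.2f} mm")  # ~1.3 mm, vs the ~6 cm SPS bunch
```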

The solution is to take advantage of a beam-plasma instability, which transforms long particle bunches into microbunches with the period of the plasma through a process known as self-modulation. In other words, as the long proton bunch traverses the plasma, it can be coaxed into splitting into a train of shorter “microbunches”. The bunch train resonantly excites the plasma wave, like a pendulum or a child on a swing being pushed with small kicks at its natural oscillation interval, or resonant frequency. If applied at the right time, each kick increases the oscillation amplitude, or height, of the wave. When the amplitude is sufficiently high, a witness electron bunch from an external source is injected into the plasma wakefields to ride them and gain energy.

AWAKE rubidium vapour source

The first phase of AWAKE (Run 1, from 2016 to 2018) served as a proof-of-concept demonstration of the acceleration scheme. First, it was shown that a plasma can be used as a compact device to self-modulate a highly relativistic and highly energetic proton bunch (see “Self-modulation” figure). Second, it was shown that the resulting bunch train resonantly excites strong wakefields. Third – the most direct demonstration – it was shown that externally injected electrons can be captured, focused and accelerated to GeV energies by the wakefields. The addition of a percent-level positive gradient in density along the plasma led to 20% boosts in the energy gained by the accelerated electrons.

Based on these proof-of-principle experimental results and expertise at CERN and in the collaboration, AWAKE developed a well-defined programme for Run 2, which launched in 2021 following Long Shutdown 2 and which will run for several more years. The goal is to achieve electron acceleration with GeV/m energy gain while preserving beam quality, with a normalised emittance of around 10 mm mrad and a relative energy spread of a few per cent. In parallel, scalable plasma sources are being developed that can be extended up to hundreds of metres in length (see “Helicon plasma source” and “Discharge source” figures). Once these goals are reached, the concepts of AWAKE could be used in particle-physics applications such as using electron beams with energy between 40 and 200 GeV impinging on a fixed target to search for new phenomena related to dark matter.

Controlled instability

The first Run 2 milestone, on track for completion by the end of the year, is to complete the self-modulator – the plasma that transforms the long proton bunch into a train of microbunches. The demonstration has been staged in two experimental phases.

The first phase was completed in 2022. The results prove that wakefields driven by a full proton bunch can have a reproducible and tunable timing. This is not at all a trivial demonstration given that the experiment is based on an instability!

Plasma wakefield acceleration

Techniques to tune the instability are similar to those used with free-electron lasers: provide a controlled initial signal for the instability to grow from and operate in the saturated regime, for example. In AWAKE, the self-modulation instability is initiated by the wakefields driven by an electron bunch placed ahead of the proton bunch. The wakefields from the electron bunch imprint themselves on the proton bunch right from the start, leading to a well defined bunch train. This electron bunch is distinct from the witness bunches, which are later accelerated.

The second experimental phase for the completion of the self-modulator is to demonstrate that high-amplitude wakefields can be maintained over long distances. Numerical simulations predict that self-modulation can be optimised by tailoring the plasma’s density profile. For example, introducing a step in the plasma density should lead to higher accelerating fields that can be maintained over long distances. First measurements are very encouraging, with density steps already leading to increased energy gains for externally injected electrons. Work is ongoing to globally optimise the self-modulator.

AWAKE technology promises to bridge the gap between global developments at small scales and possible future electron–positron colliders

The second experimental milestone of Run 2 will be the acceleration of an electron bunch while demonstrating its sustained beam quality. The experimental setup designed to reach this milestone includes two plasmas: a self-modulator that prepares the proton bunch train, and a second “accelerator plasma” into which an external electron bunch is injected (see “Modulation and acceleration” figure). To make space for the installation of the additional equipment, CERN will in 2025 and 2026 dismantle the CNGS (CERN Neutrinos to Gran Sasso) target area that is installed in a 100 m-long tunnel cavern downstream from the AWAKE experimental facility.

Accelerate ahead

Two enabling technologies are needed to achieve high-quality electron acceleration. The first is a source and transport line to inject the electron bunch on-axis into the accelerator plasma. A radio-frequency (RF) injector source was chosen because of the maturity of the technology, though the combination of S-band and X-band structures is novel, and forms a compact accelerator with possible medical applications. It is followed by a transport line that preserves the parameters of the 150 MeV, 100 pC bunch, and allows for its tight focusing (5 to 10 µm) at the entrance of the accelerator plasma. External injection into plasma-based accelerators is challenging because of the high frequency (about 235 GHz in AWAKE) and thus small structure size (roughly 200 µm) at which they operate. The main goal is to demonstrate that the electron bunch can be accelerated to 4 to 10 GeV, with a relative energy spread of 5 to 8%, and emerge with approximately the same normalised emittance as at the entrance of the plasma (2–30 mm mrad).
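
To see why the injection tolerances are so tight, the sketch below converts the figures quoted above (an illustration only, interpreting the quoted structure size as the plasma skin depth c/ω_p, which is an assumption): a 235 GHz wakefield has a period of about 1.3 mm and a skin depth of roughly 200 µm, into which the roughly 6 × 10⁸ electrons of the 100 pC witness bunch must be steered with micron-level precision.

```python
import math

# Rough numbers behind the injection tolerances quoted above (sketch only).
c = 2.998e8                          # speed of light [m/s]
e = 1.602e-19                        # elementary charge [C]

f_wake = 235e9                       # wakefield frequency [Hz] (from the text)
period = c / f_wake                  # one wakefield period [m]         -> ~1.3 mm
skin   = c / (2 * math.pi * f_wake)  # plasma skin depth c/omega_p [m]  -> ~200 um

q_bunch     = 100e-12                # witness-bunch charge [C] (from the text)
n_electrons = q_bunch / e            # number of electrons              -> ~6e8

print(f"wakefield period       ~ {period * 1e3:.2f} mm")
print(f"structure (skin) depth ~ {skin * 1e6:.0f} um")
print(f"electrons in 100 pC    ~ {n_electrons:.1e}")
```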

Prototype discharge plasma source

For these experiments, rubidium vapour sources will be used for both the self-modulator and accelerator plasmas, as they provide the uniformity, tunability and reproducibility required for the acceleration process. However, the laser-ionisation process of the rubidium vapour does not scale to lengths beyond 20 m. The alternative enabling technology is therefore a plasma source whose length can be scaled to the 50 to 100 metres required for the bunch to reach 50–100 GeV energies. To achieve this, a laboratory to develop discharge and helicon-plasma sources has been set up at CERN (see “Discharge source” figure). Multiple units can in principle be stacked to reach the desired plasma length. The challenge with such sources is to demonstrate that they can deliver the required plasma parameters, and not just the required length.

The third and final experimental milestone for Run 2 will then be to replace the 10 m-long accelerator plasma with a longer source and achieve proportionally larger energy gains. The AWAKE acceleration concept will then essentially be mature to propose particle-physics experiments, for example with bunches of a billion or so 50 GeV electrons.

Boosting physics with precision and intensity

The Physics Beyond Colliders (PBC) initiative has diversified the landscape of experiments at CERN by supporting smaller experiments and showcasing their capabilities. Its fifth annual workshop convened around 175 physicists from 25 to 27 March to provide updates on the ongoing projects and to explore new proposals to tackle the open questions of the Standard Model and beyond.

This year, the PBC initiative has significantly strengthened CERN’s dark-sector searches, explained Mike Lamont and Joachim Mnich, directors for accelerators and technology, and research and computing, respectively. In particular, the newly approved SHiP proton beam-dump experiment (see SHiP to chart hidden sector) will complement the searches for light dark-sector particles that are presently conducted with NA64’s versatile setup, which is suitable for electron, positron, muon and hadron beams.

First-phase success

The FASER and SND experiments, now taking data in the LHC tunnel, are two of the successes of the PBC initiative’s first phase. Both search for new physics and study high-energy neutrinos along the LHC collision axis. FASER’s successor, FASER2, promises a 10,000-fold increase in sensitivity to beyond-the-Standard-Model physics, said Jonathan Feng (UC Irvine). With the potential to detect thousands of TeV-scale neutrinos a day, it could also measure parton distribution functions and thereby enhance the physics reach of the high-luminosity LHC (HL-LHC). FASER2 may form part of the proposed Forward Physics Facility, set to be located 620 m away, along a tangent from the HL-LHC’s interaction point 1. A report on the facility’s technical infrastructure is scheduled for mid-2024, with a letter of intent foreseen in early 2025. By contrast, the CODEX-b and ANUBIS experiments are being designed to search for feebly interacting particles transverse to LHCb and ATLAS, respectively. In all these endeavours, the Feebly Interacting Particle Physics Centre will act as a hub for exchanges between experiment and theory.

Francesco Terranova (Milano-Bicocca) and Marc Andre Jebramcik (CERN) explained how ENUBET and NuTAG have been combined to optimise a “tagged” neutrino beam for cross-section measurements, where the neutrino flavour is known by studying the decay process of its parent hadron. In the realm of quantum chromodynamics, SPS experiments with lead ions (the new NA60+ experiment) and light ions (NA61/SHINE) are aiming to decode the phases of nuclear matter in the non-perturbative regime. Meanwhile, AMBER is proposing to determine the charge radii of kaons and pions, and to perform meson spectroscopy, in particular with kaons.

The LHCspin collaboration presented a plan to open a new frontier of spin physics at the LHC, building upon the successful operation of the SMOG2 gas cell that is upstream of the LHCb detector. Studying collective phenomena at the LHC in this way could probe the structure of the nucleon in a so-far little-explored kinematic domain and make use of new probes such as charm mesons, said Pasquale Di Nezza (INFN Frascati).

Measuring moments

The TWOCRYST collaboration aims to demonstrate the feasibility and the performance of a possible fixed-target experiment in the LHC to measure the electric and magnetic dipole moments (EDMs and MDMs) of charmed baryons, offering a probe complementary to other searches for CP violation in the Standard Model. The technique would use two bent crystals: the first to deflect protons from the beam halo onto a target, with the resulting charm baryons then deflected by the second (precession) crystal onto a detector such as LHCb, while at the same time causing their spins to precess in the strong electric and magnetic fields of the deformed crystal lattice, explained Pascal Hermes (CERN).

New ideas ranged from the measurement of molecular electric dipole moments at ISOLDE to measuring the gravitational field of the LHC beam

Several projects to detect axion-like particles were discussed, including a dedicated superconducting cavity for heterodyne detection being jointly developed by PBC and CERN’s Quantum Technology Initiative. Atom interferometry is another subject of common interest, with PBC demonstrating the technical feasibility of installing an atom interferometer with a baseline of 100 m in one of the LHC’s access shafts. Other new ideas ranged from the measurement of molecular EDMs at ISOLDE to measuring the gravitational field of the LHC beam.

With many fruitful discussions testifying to the continued determination to fully exploit the scientific potential of the CERN accelerator complex and infrastructure for projects that are complementary to high-energy-frontier colliders, the annual meeting concluded as a resounding success. The PBC community ended the workshop by thanking co-founder Claude Vallée (CPPM Marseille), who retired as a PBC convener after almost a decade of integral work, and welcomed Gunar Schnell (Ikerbasque and UPV/EHU Bilbao), who will take over as convener.

Ultra-peripheral conference debuts in Mexico

Ultra-peripheral collisions (UPCs) involving heavy ions and protons represent the energy frontier for photon-induced reactions. These high-energy photons can be used to study unique features of quarks and gluons inside nuclei, and can probe electromagnetic and electroweak interactions without the usual backgrounds associated with quantum-chromodynamic processes. The first edition of the international workshop on this subject took place from 10 to 15 December 2023 in Playa del Carmen, Mexico, bringing together about 90 participants, more than a third of whom were early-career researchers. This is the first time that the international UPC community has gathered together, establishing a new international conference series on this active and expanding area of research.

The conference highlighted the impressive progress and diversity of UPC physics, which goes far beyond the initial studies of exclusive processes. UPC23 covered the latest results from experiments at RHIC and the LHC, and prospects for the future Electron-Ion Collider (EIC) at Brookhaven National Laboratory. Discussions delved into the intricacies of inelastic photo-nuclear events, including the exciting programme of open charm that is yet to be explored, and examined how UPCs serve as a novel lens for investigating the quark–gluon plasma and other final-state nuclear effects. Much attention was devoted to the physics of low-x parton densities – a fundamental aspect of protons and nuclei that photons can probe in a unique way.

Enriched understanding

Among the conference’s theoretical highlights, Farid Salazar (UCLA) showed how vector–meson photoproduction could be a powerful method to detect gluon saturation across different collision systems, from proton–nucleus to electron–nucleus to UPCs. Zaki Panjsheeri (Virginia) put forth innovative ideas to study double-parton correlations, linking UPC vector–meson studies to generalised parton distributions, enhancing our understanding of the proton’s structure. Ashik Ikbal (Kent State), meanwhile, introduced exciting proposals to investigate quantum entanglement through exclusive J/ψ photoproduction at RHIC.

The conference also provided a platform for discussing the active exploration of light-by-light scattering and two-photon processes for probing fundamental physics and searches for axion-like particles, and for putting constraints on the anomalous magnetic moment of the tau lepton (see CMS closes in on tau g–2).

Energy exploration

Physicists at the LHC have effectively repurposed the world’s most powerful particle accelerator into a high-energy photon collider. This innovative approach, traditionally the domain of electron beams in colliders like LEP and HERA, and anticipated at the EIC, allows the LHC to explore photon-induced interactions at energies never before achieved. David Grund (Czech Technical University in Prague), Georgios Krintiras (Kansas) and Cesar Luiz Da Silva (Los Alamos) shared the latest LHC findings on the energy dependence of UPC J/ψ events. These results are crucial for understanding the onset of gluon saturation – a state in which gluons become so dense that gluon emission and recombination reach a dynamical equilibrium. However, the data also align with the nuclear phenomenon known as gluon shadowing, which arises from multiple-scattering processes. David Tlusty (Creighton) presented the latest findings from the STAR Collaboration, which has recently expanded its UPC programme, complementing the energy exploration at the LHC. Klaudia Maj (AGH University of Krakow) presented the latest results on two-photon interactions and photonuclear jets from the ATLAS collaboration, including measurements that may be probing the quark–gluon plasma.

Delegates discussed the future opportunities for UPC physics with the large integrated luminosity expected for Runs 3 and 4 at the LHC

Carlos Bertulani (Texas A&M) paid tribute to Gerhard Baur, who passed away on 16 June last year. Bertulani and Baur co-authored “Electromagnetic processes in relativistic heavy ion collisions” – a seminal paper with more than 1000 citations. Bertulani invited delegates to consider the untapped potential of UPCs in the study of anti-atoms and exotic atoms.

Delegates also discussed the future opportunities for UPC physics with the large integrated luminosity expected for Run 3 and Run 4 at the LHC, with the planned detector upgrades for Run 4 such as FoCal, with the recent upgrades by STAR and the sPHENIX programme, and at the EIC. Delegates expect event selection and instrumentation close to the beam line, for example using “zero degree” calorimeters, to offer the greatest experimental opportunities in the coming years.

The next edition of the UPC conference will take place in Saariselka, Finland in June 2025.

CERN celebrates 100 years of science and diplomacy

Since his birth in Bohemia in 1924, Herwig Schopper has been a prisoner of war, an experimentalist with pioneering contributions in nuclear, accelerator and detector physics, director general (DG) of DESY and then CERN during a golden age for particle physics, and a celebrated science diplomat. Shortly after his centenary, his colleagues, family and friends gathered on 1 March to celebrate the life of the first DG in either institution to reach 100.

“He is a restless person,” noted Albrecht Wagner (DESY), who presented a whistlestop tour of Schopper’s 35 years working in Germany, following his childhood in Bohemia. Whether in Hamburg, Erlangen, Mainz or Karlsruhe, he never missed out on an opportunity to see new places – though always maintaining the Austrian diet to which his children attribute his longevity. On one occasion, Schopper took a sabbatical to work with Lise Meitner at Stockholm’s Royal Institute of Technology. At the time, the great physicist was performing the first nuclear-physics studies in the keV range, said Wagner, and directed Schopper to measure the absorption rate of beta-decay electrons in various materials using radioactive sources and a Geiger–Müller counter. Schopper is one of the last surviving physicists to have worked with her, observed Wagner.

Schopper’s scientific contributions have included playing a major part in the world’s first polarised proton source, Europe’s first R&D programme for superconducting accelerators and the development of hadronic calorimeters as precision instruments, explained Christian Fabjan (TU Vienna/HEPHY). Schopper dubbed the latter the sampling total absorption calorimeter, or STAC, playing on the detector’s stacked design, but the name didn’t stick. In recognition of his contributions, hadronic calorimeters might now be renamed Schopper total absorption calorimeters, joked Fabjan.

As CERN DG from 1981 to 1988, Schopper oversaw the lion’s share of the construction of LEP, before it began operations in July 1989. To accomplish this, he didn’t shy away from risks, budget cuts or unpopular opinions when the situation called for it, said Chris Llewellyn Smith, who would himself serve as DG from 1994 to 1998. Llewellyn Smith credited Schopper with making decisions that would benefit not only LEP, but also the LHC. “Watching Herwig deal with these reviews was a wonderful apprenticeship, during which I learned a lot about the management of CERN,” he recalled.

After passing CERN’s leadership to Carlo Rubbia, Schopper became a full-time science diplomat, notably including 20 years in senior roles at UNESCO between 1997 and 2017, and significant contributions to SESAME, the Synchrotron-light for Experimental Science and Applications in the Middle East (see CERN Courier January/February 2023 p28). Khaled Toukan of Jordan’s Atomic Energy Commission, CERN Council president Eliezer Rabinovici and Maciej Nałecz (Polish Academy of Science, formerly of UNESCO) all spoke of Schopper’s skill in helping to develop SESAME as a blueprint for science for peace and development. “Herwig likes building rings,” Toukan fondly recounted.

As with any good birthday party, Herwig received gifts: a first copy of his biography, a NASA hoodie emblazoned with “Failure is not an option” from Sam Ting (MIT), who has been closely associated with Schopper since their time together at DESY, and the Heisenberg medal. “You’ve even been in contact with the man himself,” noted Heisenberg Society president Johannes Blümer, referring to several occasions on which Schopper met Heisenberg at conferences and even once discussed politics with him.

Schopper continues to counsel DGs to this day – and not only on physics. Confessing to occasionally being intimidated by his lifetime of achievements, CERN DG Fabiola Gianotti intimated that they often discuss music. “Herwig likes all composers, but not baroque ones. For him, they are too rational and intellectual.” For this, he will always have physics.

Slim, charming protons on the menu in Mainz

The triennial international conference on meson–nucleon physics and the structure of the nucleon (MENU) attracted more than 140 participants to the historic centre of Mainz from 16 to 20 October 2023.

Among MENU 2023’s highlights on nucleon structure, a preliminary analysis by the NNPDF collaboration suggests that the proton contains more charm than anticharm, with Niccolò Laurenti (Università degli Studi di Milano) showing evidence of a non-vanishing intrinsic valence charm contribution to the proton’s wavefunction. Meanwhile, Michael Kohl (Hampton University) concluded that the proton–radius puzzle is still not resolved. To make progress, form-factor measurements in electron scattering must be scrutinised, and the use of atomic spectroscopy data clarified, he said.

Hadron physics

A large part of this year’s conference was dedicated to hadron spectroscopy, with updates from Belle II, BESIII, GlueX, Jefferson Lab, JPAC, KLOE/KLOE-2 and LHCb, as well as theoretical overviews covering everything from lattice quantum chromodynamics to effective-field theories. Special emphasis was also given to future directions in hadron physics at facilities such as FAIR, the Electron-Ion Collider and the local Mainz Energy-Recovering Superconducting Accelerator (MESA) facility – a future low-energy but high-intensity electron accelerator that will make it possible to carry out experiments in nuclear astrophysics, dark-sector searches and tests of the Standard Model. Among upgrade plans at Jefferson Lab, Eric Voutier (Paris-Saclay) presented a future experimental programme with positron beams at CEBAF, the institute’s Continuous Electron Beam Accelerator Facility. The upgrade will allow for a rich physics programme covering two-photon exchange, generalised polarisabilities, generalised parton distribution functions and direct dark-matter searches.

Highlights on nucleon structure include a preliminary analysis suggesting that the proton contains more charm than anticharm

Hadron physics is also closely related to searches for new physics, as precision observables of the Standard Model are in many cases limited by the non-perturbative regime of quantum chromodynamics. A prime example is the physics of the anomalous magnetic moment of the muon, for which a puzzling discrepancy between data-driven dispersive and lattice quantum-chromodynamics calculations of hadronic contributions to the Standard Model prediction persists (CERN Courier May/June 2021 p25). The upcoming collaboration meeting of the Muon g-2 Theory Initiative in September 2024 at KEK will provide important new insights from lattice QCD and e⁺e⁻ experiments. It remains to be seen whether the eventual theoretical consensus will confirm a significant deviation from the experimental value, which is currently being updated by Fermilab’s Muon g-2 experiment using their last three years of data.

A safe approach to quantum gravity

The LHC experiments at CERN have been extremely successful in verifying the Standard Model (SM) of particle physics to very high precision. From the theoretical perspective, however, this model has two conceptual shortcomings. One is that the SM appears to be an “effective field theory” that is valid up to a certain energy scale only; the other is that gravity is not part of the model. This raises the question of what a theory comprising particle physics and gravity that is valid for all energy scales might look like. This directly leads to the domain of quantum gravity.

The typical scale associated with quantum-gravity effects is the Planck scale: 10¹⁵ TeV, or 10⁻³⁵ m. This exceeds the scales accessible at the LHC by approximately 14 orders of magnitude, forcing us to ask: what can theorists possibly gain from investigating physics at energies beyond the Planck scale? The answer is simple: the SM includes many free parameters that must be fixed by experimental data. Since the number of these parameters proliferates when higher-order interactions are included, one would like to constrain this high-dimensional parameter space.
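
For orientation, the hierarchy can be checked in a few lines directly from the fundamental constants. The sketch below uses the reduced Planck mass convention (dividing by 8π), which is an assumption about the convention behind the figure quoted above, and recovers both the quoted length scale and the roughly 14 orders of magnitude separating the Planck scale from LHC collision energies.

```python
import math

# Order-of-magnitude sketch of the hierarchy between the Planck scale and the
# LHC, using standard constants and the reduced Planck mass convention (an
# assumption about the convention behind the figure quoted in the text).
hbar = 1.055e-34   # J s
c    = 2.998e8     # m/s
G    = 6.674e-11   # m^3 kg^-1 s^-2
eV   = 1.602e-19   # J per eV

E_planck = math.sqrt(hbar * c**5 / (8 * math.pi * G)) / eV   # reduced Planck energy [eV]
l_planck = math.sqrt(hbar * G / c**3)                        # Planck length [m]
E_lhc    = 14e12                                             # LHC collision energy [eV]

print(f"reduced Planck energy ~ {E_planck / 1e12:.1e} TeV")  # ~2.4e15 TeV
print(f"Planck length         ~ {l_planck:.1e} m")           # ~1.6e-35 m
print(f"orders of magnitude above the LHC ~ {math.log10(E_planck / E_lhc):.0f}")  # ~14
```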

At low energies, this can be done by implementing bounds derived from demanding unitarity and causality of physical processes. Ideally, one would like to derive similar constraints from consistency at trans-Planckian scales where quantum-gravity effects may play a major role. At first sight, this may seem counterintuitive. It is certainly true that gravity treated as an effective field theory itself does not yield any effect measurable at LHC scales due to its weakness; the additional constraints then arise from requiring that the effective field theories underlying the SM and gravity can be combined and extended into a framework that is valid at all energy scales. Presumably, this will not work for all effective field theories. Taking a “bottom-up” approach (identifying the set of theories for which this extension is possible) may constrain the set of free parameters. Conversely, to be phenomenologically viable, any theory describing trans-Planckian physics must be compatible with existing knowledge at the scales probed by collider experiments. This “top-down” approach may then constrain the potential physics scenarios happening at the quantum-gravity scale – a trajectory that has been followed, for example, by the swampland programme initiated within string theory.

From the theoretical viewpoint, the SM is formulated in the language of relativistic quantum field theories. On this basis, the top-down route arguably becomes more realistic the more closely the formulation of trans-Planckian physics sticks to this language. For example, string theory is a promising candidate for a consistent description of trans-Planckian physics. However, connecting the theory to the SM has proven to be very difficult, mainly due to the strong symmetry requirements underlying the formulation. In this regard, the “asymptotic safety” approach towards quantum gravity may offer a more tractable option for implementing the top-down idea since it uses the language of relativistic quantum field theory.

Asymptotic safety

What is the asymptotic-safety scenario, and how does it link quantum gravity to particle physics? Starting from the gravity side, we have a successful classical theory: Einstein’s general relativity. If one tries to upgrade this to a quantum theory, things go wrong very quickly. In the early 1970s, Gerard ’t Hooft and Martinus Veltman showed that the perturbative quantisation techniques that have proved highly successful for particle-physics theories fail for general relativity. In short, perturbative quantisation introduces an infinite number of parameters (one for each allowed local interaction) and thus requires an infinite number of independent measurements to determine their values. Although this path leads us to a quantum theory of gravity valid at all scales, the construction lacks predictive power. Still, it results in a perfectly predictive effective field theory describing gravity up to the Planck scale.

QCD running coupling

This may seem discouraging when attempting to formulate a quantum field theory of gravity without introducing new symmetry principles, for example supersymmetry, to remove additional free parameters. A loophole is provided by Kenneth Wilson’s modern understanding of renormalisation. Here, the basic idea is to organise quantum fluctuations according to their momentum and integrate out these fluctuations, starting from the most energetic ones and proceeding towards lower energy modes. This creates what is called the Wilsonian renormalisation-group “flow” of a theory. Healthy high-energy completions are provided by renormalisation-group fixed points. At these special points the theory becomes scale-invariant, which ensures the absence of divergences. The fixed point also provides predictive power via the condition that the renormalisation-group flow hits the fixed point at high energies (see “Safety belt” figure). For asymptotically free theories, where all interactions switch off at high energies, the underlying renormalisation-group fixed point is the free theory. This can be seen in the example of quantum chromodynamics (QCD): the gauge coupling diminishes when going to higher and higher energies, approaching a non-interacting fixed point at arbitrarily high energies. One can also envision high-energy completions based on a renormalisation-group fixed point with non-vanishing interactions, which is commonly referred to as asymptotic safety.
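
This behaviour can be illustrated with a minimal numerical sketch using the standard one-loop running of the strong coupling; the input value α_s(M_Z) = 0.118 and a fixed number of quark flavours are simplifying assumptions made here for illustration. The coupling decreases as the energy scale grows, i.e. the flow approaches the free (Gaussian) fixed point in the ultraviolet.

```python
import math

# One-loop running of the strong coupling: an illustration of asymptotic
# freedom, i.e. the flow towards the free (Gaussian) fixed point at high
# energies.  alpha_s(M_Z) = 0.118 and a fixed n_f = 5 are simplifying inputs.
alpha_mz = 0.118
m_z      = 91.19               # Z-boson mass [GeV]
n_f      = 5                   # active quark flavours (kept fixed here)
b0       = 11 - 2 * n_f / 3    # leading coefficient of the QCD beta function

def alpha_s(q):
    """One-loop strong coupling at the scale q (in GeV)."""
    return alpha_mz / (1 + alpha_mz * b0 / (2 * math.pi) * math.log(q / m_z))

for q in [10, 91.19, 1e3, 1e6, 1e16]:
    print(f"Q = {q:>8.3g} GeV  ->  alpha_s ~ {alpha_s(q):.3f}")
```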

Forces of nature

In the context of gravity, the asymptotic-safety scenario was first proposed by Steven Weinberg in the late 1970s. Starting with the seminal work by Martin Reuter (University of Mainz) in 1998, the existence of a renormalisation-group fixed point suitable for rendering gravity asymptotically safe – the so-called Reuter fixed point – is supported by a wealth of first-principle computations. While similar constructions are well known in condensed-matter physics, the Reuter fixed point is distinguished by the fact that it may provide a unified description of all forces of nature. As such, it may have profound consequences for our understanding of the physics inside a black hole, give predictions for parameters of the SM such as the Higgs-boson mass, or disfavour certain types of physics beyond the SM.

The asymptotic-safety approach towards quantum gravity may offer a more tractable option for implementing the top-down idea

The predictive power of the fixed point arises as follows. Only a finite set of parameters exists that describe consistent quantum field theories emanating from the fixed point. One then starts to systematically integrate out quantum fluctuations (from high to low energy), resulting in a family of effective descriptions in which the quantum fluctuations are taken into account. In practice, this process is implemented by the running of the theory’s couplings, generating what are known as renormalisation-group trajectories. To be phenomenologically viable, the endpoint of the renormalisation-group trajectory must be compatible with observations. In the end, only one (or potentially none) of the trajectories emanating from the fixed point will provide a description of nature (see “Going with the flow” image). According to the asymptotic-safety principle, this trajectory must be identified by fixing the free parameters left by the fixed point based on experiments. Once this process is completed, the construction fixes all couplings in the effective field theory in terms of a few free parameters. Since this entails an infinite number of relations that can be probed experimentally, the construction is falsifiable.
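
The mechanism can be caricatured with a deliberately simple, one-coupling toy model (an illustration only, not the actual gravitational computation): a beta function dg/dlnk = 2g − g² has, besides the free fixed point at g = 0, an interacting fixed point at g* = 2. Trajectories that leave the fixed point in the ultraviolet flow to different infrared values of the coupling, and selecting the physical trajectory amounts to fixing that single free parameter by experiment.

```python
# Toy illustration only (not the actual gravitational computation): a single
# dimensionless coupling g(k) with beta function dg/dlnk = 2g - g^2, which has
# a free fixed point at g = 0 and an interacting (UV) fixed point at g* = 2.
# Trajectories started just below g* in the UV flow to different IR values;
# picking one of them is the free parameter to be fixed by experiment.
g_star = 2.0

def flow_to_ir(g_uv, efolds=5.0, steps=2000):
    """Euler-integrate dg/dt = 2g - g^2 from t = 0 (UV) down to t = -efolds (IR)."""
    g, dt = g_uv, -efolds / steps
    for _ in range(steps):
        g += dt * (2 * g - g**2)
    return g

for eps in [1e-3, 1e-2, 1e-1]:   # different departures from the fixed point in the UV
    print(f"g_UV = g* - {eps:<5g}  ->  g_IR ~ {flow_to_ir(g_star - eps):.3e}")
```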

Particle physics link

The link to particle physics follows from the observation that the asymptotic-safety construction remains operative once gravity is supplemented by the matter fields of the SM. Non-abelian gauge groups, such as those underlying the electroweak and strong forces, as well as Yukawa interactions and fermion masses, are readily accommodated. A wide range of proof-of-concept studies shows that this is feasible, gradually bringing the ultimate computation involving the full SM into reach. The fact that gravity remains interacting at the smallest length scales also implies that the construction will feature non-minimal couplings between matter and the gravitational field as well as matter self-interactions of a very specific type. The asymptotic-safety mechanism may then provide the foundation for a realistic quantum field theory unifying all fundamental forces of nature.

Can particle physics tell us whether this specific idea about quantum gravity is on the right track? After all, there is still a vast hierarchy between the energy scales probed by collider experiments and the Planck scale. Surprisingly, the answer is positive! Conceptually, the interacting renormalisation-group fixed point for the gravity–matter theory again gives a set of viable quantum field theories in terms of a fixed number of free parameters. First estimates conducted by Jan Pawlowski and coworkers at Heidelberg University suggest that this number is comparable to the number of free parameters in the SM.

3D space of couplings

In practice, one may then be tempted to make the following connection. Currently, observables probed by collider physics are derived from the SM effective field theory. Hence, they depend on the couplings of the effective field theory. The asymptotic-safety mechanism expresses these couplings in terms of the free parameters associated with the interacting fixed point. Once the SM effective field theory is extended to include operators of sufficiently high mass dimension, the asymptotic-safety dictum predicts highly non-trivial relations between the couplings parameterising the effective field theory. These relations can be confronted with observations that test whether the observables measured experimentally are subject to these constraints. Such tests can be performed either by matching to existing particle-physics data obtained at the LHC or by astrophysical observations probing the strong-gravity regime. The theoretical programme of deriving such relations is currently under development. A feasible benchmark, showing that the underlying physics postulates are on the right track, would then be to “post-dict” the experimental results already available. Showing that a theory formulated at the Planck scale is compatible with the SM effective field theory would be a highly non-trivial achievement in itself.

Showing that a theory formulated at the Planck scale is compatible with the SM effective field theory would be a highly non-trivial achievement in itself

This line of testing quantum gravity experimentally may be seen as orthogonal to more gravity-focused tests that attempt to decipher the quantum nature of gravity. Recent ideas in these directions have revolved around developing tabletop experiments that probe the quantum superposition of macroscopic objects at sub-millimetre scales, which could ultimately be developed into a quantum-Cavendish experiment that probes the gravitational field of source masses in spatial quantum superposition states. The emission of a graviton could then lead to decoherence effects which give hints that gravity indeed has a force carrier similar to the other fundamental forces. Of course, one could also hope that experiments probing gravity in the strong-gravity regime find deviations from general relativity. So far, this has not been the case. This is why particle physics may be a prominent and fruitful arena in which to also test quantum-gravity theories such as asymptotic safety in the future.

For decades, quantum-gravity research has been disconnected from directly relevant experimental data. As a result, the field has developed a vast variety of approaches that aim to understand the laws of physics at the Planck scale. These include canonical quantisation, string theory, the AdS/CFT correspondence, loop quantum gravity and spin foams, causal dynamical triangulations, causal set theory, group field theory and asymptotic safety. The latter has recently brought a new perspective on the field: supplementing the quantum-gravity sector of the theory by the matter degrees of freedom of the SM opens an exciting window through which to confront the construction with existing particle-physics data. As a result, this leads to new avenues of research at the intersection between particle physics and gravity, marking the onset of a new era in quantum-gravity research in which the field travels from a purely theoretical to an observationally guided endeavour.

Engineering materials for big science

HL-LHC coils up close

The nature of CERN’s research often demands that unusual and highly complex materials be developed and tested. A good example is the LHC beam screen that limits the energy transfer from the beam to the cold mass of the magnets, for which a new non-magnetic stainless steel had to be developed in the mid-1990s to meet the physical and mechanical requirements at cryogenic temperatures. The same is true of the external cylinder of the CMS solenoid magnet, for which a process enabling the production of 7 m-diameter high-strength seamless aluminium-alloy rings had to be identified and qualified. Another breakthrough at the LHC has been the solution adopted for the end covers of the cold masses of the dipole magnets, for which 2500 stainless-steel covers were produced by powder-metallurgy hot isostatic pressing – qualifying this innovative shaping solution for the first time for massive, fully reliable leak-tight operation at cryogenic temperatures.

Similar challenges apply today for the High-Luminosity LHC (HL-LHC), which is due to operate from 2029. For the HL-LHC radio-frequency crab cavities, which will tilt the beams at the collision points to maximise the luminosity, niobium and niobium-titanium alloy products have been carefully identified and qualified. Niobium additive-manufactured at CERN achieved a record purity and conductivity for this kind of product. For the new HL-LHC magnets, which are necessary to focus the beams more tightly at the collision points, detailed qualifications of the soundness of niobium-tin (Nb3Sn) coils have been critical, as has the development and qualification of methods to test the weld of the quadrupole magnet cold masses.

These and numerous other projects are the domain of the CERN materials, metrology and non-destructive testing (EN–MME–MM) section, whose mission is to provide materials-science services for accelerators and detectors across the whole CERN community, in close coordination with the mechanical design and production facilities of the EN-MME group. The interdisciplinary, expert-staffed section guarantees a full life-cycle management of materials – from functional requirements to prototyping, series production, inspection and end-of-life – and includes the identification or development of material solutions, the specification and qualification of suppliers, the definition of manufacturing and inspection plans, and inspections of received materials and parts before and after their integration into the machines and experiments. This challenging mission requires advanced microscopic materials analysis, high-precision optical metrology, static and cyclic mechanical measurements, including at cryogenic temperatures, and, last but not least, state-of-the-art non-destructive testing techniques (see “Section facilities” figure).

The facilities of the EN–MME–MM section

High-field magnets

The future of particle accelerators is strongly linked to the development of high-field superconducting magnets that enable higher energies and luminosities to be attained. The HL-LHC will be the first operational facility to employ high-performance Nb3Sn accelerator magnets, surpassing the intrinsic performance limitations of NbTi-based magnets as used for the LHC. The fabrication of Nb3Sn magnets is a challenging process because the conductor is an extremely brittle intermetallic phase. While the difficulty of working with brittle compounds is reduced using the traditional wind-react-and-impregnate approach, uncertainties remain due to volume changes associated with phase transformations occurring during the reaction heat treatment necessary to form the Nb3Sn phase.

Needle in a haystack

To investigate the root causes of performance limitation or degradation observed on early magnets, several HL-LHC dipole and quadrupole magnet coils were examined. This project has been one of the most complex failure analyses ever undertaken by the MM section, demanding that an innovative investigation methodology be identified and applied at several fabrication stages and after cool-down and powering. Internal shear and bending loads on unsupported superconducting wires, which can cause their dislocation as well as cracks in the aggregates of Nb3Sn filaments, were suspected to be the main cause of limitation or degradation. Like hunting for a needle within a massive haystack, the challenge was to find microscopic damage at the level of the filaments in the large volume of coils, which extend up to 7.2 m in length.

Dipole diagnostics

Starting in 2020 with 11 T magnet-coil ends, a sequence of mesoscale observations of whole coil sections was carried out non-destructively using innovative high-energy X-ray computed tomography (CT). This enabled the critical volumes to be identified and was followed up with a microscopic assessment of internal events, geometrical distortions and potential flaws using advanced microscopy. As a result, the MM section was able to unequivocally identify strands with transversely broken elements (see “Dipole diagnostics” and “Cracking niobium tin” figures). Techniques such as scanning electron microscopy (SEM) and focussed ion beam (FIB) were used to analyse damage to strands or sub-elements at particular localised positions as well as failure modes. In addition, a deep-etching technique allowed a decisive observation of completely broken filaments (see “HL-LHC coils up close” figure). Taken together, this comprehensive approach provided an in-depth view of the examined coils by identifying and characterising atypical features and imperfections in both the superconducting phase of the strands and the glass fibre/resin insulation system. It also clearly associated the quenches (a sudden loss of the superconducting state) experienced by the coils with physical events, namely broken superconducting filaments or damaged strands. The successful analysis of the CERN coil limitations led the MM section to receive several coils from different non-conforming quadrupole magnets, fabricated in the US in the framework of the Accelerator Upgrade Project collaboration, and successfully carry out the same type of investigations.

Cracking niobium tin

Effective recovery

This highly effective approach and key results on Nb3Sn accelerator magnets were made possible thanks to the wide experience gained with previous applications of CT techniques to the magnet system of the ITER fusion experiment, which employs the Nb3Sn conductor on a massive scale. The aim of such investigations is not only to understand what went wrong, no matter how difficult and complex that might be, but also to identify remedial actions. For the HL-LHC magnets, the MM section has contributed widely to the introduction of effective recovery measures, improved coil manufacturing and cold-mass assembly processes, and the production of magnets with reproducible behaviour and no sign of degradation. These results led to the conclusion that the root cause of the performance limitation of previous long CERN magnets has been identified and can now be overcome for future applications, as is the case for Nb3Sn quadrupole magnets.

Structural support

Investigating the massive HL-LHC coils required a high-energy (6 MeV) linac CT that was subcontracted to TEC Eurolab in Italy and Diondo GmbH in Germany, two of only a few companies in the world that are equipped with this technique. However, the MM section also has an X-ray CT facility with an energy of 225 keV, which enables sufficient penetration for less massive samples. One of the most recent of countless examples employing this technique concerns the staves for the future ATLAS tracker (ITk) for the HL-LHC upgrade. During 2023 a significant fraction of the ITk modules suffered from early high-voltage breakdowns, despite appearing to perform satisfactorily during earlier stages of quality control. A subset of these modules exhibited breakdowns following thermal cycling, with some failing during the cold phases of the cycle. Additionally, others experienced breakdowns after being loaded onto their supporting staves. High-resolution CT scans at CERN combined with other techniques confirmed the presence and propagation of cracks through the entire sensor thickness, and enabled the MM team to identify the gluing process between the carbon structure and the sensors as the root cause of the vulnerability, which is now being addressed by the ATLAS project team (see “ATLAS modules” figure). Also for the HL-LHC, the section is working on the internalisation of the beryllium vacuum-chamber fabrication technology required for the experiments.

ATLAS modules

While carrying out failure analyses of extremely high-tech components is the core business of the MM section, in some cases understanding the failure of the most basic objects can be paramount. This does not necessarily mean that the investigations are simpler. At 11 a.m. on 13 October 2022, a pipe supplying CERN with water burst under the main road near the French–Swiss border, which had to be closed until early afternoon. The damage was quickly repaired by the Swiss services, and the road re-opened. But it was critical to understand whether this was an isolated incident affecting an individual pipe that had been in service for 20 years, or whether there was a risk of bursts in other ducts of the same type.

The services of the MM section, provided via cooperation agreements with CERN, are in wide demand externally

The damaged portion of the duct, measuring 1.7 m in length and 0.5 m in diameter, is the largest sample ever brought to the MM facilities for root-cause analysis (see “Water pipe” figure). As such, it required most of the available techniques to be deployed. For the receiving inspections, visual and radiographic testing and high-precision optical dimensional metrology in a volume of almost 17 m³ were used. For microstructural examinations, CT, micro-optical and SEM observations of the samples surrounding the crack – including a post-resin burn-off test – were carried out. The cracking (of one of the most common types found in water and sewer pipes) turned out to be the result of bending forces due to local soil movement. This generated a flexural constraint between the supported ends of the failing section, consisting of a concrete base on one side and a connection sleeve to the next pipe section on the opposite side. The change of boundary conditions may have been due to droughts during summer periods that altered the soil conditions. To the great relief of all, neither the composite material of the pipe nor its constituents were the main cause of the failure.

Beyond CERN

The services of the MM section, provided via cooperation agreements with CERN, are also in wide demand externally. ITER is a strong example. Since 2009, a major multi-year cooperation agreement has been in place specifically covering metallurgical and material testing for the construction of the ITER magnet and vacuum systems. The many results and achievements of this long-lasting cooperation include: the qualification of high-strength stainless-steel jacket material for the conductor of the ITER central solenoid, including its cryogenic properties; the development and application of advanced examination techniques to assess the vacuum pressure impregnation process used in the correction coils and their critical welds, which are not inspectable with conventional techniques; and the assessment of a high-strength austenitic stainless steel for the precompression structure of the central solenoid, involving forgings featuring an unprecedented combination of size and aspect ratio. The section has also been fully entrusted by the ITER organisation with major failure analyses, such as the root-cause analysis of a heavy gauge fastener of the toroidal-field gravity support system and, more recently, the analysis of leakage events in the thermal-shield cooling pipes of the ITER magnet system. Several agreements are also in place via the CERN knowledge transfer group for the assessment of structural materials for a fusion project beyond ITER, and for a subcritical fission reactor project.

Failure analysis of a water pipe

Also not to be forgotten is the major involvement of CERN in the Einstein Telescope project, for example in assessing suitable materials and fabrication solutions for its vacuum system, one of the largest ultra-high vacuum systems ever built. A three-year-long project that started in September 2022 aims to deliver the main technical design report for the Einstein Telescope beampipes, in which CERN’s contribution is structured in eight work packages spanning from design and materials choice to logistics, installation and surface treatments (CERN Courier September/October 2023 p45).

Beyond fundamental physics, the section is also working on the selection of materials for a future hydrogen economy, namely the definition of the proper specifications and procedures for operation in a liquid-hydrogen environment. The watchmaking industry, which places high requirements on materials, also cooperates in this field. It is expected that the section will also receive requests for even more collaboration projects from different fields.

It is quite true to say that materials are everywhere. The examples given here clearly show that in view of the ambitious goals of CERN, a highly interdisciplinary effort from materials and mechanical engineers is paramount to the proper selection and qualification of materials, parts and processes to enable the creation of the giant colliders and detectors that allow physicists to explore the fundamental constituents of the universe.

SHiP to chart hidden sector

Layout of the SHiP experiment

In March, CERN selected a new experiment called SHiP to search for hidden particles using high-intensity proton beams from the SPS. First proposed in 2013, SHiP is scheduled to operate in the North Area’s ECN3 hall from 2031, where it will enable searches for new physics at the “coupling frontier” complementary to those at high-energy and precision-flavour experiments.

Interest in hidden sectors has grown in recent years, given the absence of evidence for non-Standard Model particles at the LHC, yet the existence of several phenomena (such as dark matter, neutrino masses and the cosmic baryon asymmetry) that require new particles or interactions. It is possible that the reason why such particles have not been seen is not that they are too heavy but that they are light and extremely feebly interacting. With such small couplings and mixings, and thus long lifetimes, hidden particles are extremely difficult to constrain. Operating in a beam-dump configuration that will produce copious quantities of photons and charm and beauty hadrons, SHiP will generically explore hidden-sector particles in the MeV to multiple-GeV mass range.

Optimised searching

SHiP is designed to search for signatures of models with hidden-sector particles, which include heavy neutral leptons, dark photons and dark scalars, by full reconstruction and particle identification of Standard Model final states. It will also search for light dark-matter scattering signatures via the direct detection of atomic-electron or nuclear recoils in a high-density medium, and is optimised to make measurements of tau neutrinos and of neutrino-induced charm production by all three neutrino species.

The experiment will be built in the existing TCC8/ECN3 experimental facility in the North Area. The beam-dump setup consists of a high-density proton target located in the target bunker, followed by a hadron stopper and a muon shield. Sharing the SPS beam time with other fixed-target experiments and the LHC should allow around 6 × 10²⁰ protons on target to be produced during 15 years of nominal operation. The detector itself consists of two parts that are designed to be sensitive to as many physics models and final states as possible. The scattering and neutrino detector will search for light dark matter and perform neutrino measurements. Further downstream is the much larger hidden-sector decay spectrometer, which is designed to reconstruct the decay vertex of a hidden-sector particle, measure its mass and provide particle identification of the decay products in an extremely low-background environment.

One of the most critical and challenging components of the facility is the proton target, which has to sustain an energy of 2.6 MJ impinging on it every 7.2 s. Another is the muon shield. To control the beam-induced background from muons, the flux in the detector acceptance must be reduced by some six orders of magnitude over the shortest possible distance, for which an active muon shield entirely based on magnetic deflection has been developed.
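
For scale, a short back-of-the-envelope check (the assumed spill intensity of 4 × 10¹³ protons per extraction is not a figure quoted above): 400 GeV protons in such a spill carry about 2.6 MJ, a spill every 7.2 s corresponds to an average beam power of roughly 350 kW on the target, and the full dataset of 6 × 10²⁰ protons amounts to some 15 million spills.

```python
# Back-of-the-envelope check of the target figures quoted above.  The spill
# intensity of 4e13 protons per extraction is an assumption, not a figure
# given in the article.
e_joule_per_ev    = 1.602e-19   # J per eV
protons_per_spill = 4e13        # assumed SPS extraction intensity
proton_energy_ev  = 400e9       # SPS proton energy [eV] (from the text)
spill_period_s    = 7.2         # seconds between spills (from the text)

energy_per_spill = protons_per_spill * proton_energy_ev * e_joule_per_ev  # ~2.6 MJ
average_power    = energy_per_spill / spill_period_s                      # ~0.36 MW
spills_needed    = 6e20 / protons_per_spill                               # full dataset

print(f"energy per spill   ~ {energy_per_spill / 1e6:.1f} MJ")
print(f"average beam power ~ {average_power / 1e3:.0f} kW")
print(f"spills for 6e20 protons on target ~ {spills_needed:.1e}")
```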

The focus of the SHiP collaboration is now on producing technical design reports (TDRs) for the detector and the associated Beam Dump Facility (BDF). “Given adequate funding, we believe that the TDR phase for BDF/SHiP will take us about three years, followed by production and construction, with the aim to commission the facility towards the end of 2030 and the detector in 2031,” says SHiP spokesperson Andrey Golutvin of Imperial College London. “This will allow up to two years of data-taking during Run 4, before the start of Long Shutdown 4, which would be the obvious opportunity to improve or consolidate, if necessary, following the experience of the first years of data taking.”

The decision to proceed with SHiP concluded a process lasting more than a year that involved the Physics Beyond Colliders study group and the SPS and PS experiments committee. Two other experiments, HIKE and SHADOWS, had been proposed to exploit the high-intensity beam from the SPS. Continuing the successful tradition of kaon experiments in the ECN3 hall, which currently hosts the NA62 experiment, HIKE (high-intensity kaon experiment) proposed to search for new physics in rare charged and neutral kaon decays while also allowing on-axis searches for hidden particles. SHADOWS (search for hidden and dark objects with the SPS), which would have taken data concurrently with HIKE when the beamline was operated in beam-dump mode, focused on low-background searches for off-axis hidden-sector particles in the MeV–GeV region.

“In terms of their science, SHiP and HIKE/SHADOWS were ranked equally by the relevant scientific committees,” explains CERN director for research and computing Joachim Mnich. “But a decision had to be made, and SHiP was a strategic choice for CERN.”

Next-generation triggers for HL-LHC and beyond

ATLAS and CMS events at 13.6 TeV

The LHC experiments have surpassed expectations in their ability to squeeze the most out of their large datasets, demonstrating along the way the wealth of scientific understanding to be gained from improvements to data-acquisition pipelines. Colliding proton bunches at a rate of 40 MHz, the LHC produces a huge quantity of data that must be filtered in real time down to levels that are manageable for offline computing and the ensuing physics analysis. When the High-Luminosity LHC (HL-LHC) enters operation from 2029, data rates and event complexity will increase significantly further.
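
A toy estimate illustrates the scale of the problem (a hedged sketch: only the 40 MHz crossing rate is taken from the text above; the per-event size and affordable storage rate are illustrative assumptions rather than official ATLAS or CMS figures).

```python
# Toy illustration of why LHC collision data must be filtered online.
# Only the 40 MHz crossing rate comes from the article; the event size
# and the affordable storage rate are illustrative assumptions.

CROSSING_RATE_HZ = 40e6    # proton-bunch crossings per second
EVENT_SIZE_MB = 1.0        # assumed raw size of one recorded event
STORAGE_RATE_HZ = 1e3      # assumed rate that can be written to storage

raw_rate_TB_per_s = CROSSING_RATE_HZ * EVENT_SIZE_MB / 1e6
rejection_factor = CROSSING_RATE_HZ / STORAGE_RATE_HZ

print(f"raw data rate if everything were kept ~ {raw_rate_TB_per_s:.0f} TB/s")
print(f"online rejection required             ~ 1 in {rejection_factor:,.0f}")
```

Under these assumptions the trigger must reject tens of thousands of crossings for every one it keeps, which is why the collaborations invest so heavily in the real-time selection described below.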

To meet this challenge, the general-purpose LHC experiments ATLAS and CMS are preparing significant detector upgrades, which include improvements in the online filtering or trigger-selection processes. In view of the importance of this step, the collaborations seek to further enhance their trigger and analysis capabilities, and thus their scientific potential, beyond their currently projected scope.

Following a visit by a group of private donors, CERN, in close collaboration with the ATLAS and CMS collaborations, submitted a proposal in 2023 to the Eric and Wendy Schmidt Fund for Strategic Innovation, which resulted in the award of a $48 million grant. The donation laid the foundations of the Next Generation Triggers project, which kicked off in January 2024. The five-year project aims to accelerate novel computing, engineering and scientific ideas for the ATLAS and CMS upgrades, taking advantage of advanced AI techniques not only in large-scale data analysis and simulation but also embedded in front-end detector electronics. Avenues to be explored include quantum-inspired algorithms to improve simulations, heterogeneous computing architectures and new strategies to optimise the performance of GPU-accelerated experiment code. The project will also provide insight into detectors and data flows for future projects, such as experiments at the proposed Future Circular Collider, while the associated infrastructure will support the advancement of software and algorithms for simulations that are vital to the HL-LHC and future-collider physics programmes. Through the direct involvement of the CERN experimental physics, information technology and theory departments, the results of the project are expected to bring benefits across the lab's scientific programme.

The Next Generation Triggers project is broken down into four work packages. The first, on infrastructure, algorithms and theory, aims to improve machine-learning-assisted simulation and data collection, develop common frameworks and tools, and better exploit available and new computing infrastructures and platforms. The second will enhance the ATLAS trigger and data acquisition, focusing on improved and accelerated filtering and the detection of exotic signatures. The third will rethink CMS real-time data processing, extending the use of heterogeneous computing to the whole online reconstruction and designing a novel AI-powered real-time processing workflow to analyse every collision. The fourth, on education and outreach, will engage the community, industry and academia in the ambitious goals of the project, foster and train computing skills in the next generation of high-energy physicists, and complement existing successful community programmes with multidisciplinary subjects spanning physics, computing science and engineering.

“The Next Generation Triggers project builds upon and further enhances the ambitious trigger and data acquisition upgrades of the ATLAS and CMS experiments to unleash the full scientific potential of the HL-LHC,” says ATLAS spokesperson Andreas Hoecker.

“Its work packages also benefit other critical areas of the HL-LHC programme, and the results obtained will be valuable for future particle-physics experiments at the energy frontier,” adds Patricia McBride, CMS spokesperson.

CERN will have sole discretion over the implementation of the Next Generation Triggers scientific programme and how the project is delivered overall. In line with its Open Science Policy, CERN also pledges to release all IP generated as part of the project under appropriate open licences.

European strategy update

On 21 March the CERN Council decided to launch the process for updating the European strategy for particle physics – the cornerstone of Europe's decision-making process for the long-term future of the field. Mandated by the CERN Council, the strategy is formed through a broad consultation of the particle-physics community, conducted in concert with similar processes in the US and Japan to ensure coordination between regions and the optimal use of resources globally.

The deadline for submitting written input for the next strategy update has been set for 31 March 2025, with a view to concluding the process in June 2026. The strategy process is managed by the strategy secretariat, which the Council will establish during its June 2024 session.

The European strategy process was initiated by the CERN Council in 2005, placing the LHC at the top of particle physics’ scientific priorities, with a significant luminosity upgrade already being mooted. A ramp-up of R&D for future accelerators also featured high on the priority list, followed by coordination with a potential International Linear Collider and participation in a global neutrino programme.

The first strategy update in 2013, which kept the LHC as a top priority and attached increasing importance to its high-luminosity upgrade, stated that Europe needed to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the following strategy update. That charge was formulated in more detail in the second update, completed in 2020, which recommended a Higgs factory as the highest-priority facility to follow the LHC, with a technical and financial feasibility study for a next-generation hadron collider at the highest achievable energy to be pursued in parallel. A mid-term report on the resulting Future Circular Collider (FCC) feasibility study was submitted for review at the end of 2023 (CERN Courier March/April 2024 pp25–38), and the final report, expected in March 2025, will be a key input for the next strategy update.

More information about the third update of the European strategy, together with the call for input, will be issued by the strategy secretariat in due course.
