Ultra-peripheral collisions (UPCs) involving heavy ions and protons represent the energy frontier for photon-induced reactions. These high-energy photons can be used to study unique features of quarks and gluons inside nuclei, and can probe electromagnetic and electroweak interactions without the usual backgrounds associated with quantum-chromodynamic processes. The first edition of the international workshop on this subject took place from 10 to 15 December 2023 in Playa del Carmen, Mexico, bringing together about 90 participants, more than a third of whom were early-career researchers. It was the first time the international UPC community had gathered in a dedicated meeting, establishing a new international conference series on this active and expanding area of research.
The conference highlighted the impressive progress and diversity of UPC physics, which goes far beyond the initial studies of exclusive processes. UPC23 covered the latest results from experiments at RHIC and the LHC, and prospects for the future Electron-Ion Collider (EIC) at Brookhaven National Laboratory. Discussions delved into the intricacies of inelastic photo-nuclear events, including the exciting programme of open charm that is yet to be explored, and examined how UPCs serve as a novel lens for investigating the quark–gluon plasma and other final-state nuclear effects. Much attention was devoted to the physics of low-x parton densities – a fundamental aspect of protons and nuclei that photons can probe in a unique way.
Enriched understanding
Among the conference’s theoretical highlights, Farid Salazar (UCLA) showed how vector–meson photoproduction could be a powerful method to detect gluon saturation across different collision systems, from proton–nucleus to electron–nucleus to UPCs. Zaki Panjsheeri (Virginia) put forth innovative ideas to study double-parton correlations, linking UPC vector–meson studies to generalised parton distributions, enhancing our understanding of the proton’s structure. Ashik Ikbal (Kent State), meanwhile, introduced exciting proposals to investigate quantum entanglement through exclusive J/ψ photoproduction at RHIC.
The conference also provided a platform for discussing the active exploration of light-by-light scattering and two-photon processes, which can be used to probe fundamental physics, search for axion-like particles and constrain the anomalous magnetic moment of the tau lepton (see CMS closes in on tau g–2).
Energy exploration
Physicists at the LHC have effectively repurposed the world’s most powerful particle accelerator into a high-energy photon collider. This innovative approach, traditionally the domain of electron beams in colliders like LEP and HERA, and anticipated at the EIC, allows the LHC to explore photon-induced interactions at energies never before achieved. David Grund (Czech Technical University in Prague), Georgios Krintiras (Kansas) and Cesar Luiz Da Silva (Los Alamos) shared the latest LHC findings on the energy dependence of UPC J/ψ events. These results are crucial for understanding the onset of gluon saturation – a regime in which gluons become so dense that their emission and recombination reach a dynamical equilibrium. However, the data also align with the nuclear phenomenon known as gluon shadowing, which arises from multiple-scattering processes. David Tlusty (Creighton) presented the latest findings from the STAR Collaboration, which has recently expanded its UPC programme, complementing the energy exploration at the LHC. Klaudia Maj (AGH University of Krakow) presented the latest results on two-photon interactions and photonuclear jets from the ATLAS collaboration, including measurements that may be probing the quark–gluon plasma.
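As a rough guide (a standard leading-order kinematic estimate, not a result presented at the workshop), the gluon momentum fraction x probed in exclusive J/ψ photoproduction is set by the vector-meson mass and the photon–nucleon centre-of-mass energy:

```latex
% Leading-order estimate of the gluon x probed in exclusive J/psi photoproduction
x \simeq \frac{M_{J/\psi}^{2}}{W_{\gamma p}^{2}}
\qquad\Longrightarrow\qquad
W_{\gamma p} \approx 1\,\mathrm{TeV} \;\;\text{corresponds to}\;\; x \sim 10^{-5}.
```

This is why the energy dependence of UPC J/ψ production maps so directly onto the low-x gluon densities discussed above.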
Delegates discussed the future opportunities for UPC physics with the large integrated luminosity expected for Runs 3 and 4 at the LHC
Carlos Bertulani (Texas A&M) paid tribute to Gerhard Baur, who passed away on June 16 last year. Bertulani and Baur co-authored “Electromagnetic processes in relativistic heavy ion collisions” – a seminal paper with more than 1000 citations. Bertulani invited delegates to consider the untapped potential of UPCs in the study of anti-atoms and exotic atoms.
Delegates also discussed the future opportunities for UPC physics with the large integrated luminosity expected for Run 3 and Run 4 at the LHC, with planned detector upgrades for Run 4 such as FoCal, with the recent upgrades to STAR and the sPHENIX programme, and at the EIC. Delegates expect event selection and instrumentation close to the beam line, for example using “zero degree” calorimeters, to offer the greatest experimental opportunities in the coming years.
The next edition of the UPC conference will take place in Saariselkä, Finland, in June 2025.
Since his birth in Bohemia in 1924, Herwig Schopper has been a prisoner of war, an experimentalist with pioneering contributions in nuclear, accelerator and detector physics, director general (DG) of DESY and then CERN during a golden age for particle physics, and a celebrated science diplomat. Shortly after his centenary, his colleagues, family and friends gathered on 1 March to celebrate the life of the first DG in either institution to reach 100.
“He is a restless person,” noted Albrecht Wagner (DESY), who presented a whistlestop tour of Schopper’s 35 years working in Germany, following his childhood in Bohemia. Whether in Hamburg, Erlangen, Mainz or Karlsruhe, he never missed out on an opportunity to see new places – though always maintaining the Austrian diet to which his children attribute his longevity. On one occasion, Schopper took a sabbatical to work with Lise Meitner in Stockholm’s Royal Institute of Technology. At the time, the great physicist was performing the first nuclear-physics studies in the keV range, said Wagner, and directed Schopper to measure the absorption rate of beta-decay electrons in various materials using radioactive sources and a Geiger–Müller counter. Schopper is one of the last surviving physicists to have worked with her, observed Wagner.
Schopper’s scientific contributions have included playing a major part in the world’s first polarised proton source, Europe’s first R&D programme for superconducting accelerators and the development of hadronic calorimeters as precision instruments, explained Christian Fabjan (TU Vienna/HEPHY). Schopper dubbed the latter the sampling total absorption calorimeter, or STAC, playing on the detector’s stacked design, but the name didn’t stick. In recognition of his contributions, hadronic calorimeters might now be renamed Schopper total absorption calorimeters, joked Fabjan.
As CERN DG from 1981 to 1988, Schopper oversaw the lion’s share of the construction of LEP, before it began operations in July 1989. To accomplish this, he didn’t shy away from risks, budget cuts or unpopular opinions when the situation called for it, said Chris Llewellyn Smith, who would himself serve as DG from 1994 to 1998. Llewellyn Smith credited Schopper with making decisions that would benefit not only LEP, but also the LHC. “Watching Herwig deal with these reviews was a wonderful apprenticeship, during which I learned a lot about the management of CERN,” he recalled.
After passing CERN’s leadership to Carlo Rubbia, Schopper became a full-time science diplomat, notably including 20 years in senior roles at UNESCO between 1997 and 2017, and significant contributions to SESAME, the Synchrotron-light for Experimental Science and Applications in the Middle East (see CERN Courier January/February 2023, p28). Khaled Toukan of Jordan’s Atomic Energy Commission, CERN Council president Eliezer Rabinovici and Maciej Nałecz (Polish Academy of Science, formerly of UNESCO) all spoke of Schopper’s skill in helping to develop SESAME as a blueprint for science for peace and development. “Herwig likes building rings,” Toukan fondly recounted.
As with any good birthday party, Herwig received gifts: a first copy of his biography, a NASA hoodie emblazoned with “Failure is not an option” from Sam Ting (MIT), who has been closely associated with Schopper since their time together at DESY, and the Heisenberg medal. “You’ve even been in contact with the man himself,” noted Heisenberg Society president Johannes Blümer, referring to the several occasions on which Schopper met Heisenberg at conferences, once even discussing politics with him.
Schopper continues to counsel DGs to this day – and not only on physics. Confessing to occasionally being intimidated by his lifetime of achievements, CERN DG Fabiola Gianotti intimated that they often discuss music. “Herwig likes all composers, but not baroque ones. For him, they are too rational and intellectual.” For this, he will always have physics.
The triennial international conference on meson–nucleon physics and the structure of the nucleon (MENU) attracted more than 140 participants to the historic centre of Mainz from 16 to 20 October 2023.
Among MENU 2023’s highlights on nucleon structure, a preliminary analysis by the NNPDF collaboration suggests that the proton contains more charm than anticharm, with Niccolò Laurenti (Università degli Studi di Milano) showing evidence of a non-vanishing intrinsic valence charm contribution to the proton’s wavefunction. Meanwhile, Michael Kohl (Hampton University) concluded that the proton–radius puzzle is still not resolved. To make progress, form-factor measurements in electron scattering must be scrutinised, and the use of atomic spectroscopy data clarified, he said.
Hadron physics
A large part of this year’s conference was dedicated to hadron spectroscopy, with updates from Belle II, BESIII, GlueX, Jefferson Lab, JPAC, KLOE/KLOE-2 and LHCb, as well as theoretical overviews covering everything from lattice quantum chromodynamics to effective-field theories. Special emphasis was also given to future directions in hadron physics at facilities such as FAIR, the Electron-Ion Collider and the local Mainz Energy-Recovering Superconducting Accelerator (MESA) – a low-energy but high-intensity electron accelerator that will make it possible to carry out experiments in nuclear astrophysics, dark-sector searches and tests of the Standard Model. Among upgrade plans at Jefferson Lab, Eric Voutier (Paris-Saclay) presented a future experimental programme with positron beams at CEBAF, the institute’s Continuous Electron Beam Accelerator Facility. The upgrade will allow for a rich physics programme covering two-photon exchange, generalised polarisabilities, generalised parton distribution functions and direct dark-matter searches.
Highlights on nucleon structure include a preliminary analysis suggesting that the proton contains more charm than anticharm
Hadron physics is also closely related to searches for new physics, as precision observables of the Standard Model are in many cases limited by the non-perturbative regime of quantum chromodynamics. A prime example is the physics of the anomalous magnetic moment of the muon, for which a puzzling discrepancy between data-driven dispersive and lattice-QCD calculations of hadronic contributions to the Standard Model prediction persists (CERN Courier May/June 2021 p25). The upcoming collaboration meeting of the Muon g-2 Theory Initiative in September 2024 at KEK will provide important new insights from lattice QCD and e⁺e⁻ experiments. It remains to be seen whether the eventual theoretical consensus will confirm a significant deviation from the experimental value, which is currently being updated by Fermilab’s Muon g-2 experiment using its last three years of data.
The LHC experiments at CERN have been extremely successful in verifying the Standard Model (SM) of particle physics to very high precision. From the theoretical perspective, however, this model has two conceptual shortcomings. One is that the SM appears to be an “effective field theory” that is valid up to a certain energy scale only; the other is that gravity is not part of the model. This raises the question of what a theory comprising particle physics and gravity that is valid for all energy scales might look like. This directly leads to the domain of quantum gravity.
The typical scale associated with quantum-gravity effects is the Planck scale: 10¹⁵ TeV, or 10⁻³⁵ m. This exceeds the scales accessible at the LHC by approximately 14 orders of magnitude, forcing us to ask: what can theorists possibly gain from investigating physics at energies beyond the Planck scale? The answer is simple: the SM includes many free parameters that must be fixed by experimental data. Since the number of these parameters proliferates when higher-order interactions are included, one would like to constrain this high-dimensional parameter space.
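For orientation, a back-of-the-envelope check of these numbers (using the reduced Planck mass, the convention that matches the figures quoted above):

```latex
% Reduced Planck mass and standard Planck length
M_{\mathrm{Pl}} = \sqrt{\frac{\hbar c}{8\pi G}} \approx 2.4\times10^{18}\,\mathrm{GeV}/c^{2} \approx 2\times10^{15}\,\mathrm{TeV}/c^{2},
\qquad
\ell_{\mathrm{Pl}} = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6\times10^{-35}\,\mathrm{m},
% Ratio to the LHC centre-of-mass energy of 14 TeV
\frac{M_{\mathrm{Pl}}c^{2}}{E_{\mathrm{LHC}}} \approx \frac{2\times10^{15}\,\mathrm{TeV}}{14\,\mathrm{TeV}} \sim 10^{14}.
```

No conceivable collider closes this gap directly, which is why any handle on Planck-scale physics has to come from indirect consistency constraints on the SM parameter space.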
At low energies, this can be done by implementing bounds derived from demanding unitarity and causality of physical processes. Ideally, one would like to derive similar constraints from consistency at trans-Planckian scales where quantum-gravity effects may play a major role. At first sight, this may seem counterintuitive. It is certainly true that gravity treated as an effective field theory itself does not yield any effect measurable at LHC scales due to its weakness; the additional constraints then arise from requiring that the effective field theories underlying the SM and gravity can be combined and extended into a framework that is valid at all energy scales. Presumably, this will not work for all effective field theories. Taking a “bottom-up” approach (identifying the set of theories for which this extension is possible) may constrain the set of free parameters. Conversely, to be phenomenologically viable, any theory describing trans-Planckian physics must be compatible with existing knowledge at the scales probed by collider experiments. This “top-down” approach may then constrain the potential physics scenarios happening at the quantum-gravity scale – a trajectory that has been followed, for example, by the swampland programme initiated within string theory.
From the theoretical viewpoint, the SM is formulated in the language of relativistic quantum field theories. On this basis, it is possible that the top-down route becomes more realistic the closer the formulation of trans-Planckian physics sticks to this language. For example, string theory is a promising candidate for a consistent description of trans-Planckian physics. However, connecting the theory to the SM has proven to be very difficult, mainly due to the strong symmetry requirements underlying the formulation. In this regard, the “asymptotic safety” approach towards quantum gravity may offer a more tractable option for implementing the top-down idea since it uses the language of relativistic quantum field theory.
Asymptotic safety
What is the asymptotic-safety scenario, and how does it link quantum gravity to particle physics? Starting from the gravity side, we have a successful classical theory: Einstein’s general relativity. If one tries to upgrade this to a quantum theory, things go wrong very quickly. In the early 1970s, Gerard ’t Hooft and Martinus Veltman showed that the perturbative quantisation techniques that have proved highly successful for particle-physics theories fail for general relativity. In short, perturbative quantisation introduces an infinite number of parameters (one for each allowed local interaction) and thus requires an infinite number of independent measurements to determine what the values of those parameters are. Although this path leads us to a quantum theory of gravity valid at all scales, the construction lacks predictive power. Still, it results in a perfectly predictive effective field theory describing gravity up to the Planck scale.
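The root of the problem can be seen from a textbook dimensional-analysis argument (sketched here for context; it is not spelt out in the article):

```latex
% Einstein-Hilbert action: Newton's constant carries negative mass dimension
S_{\mathrm{EH}} = \frac{1}{16\pi G}\int \mathrm{d}^{4}x\,\sqrt{-g}\,R,
\qquad
[G] = (\mathrm{mass})^{-2},
% so the dimensionless expansion parameter grows with energy
G\,E^{2} \;\xrightarrow{\;E \,\to\, M_{\mathrm{Pl}}\;}\; \mathcal{O}(1).
```

Because the effective expansion parameter GE² grows with energy, each order in perturbation theory requires new, independent counterterms – precisely the proliferation of free parameters described above.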
This may seem discouraging when attempting to formulate a quantum field theory of gravity without introducing new symmetry principles, for example supersymmetry, to remove additional free parameters. A loophole is provided by Kenneth Wilson’s modern understanding of renormalisation. Here, the basic idea is to organise quantum fluctuations according to their momentum and integrate out these fluctuations, starting from the most energetic ones and proceeding towards lower-energy modes. This creates what is called the Wilsonian renormalisation-group “flow” of a theory. Healthy high-energy completions are provided by renormalisation-group fixed points. At these special points the theory becomes scale-invariant, which ensures the absence of divergences. The fixed point also provides predictive power via the condition that the renormalisation-group flow hits the fixed point at high energies (see “Safety belt” figure). For asymptotically free theories, where all interactions switch off at high energies, the underlying renormalisation-group fixed point is the free theory. This can be seen in the example of quantum chromodynamics (QCD): the gauge coupling diminishes when going to higher and higher energies, approaching a non-interacting fixed point at arbitrarily high energies. One can also envision high-energy completions based on a renormalisation-group fixed point with non-vanishing interactions, which is commonly referred to as asymptotic safety.
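Schematically (a standard illustration rather than a gravity-specific computation), the renormalisation-group flow of a coupling g and the two kinds of fixed point look as follows:

```latex
% RG flow of a coupling g with the energy scale mu; a fixed point g* is a zero of the beta function
\mu\,\frac{\mathrm{d}g}{\mathrm{d}\mu} = \beta(g), \qquad \beta(g_{*}) = 0.
% QCD example: the one-loop beta function is negative, so the coupling is driven to the
% free (Gaussian) fixed point g* = 0 at high energies -- asymptotic freedom
\beta(g_{s}) = -\frac{g_{s}^{3}}{16\pi^{2}}\left(11 - \tfrac{2}{3}\,n_{f}\right) < 0 \quad (n_{f} \le 16).
% Asymptotic safety: the flow is instead attracted to an interacting fixed point with g* != 0
```

Asymptotic freedom and asymptotic safety are thus two realisations of the same mechanism, distinguished only by whether the high-energy fixed point is free or interacting.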
Forces of nature
In the context of gravity, the asymptotic-safety scenario was first proposed by Steven Weinberg in the late 1970s. Starting with the seminal work by Martin Reuter (University of Mainz) in 1998, the existence of a renormalisation-group fixed point suitable for rendering gravity asymptotically safe – the so-called Reuter fixed point – is supported by a wealth of first-principle computations. While similar constructions are well known in condensed-matter physics, the Reuter fixed point is distinguished by the fact that it may provide a unified description of all forces of nature. As such, it may have profound consequences for our understanding of the physics inside a black hole, give predictions for parameters of the SM such as the Higgs-boson mass, or disfavour certain types of physics beyond the SM.
The asymptotic-safety approach towards quantum gravity may offer a more tractable option for implementing the top-down idea
The predictive power of the fixed point arises as follows. Only a finite number of parameters describe the consistent quantum field theories emanating from the fixed point. One then starts to systematically integrate out quantum fluctuations (from high to low energy), resulting in a family of effective descriptions in which the quantum fluctuations are taken into account. In practice, this process is implemented by the running of the theory’s couplings, generating what are known as renormalisation-group trajectories. To be phenomenologically viable, the endpoint of the renormalisation-group trajectory must be compatible with observations. In the end, only one (or potentially none) of the trajectories emanating from the fixed point will provide a description of nature (see “Going with the flow” image). According to the asymptotic-safety principle, this trajectory must be identified by fixing the free parameters left by the fixed point based on experiments. Once this process is completed, the construction fixes all couplings in the effective field theory in terms of a few free parameters. Since this entails an infinite number of relations that can be probed experimentally, the construction is falsifiable.
Particle physics link
The link to particle physics follows from the observation that the asymptotic-safety construction remains operative once gravity is supplemented by the matter fields of the SM. Non-abelian gauge groups – such as those underlying the electroweak and strong forces – as well as Yukawa interactions and fermion masses are readily accommodated. Many proof-of-concept studies show that this is feasible, gradually bringing the ultimate computation involving the full SM into reach. The fact that gravity remains interacting at the smallest length scales also implies that the construction will feature non-minimal couplings between matter and the gravitational field as well as matter self-interactions of a very specific type. The asymptotic-safety mechanism may then provide the foundation for a realistic quantum field theory unifying all fundamental forces of nature.
Can particle physics tell us whether this specific idea about quantum gravity is on the right track? After all, there is still a vast hierarchy between the energy scales probed by collider experiments and the Planck scale. Surprisingly, the answer is positive! Conceptually, the interacting renormalisation-group fixed point for the gravity–matter theory again gives a set of viable quantum field theories in terms of a fixed number of free parameters. First estimates conducted by Jan Pawlowski and coworkers at Heidelberg University suggest that this number is comparable to the number of free parameters in the SM.
In practice, one may then be tempted to make the following connection. Currently, observables probed by collider physics are derived from the SM effective field theory. Hence, they depend on the couplings of the effective field theory. The asymptotic-safety mechanism expresses these couplings in terms of the free parameters associated with the interacting fixed point. Once the SM effective field theory is extended to include operators of sufficiently high mass dimension, the asymptotic-safety dictum predicts highly non-trivial relations between the couplings parameterising the effective field theory. These relations can be confronted with observations that test whether the observables measured experimentally are subject to these constraints. This can either be provided by matching to existing particle-physics data obtained at the LHC, or by astrophysical observations probing the strong-gravity regime. The theoretical programme of deriving such relations is currently under development. A feasible benchmark, showing that the underlying physics postulates are on the right track, would then be to “post-dict” the experimental results already available. Showing that a theory formulated at the Planck scale is compatible with the SM effective field theory would be a highly non-trivial achievement in itself.
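Schematically (in our own notation, not that of the programme itself), if the fixed point leaves n free parameters while the SM effective field theory truncated at a given mass dimension contains N ≫ n couplings, asymptotic safety fixes every coupling in terms of the few free ones:

```latex
% Wilson coefficients at the matching scale mu_0, determined by n fixed-point parameters theta
c_{i}(\mu_{0}) = F_{i}(\theta_{1},\dots,\theta_{n}), \qquad i = 1,\dots,N, \qquad N \gg n,
```

so that eliminating the θ's leaves of order N − n relations among measurable couplings – the “highly non-trivial relations” referred to above.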
Showing that a theory formulated at the Planck scale is compatible with the SM effective field theory would be a highly non-trivial achievement in itself
This line of testing quantum gravity experimentally may be seen as orthogonal to more gravity-focused tests that attempt to decipher the quantum nature of gravity. Recent ideas in these directions have evolved around developing tabletop experiments that probe the quantum superposition of macroscopic objects at sub-millimetre scales, which could ultimately be developed into a quantum-Cavendish experiment that probes the gravitational field of source masses in spatial quantum superposition states. The emission of a graviton could then lead to decoherence effects which give hints that gravity indeed has a force carrier similar to the other fundamental forces. Of course, one could also hope that experiments probing gravity in the strong-gravity regime find deviations from general relativity. So far, this has not been the case. This is why particle physics may be a prominent and fruitful arena in which to also test quantum-gravity theories such as asymptotic safety in the future.
For decades, quantum-gravity research has been disconnected from directly relevant experimental data. As a result, the field has developed a vast variety of approaches that aim to understand the laws of physics at the Planck scale. These include canonical quantisation, string theory, the AdS/CFT correspondence, loop quantum gravity and spin foams, causal dynamical triangulations, causal set theory, group field theory and asymptotic safety. The latter has recently brought a new perspective on the field: supplementing the quantum-gravity sector of the theory by the matter degrees of freedom of the SM opens an exciting window through which to confront the construction with existing particle-physics data. As a result, this leads to new avenues of research at the intersection between particle physics and gravity, marking the onset of a new era in quantum-gravity research in which the field travels from a purely theoretical to an observationally guided endeavour.
The nature of CERN’s research often demands unusual and highly complex materials to be developed and tested. A good example is the LHC beam screen that limits the energy transfer from the beam to the cold mass of the magnets, for which a new non-magnetic stainless steel had to be developed in the mid-1990s to meet the physical and mechanical requirements at cryogenic temperatures. The same is true of the external cylinder of the CMS solenoid magnet, for which a process enabling the production of 7 m-diameter high-strength seamless aluminium-alloy rings had to be identified and qualified. Another breakthrough at the LHC has been the solution adopted for the end covers of the cold masses of the dipole magnets, for which 2500 stainless-steel covers were produced by powder metallurgy and hot isostatic pressing – qualifying this innovative shaping solution for the first time for massive, fully reliable leak-tight operation at cryogenic temperatures.
Similar challenges apply today for the High-Luminosity LHC (HL-LHC), which is due to operate from 2029. For the HL-LHC radio-frequency crab cavities, which will tilt the beams at the collision points to maximise the luminosity, niobium and niobium-titanium alloy products have been carefully identified and qualified. Niobium additive-manufactured at CERN achieved a record purity and conductivity for this kind of product. For the new HL-LHC magnets, which are necessary to focus the beams more tightly at the collision points, detailed qualifications of the soundness of niobium-tin (Nb3Sn) coils have been critical, as has the development and qualification of methods to test the weld of the quadrupole magnet cold masses.
These and numerous other projects are the domain of the CERN materials, metrology and non-destructive testing (EN–MME–MM) section, whose mission is to provide materials-science support for accelerators and detectors across the whole CERN community, in close coordination with the mechanical design and production facilities of the EN-MME group. The interdisciplinary, expert-staffed section guarantees full life-cycle management of materials – from functional requirements to prototyping, series production, inspection and end-of-life – and includes the identification or development of material solutions, the specification and qualification of suppliers, the definition of manufacturing and inspection plans, and inspections of received materials and parts before and after their integration into the machines and experiments. This challenging mission requires advanced microscopic materials analysis, high-precision optical metrology, mechanical static and cyclic measurements, including at cryogenic temperatures, and, last but not least, state-of-the-art non-destructive testing techniques (see “Section facilities” figure).
The future of particle accelerators is strongly linked to the development of high-field superconducting magnets that enable higher energies and luminosities to be attained. The HL-LHC will be the first operational facility to employ high-performance Nb3Sn accelerator magnets, surpassing the intrinsic performance limitations of NbTi-based magnets as used for the LHC. The fabrication of Nb3Sn magnets is a challenging process because the conductor is an extremely brittle intermetallic phase. While the difficulty of working with brittle compounds is reduced using the traditional wind-react-and-impregnate approach, uncertainties remain due to volume changes associated with phase transformations occurring during the reaction heat treatment necessary to form the Nb3Sn phase.
Needle in a haystack
To investigate the root causes of performance limitation or degradation observed on early magnets, several HL-LHC dipole and quadrupole magnet coils were examined. This project has been one of the most complex failure analyses ever undertaken by the MM section, demanding that an innovative investigation methodology be devised and applied at several fabrication stages and after cool-down and powering. Internal shear and bending loads on unsupported superconducting wires, which can cause their dislocation as well as cracks in the aggregates of Nb3Sn filaments, were suspected to be the main cause of limitation or degradation. Like hunting for a needle within a massive haystack, the challenge was to find microscopic damage at the level of the filaments in the large volume of coils covering a length up to 7.2 m.
Starting in 2020 with 11 T magnet-coil ends, a sequence of mesoscale observations of whole coil sections was carried out non-destructively using innovative high-energy X-ray computed tomography (CT). This enabled the critical volumes to be identified and was followed up with a microscopic assessment of internal events, geometrical distortions and potential flaws using advanced microscopy. As a result, the MM section was able to unequivocally identify strands with transversely broken elements (see “Dipole diagnostics” and “Cracking niobium tin” figures). Techniques such as scanning electron microscopy (SEM) and focussed ion beam (FIB) were used to analyse damage to strands or sub-elements at particular localised positions as well as failure modes. In addition, a deep-etching technique allowed a decisive observation of completely broken filaments (see “HL-LHC coils up close” figure). Taken together, this comprehensive approach provided an in-depth view of the examined coils by identifying and characterising atypical features and imperfections in both the superconducting phase of the strands and the glass fibre/resin insulation system. It also clearly associated the quenches (a sudden loss of the superconducting state) experienced by the coils with physical events, namely broken superconducting filaments or damaged strands. The successful analysis of the CERN coil limitations led the MM section to receive several coils from different non-conforming quadrupole magnets, fabricated in the US in the framework of the Accelerator Upgrade Project collaboration, and successfully carry out the same type of investigations.
This highly effective approach and key results on Nb3Sn accelerator magnets were made possible thanks to the wide experience gained with previous applications of CT techniques to the magnet system of the ITER fusion experiment, which employs the Nb3Sn conductor on a massive scale. The aim of such investigations is not only to understand what went wrong, no matter how difficult and complex that might be, but also to identify remedial actions. For the HL-LHC magnets, the MM section has contributed widely to the introduction of effective recovery measures, improved coil manufacturing and cold-mass assembly processes, and the production of magnets with reproducible behaviour and no sign of degradation. These results led to the conclusion that the root cause of the performance limitation of previous long CERN magnets has been identified and can now be overcome for future applications, as is the case for Nb3Sn quadrupole magnets.
Structural support
Investigating the massive HL-LHC coils required a high-energy (6 MeV) linac CT that was subcontracted to TEC Eurolab in Italy and Diondo GmbH in Germany, two of only a few companies in the world that are equipped with this technique. However, the MM section also has an X-ray CT facility with an energy of 225 keV, which enables sufficient penetration for less massive samples. One of the most recent of countless examples employing this technique concerns the staves for the future ATLAS tracker (ITk) for the HL-LHC upgrade. During 2023 a significant fraction of the ITk modules suffered from early high-voltage breakdowns, despite appearing to perform satisfactorily during earlier stages of quality control. A subset of these modules exhibited breakdowns following thermal cycling, with some failing during the cold phases of the cycle. Additionally, others experienced breakdowns after being loaded onto their supporting staves. High-resolution CT scans at CERN combined with other techniques confirmed the presence and propagation of cracks through the entire sensor thickness, and enabled the MM team to identify the gluing process between the carbon structure and the sensors as the root cause of the vulnerability, which is now being addressed by the ATLAS project team (see “ATLAS modules” figure). Also for the HL-LHC, the section is working on the internalisation of the beryllium vacuum-chamber fabrication technology required for the experiments.
While carrying out failure analyses of extremely high-tech components is the core business of the MM section, in some cases understanding the failure of the most basic objects can be paramount. This does not necessarily mean that the investigations are simpler. At 11 a.m. on 13 October 2022, a pipe supplying CERN with water burst under the main road near the French–Swiss border, which was closed until early afternoon. The damage was quickly repaired by the Swiss services, and the road re-opened. But it was critical to understand if this was an isolated incident of an individual pipe, in service for 20 years, or if there was the potential risk of bursts in other ducts of the same type.
The services of the MM section, provided via cooperation agreements with CERN, are in wide demand externally
The damaged portion of the duct, measuring 1.7 m in length and 0.5 m in diameter, is the largest sample ever brought to the MM facilities for root-cause analysis (see “Water pipe” figure). As such, it required most of the available techniques to be deployed. For the receiving inspections, visual and radiographic testing and high-precision optical dimensional metrology in a volume of almost 17 m³ were used. For microstructural examinations, CT, micro-optical and SEM observations on the samples surrounding the crack – including a post-resin burn-off test – were carried out. The cracking (of one of the most common types found in water and sewer pipes) turned out to be the result of bending forces due to local soil movement. This generated a flexural constraint between the supported ends of the failing section, consisting of a concrete base on one side and a connection sleeve to the next pipe section on the opposite side. The change of boundary conditions may have been due to droughts during summer periods that altered the soil conditions. To the great relief of all, neither the composite material of the pipe nor its constituents were the main cause of the failure.
Beyond CERN
The services of the MM section, provided via cooperation agreements with CERN, are also in wide demand externally. ITER is a strong example. Since 2009, a major multi-year cooperation agreement has been in place specifically covering metallurgical and material testing for the construction of the ITER magnet and vacuum systems. The many results and achievements of this long-lasting cooperation include: the qualification of high-strength stainless-steel jacket material for the conductor of the ITER central solenoid, including its cryogenic properties; the development and application of advanced examination techniques to assess the vacuum pressure impregnation process used in the correction coils and their critical welds, which are not inspectable with conventional techniques; and the assessment of a high-strength austenitic stainless steel for the precompression structure of the central solenoid, involving forgings featuring an unprecedented combination of size and aspect ratio. The section has also been fully entrusted by the ITER organisation with major failure analyses, such as the root-cause analysis of a heavy gauge fastener of the toroidal-field gravity support system and, more recently, the analysis of leakage events in the thermal-shield cooling pipes of the ITER magnet system. Several agreements are also in place via the CERN knowledge transfer group for the assessment of structural materials for a fusion project beyond ITER, and for a subcritical fission reactor project.
Also not to be forgotten is the major involvement of CERN in the Einstein Telescope project, for example in assessing suitable materials and fabrication solutions for its vacuum system, one of the largest ultra-high vacuum systems ever built. A three-year-long project that started in September 2022 aims to deliver the main technical design report for the Einstein Telescope beampipes, in which CERN’s contribution is structured in eight work packages spanning design and materials choice to logistics, installation and surface treatments (CERN Courier September/October 2023 p45).
Beyond fundamental physics, the section is also working on the selection of materials for a future hydrogen economy, namely the definition of the proper specification and procedures for operation in a liquid-hydrogen environment. The watchmaking industry, which places high requirements on materials, also cooperates in this field. It is expected that the section will also receive requests for even more collaboration projects for different fields.
It is quite true to say that materials are everywhere. The examples given here clearly show that in view of the ambitious goals of CERN, a highly interdisciplinary effort from materials and mechanical engineers is paramount to the proper selection and qualification of materials, parts and processes to enable the creation of the giant colliders and detectors that allow physicists to explore the fundamental constituents of the universe.
In March, CERN selected a new experiment called SHiP to search for hidden particles using high-intensity proton beams from the SPS. First proposed in 2013, SHiP is scheduled to operate in the North Area’s ECN3 hall from 2031, where it will enable searches for new physics at the “coupling frontier” complementary to those at high-energy and precision-flavour experiments.
Interest in hidden sectors has grown in recent years, given the absence of evidence for non-Standard Model particles at the LHC, yet the existence of several phenomena (such as dark matter, neutrino masses and the cosmic baryon asymmetry) that require new particles or interactions. It is possible that the reason why such particles have not been seen is not that they are too heavy but that they are light and extremely feebly interacting. With such small couplings and mixings, and thus long lifetimes, hidden particles are extremely difficult to constrain. Operating in a beam-dump configuration that will produce copious quantities of photons and charm and beauty hadrons, SHiP will generically explore hidden-sector particles in the MeV to multiple-GeV mass range.
Optimised searching
SHiP is designed to search for signatures of models with hidden-sector particles, which include heavy neutral leptons, dark photons and dark scalars, by full reconstruction and particle identification of Standard Model final states. It will also search for light-dark-matter scattering signatures via the direct detection of atomic-electron or nuclear recoils in a high-density medium, and is optimised to make measurements of tau neutrinos and of neutrino-induced charm production by all three neutrino species.
The experiment will be built in the existing TCC8/ECN3 experimental facility in the North Area. The beam-dump setup consists of a high-density proton target located in the target bunker, followed by a hadron stopper and a muon shield. Sharing the SPS beam time with other fixed-target experiments and the LHC should allow around 6 × 10²⁰ protons on target to be produced during 15 years of nominal operation. The detector itself consists of two parts that are designed to be sensitive to as many physics models and final states as possible. The scattering and neutrino detector will search for light dark matter and perform neutrino measurements. Further downstream is the much larger hidden-sector decay spectrometer, which is designed to reconstruct the decay vertex of a hidden-sector particle, measure its mass and provide particle identification of the decay products in an extremely low-background environment.
One of the most critical and challenging components of the facility is the proton target, which has to sustain an energy of 2.6 MJ impinging on it every 7.2 s. Another is the muon shield. To control the beam-induced background from muons, the flux in the detector acceptance must be reduced by some six orders of magnitude over the shortest possible distance, for which an active muon shield entirely based on magnetic deflection has been developed.
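For scale (a simple average computed from the figures above, not a number quoted by the collaboration), the target must absorb a time-averaged beam power of roughly

```latex
\bar{P} \;\approx\; \frac{2.6\,\mathrm{MJ}}{7.2\,\mathrm{s}} \;\approx\; 0.36\,\mathrm{MW},
```

i.e. several hundred kilowatts averaged over the SPS cycle, with the instantaneous load during each spill higher still.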
One of the most critical and challenging components of the facility is the proton target
The focus of the SHiP collaboration now is to produce technical design reports. “Given adequate funding, we believe that the TDR phase for BDF/SHiP will take us about three years, followed by production and construction, with the aim to commission the facility towards the end of 2030 and the detector in 2031,” says SHiP spokesperson Andrey Golutvin of Imperial College London. “This will allow up to two years of data-taking during Run 4, before the start of Long Shutdown 4, which would be the obvious opportunity to improve or consolidate, if necessary, following the experience of the first years of data taking.”
The decision to proceed with SHiP concluded a process that took more than a year, involving the Physics Beyond Colliders study group and the SPS and PS experiments committee. Two other experiments, HIKE and SHADOWS, were proposed to exploit the high-intensity beam from the SPS. Continuing the successful tradition of kaon experiments in the ECN3 hall, which currently hosts the NA62 experiment, HIKE (high-intensity kaon experiment) proposed to search for new physics in rare charged and neutral kaon decays while also allowing on-axis searches for hidden particles. For SHADOWS (search for hidden and dark objects with the SPS), which would have taken data concurrently with HIKE when the beamline is operated in beam-dump mode, the focus was low-background searches for off-axis hidden-sector particles in the MeV-GeV region.
“In terms of their science, SHiP and HIKE/SHADOWS were ranked equally by the relevant scientific committees,” explains CERN director for research and computing Joachim Mnich. “But a decision had to be made, and SHiP was a strategic choice for CERN.”
The LHC experiments have surpassed expectations in their ability to squeeze the most out of their large datasets, also demonstrating the wealth of scientific understanding to be gained from improvements to data-acquisition pipelines. Colliding proton bunches at a rate of 40 MHz, the LHC produces a huge quantity of data that must be filtered in real-time to levels that are manageable for offline computing and ensuing physics analysis. When the High-Luminosity LHC (HL-LHC) enters operation from 2029, the data rates and event complexity will further increase significantly.
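To make the scale of that real-time filtering concrete, the sketch below runs the arithmetic for a generic two-stage trigger chain; the event size and the level-1 and high-level-trigger output rates are illustrative assumptions chosen for the exercise, not the experiments’ actual design parameters.

```python
# Illustrative back-of-the-envelope for a two-stage trigger chain.
# All numbers are assumptions chosen for the arithmetic, not official
# ATLAS or CMS design values.

BUNCH_CROSSING_RATE_HZ = 40e6   # LHC bunch-crossing rate quoted in the text
EVENT_SIZE_BYTES = 1.5e6        # assumed raw event size (~1.5 MB)

# Assumed accept rates of a hardware first-level trigger and a software
# high-level trigger (illustrative orders of magnitude only).
LEVEL1_OUTPUT_HZ = 1e6
HLT_OUTPUT_HZ = 10e3


def data_rate_bytes_per_s(event_rate_hz: float, event_bytes: float) -> float:
    """Data volume per second, in bytes, at a given event rate."""
    return event_rate_hz * event_bytes


if __name__ == "__main__":
    stages = [
        ("before any selection", BUNCH_CROSSING_RATE_HZ),
        ("after level-1 trigger", LEVEL1_OUTPUT_HZ),
        ("after high-level trigger", HLT_OUTPUT_HZ),
    ]
    for label, rate in stages:
        tb_per_s = data_rate_bytes_per_s(rate, EVENT_SIZE_BYTES) / 1e12
        print(f"{label:>25}: {rate:10.3g} Hz  ->  {tb_per_s:8.4f} TB/s")
    # Overall rejection factor delivered by the full chain
    print(f"overall rejection factor: {BUNCH_CROSSING_RATE_HZ / HLT_OUTPUT_HZ:.0f}")
```

Even with modest assumptions, the untriggered data volume comes out orders of magnitude beyond what can be stored, which is why the quality of the trigger selection directly shapes the physics reach.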
To meet this challenge, the general-purpose LHC experiments ATLAS and CMS are preparing significant detector upgrades, which include improvements in the online filtering or trigger-selection processes. In view of the importance of this step, the collaborations seek to further enhance their trigger and analysis capabilities, and thus their scientific potential, beyond their currently projected scope.
Following a visit by a group of private donors, in 2023 CERN, in close collaboration with the ATLAS and CMS collaborations, submitted a proposal to the Eric and Wendy Schmidt Fund for Strategic Innovation, which resulted in the award of a $48 million grant. The donation laid the foundations of the Next Generation Triggers project, which kicked off in January 2024. The five-year-long project aims to accelerate novel computing, engineering and scientific ideas for the ATLAS and CMS upgrades, also taking advantage of advanced AI techniques, not only in large-scale data analysis and simulation but also embedded in front-end detector electronics. These include quantum-inspired algorithms to improve simulations, and heterogeneous computing architectures and new strategies to optimise the performance of GPU-accelerated experiment code. The project will also provide insight into detectors and data flows for future projects, such as experiments at the proposed Future Circular Collider, while the associated infrastructure will support the advancement of software and algorithms for simulations that are vital to the HL-LHC and future-collider physics programmes. Through the direct involvement of the CERN experimental physics, information technology and theory departments, it is expected that results from the project will bring benefits across the lab’s scientific programme.
The Next Generation Triggers project is broken down into four work packages: infrastructure, algorithms and theory (to improve machine learning-assisted simulation and data collection, develop common frameworks and tools, and better leverage available and new computing infrastructures and platforms); enhancing the ATLAS trigger and data acquisition (to focus on improved and accelerated filtering and exotic signature detection); rethinking the CMS real-time data processing (to extend the use of heterogeneous computing to the whole online reconstruction and to design a novel AI-powered real-time processing workflow to analyse every collision); and education programmes and outreach to engage the community, industry and academia in the ambitious goals of the project, foster and train computing skills in the next generation of high-energy physicists, and complement existing successful community programmes with multi-disciplinary subjects across physics, computing science and engineering.
“The Next Generation Triggers project builds upon and further enhances the ambitious trigger and data acquisition upgrades of the ATLAS and CMS experiments to unleash the full scientific potential of the HL-LHC,” says ATLAS spokesperson Andreas Hoecker.
“Its work packages also benefit other critical areas of the HL-LHC programme, and the results obtained will be valuable for future particle-physics experiments at the energy frontier,” adds Patricia McBride, CMS spokesperson.
CERN will have sole discretion over the implementation of the Next Generation Triggers scientific programme and how the project is delivered overall. In line with its Open Science Policy, CERN also pledges to release all IP generated as part of the project under appropriate open licences.
On 21 March the CERN Council decided to launch the process for updating the European strategy for particle physics – the cornerstone of Europe’s decision-making process for the long-term future of the field. Mandated by the CERN Council, the European strategy is formed through a broad consultation of the particle-physics community and in close coordination with similar processes in the US and Japan, to ensure coordination between regions and optimal use of resources globally.
The deadline for submitting written input for the next strategy update has been set for 31 March 2025, with a view to concluding the process in June 2026. The strategy process is managed by the strategy secretariat, which the Council will establish during its June 2024 session.
The European strategy process was initiated by the CERN Council in 2005, placing the LHC at the top of particle physics’ scientific priorities, with a significant luminosity upgrade already being mooted. A ramp-up of R&D for future accelerators also featured high on the priority list, followed by coordination with a potential International Linear Collider and participation in a global neutrino programme.
The final report of the FCC feasibility study will be a key input for the next strategy update
The first strategy update in 2013, which kept the LHC as a top priority and attached increasing importance to its high-luminosity upgrade, stated that Europe needs to be in a position to propose an ambitious post-LHC accelerator project at CERN by the time of the next strategy update. The latter charge was formulated in more detail in the second strategy update, completed in 2020, which recommended a Higgs factory as the highest priority to follow the LHC and that a technical and financial feasibility study should be pursued in parallel for a next-generation hadron collider at the highest achievable energy. A mid-term report on the resulting Future Circular Collider feasibility study was submitted for review at the end of 2023 (CERN Courier March/April 2024 pp25–38) and the final report, expected in March 2025, will be a key input for the next strategy update.
More information about the third update of the European strategy, together with the call for input, will be issued by the strategy secretariat in due course.
The Electron–Ion Collider (EIC), located at Brookhaven National Laboratory and being built in partnership with Jefferson Lab, has taken a step closer to construction. In April the US Department of Energy (DOE) approved “Critical Decision 3A”, which gives the formal go-ahead to purchase long-lead procurements for the facility.
The EIC will offer the unique ability to collide a beam of polarised high-energy electrons with polarised protons, polarised lightweight ions, or heavy ions. Its aim is to produce 3D snapshots or “nuclear femtography” of the inner structure of nucleons to gain a deeper understanding of how quarks and gluons give rise to properties such as spin and mass (CERN Courier October 2018 p31). The collider, which will make use of infrastructure currently used for the Relativistic Heavy Ion Collider and is costed at between $1.7 and 2.8 billion, is scheduled to enter construction in 2026 and to begin operations in the first half of the next decade.
By passing the latest DOE project milestone, the EIC project partners can now start ordering key components for the accelerator, detector and infrastructure. These include superconducting wires and other materials, cryogenic equipment, the experimental solenoid, lead-tungstate crystals and scintillating fibres for detectors, electrical substations and support buildings. “The EIC project can now move forward with the execution of contracts with industrial partners that will significantly reduce project technical and schedule risk,” said EIC project director Jim Yeck.
More than 1500 physicists from nearly 300 laboratories and institutes worldwide are members of the EIC user group. Earlier this year the DOE and the CNRS signed a statement of interest concerning the contribution of researchers in France, while the UK announced that it will invest £58.8 million to develop the necessary detector and accelerator technologies.
The BESIII collaboration has marked a significant milestone: the completion of its 15-year campaign to collect 20 fb⁻¹ of e⁺e⁻ collision data at the ψ(3770) resonance. The sample, collected in two main running periods, 2010–2011 and 2022–2024, is more than 20 times larger than the world’s previous charm-threshold data set collected by the CLEO-c experiment in the US.
BESIII is an experiment situated on the BEPCII storage ring at IHEP in Beijing. It involves more than 600 physicists drawn not only from China but also from other nations, including Germany, Italy, Poland, the Netherlands, Sweden and the UK among the CERN member states. The detector has collected data at a range of running points with centre-of-mass energies from 1.8 to 4.95 GeV, most of which are inaccessible to other operating colliders. This energy regime allows researchers to make largely unique studies of physics above and below the charm threshold, and has led to important discoveries and measurements in light-meson spectroscopy, non-perturbative QCD, and charm and tau physics.
The ψ(3770), discovered at SLAC in 1977, is the lightest charmonium state above the open-charm threshold. Charmonium consists of a bound charm quark and anti-charm quark, whereas open-charm states such as D0 and D+ mesons are systems in which the charm quark co-exists with a different anti-quark. The ψ(3770) can decay into D and anti-D mesons, whereas charmonium states below threshold, such as the J/ψ, are too light to do so, and must instead decay through annihilation of the charm and anticharm quarks.
The sample is more than 20 times larger than the world’s previous charm-threshold data set
Open-charm mesons are also produced in copious quantities at the LHC and at Belle II. However, in ψ(3770) decays at BESIII they are produced in pairs, with no accompanying particles. This makes the BESIII sample a uniquely clean laboratory in which to study the properties of D mesons. If one meson is reconstructed, or tagged, in a known charm decay, the other meson in the event can be analysed in an unbiased manner. When reconstructed in a decay of interest, the unbiased sample of mesons can be used to measure absolute branching fractions and the relative phases between any intermediate resonances in the D decay.
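The tagging logic translates into a simple counting relation (the standard double-tag method; the notation here is generic rather than BESIII’s own, and small corrections from the quantum correlation of the D pair are neglected):

```latex
% Single-tag yield: one D of the pair reconstructed in a tag mode with branching fraction B_tag
N_{\mathrm{ST}} = 2\,N_{D\bar{D}}\,\mathcal{B}_{\mathrm{tag}}\,\epsilon_{\mathrm{ST}},
% Double-tag yield: the other D additionally reconstructed in the signal mode
N_{\mathrm{DT}} = 2\,N_{D\bar{D}}\,\mathcal{B}_{\mathrm{tag}}\,\mathcal{B}_{\mathrm{sig}}\,\epsilon_{\mathrm{DT}},
% Absolute signal branching fraction, independent of the total number of D pairs produced
\mathcal{B}_{\mathrm{sig}} = \frac{N_{\mathrm{DT}}}{N_{\mathrm{ST}}}\,\frac{\epsilon_{\mathrm{ST}}}{\epsilon_{\mathrm{DT}}}.
```

The same tagged samples give access to the relative phases between intermediate resonances in multi-body D decays.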
“Both sets of information are not only interesting in themselves, but also vital for studies with charm and beauty mesons at LHCb and Belle II,” explains Guy Wilkinson of the University of Oxford. “For example, measurements of phase information performed by BESIII with the first tranche of ψ(3770) data have been essential input in the world-leading determination of the CP-violating angle γ of the unitarity triangle by LHCb in events where a beauty meson decays into a D meson and an accompanying kaon.” Exploitation of the full 20 fb⁻¹ sample will be essential in helping LHCb and Belle II realise their full potential in CP-violation measurements with larger data sets in the future, he adds. “Hence BESIII is very complementary to the higher energy experiments, demonstrating the strong synergies that exist between particle-physics facilities worldwide.”
This summer, BEPCII will undergo an upgrade that will increase its luminosity. Over the rest of the decade more data will be taken above and below the charm threshold. In the longer term, there are plans, elsewhere in China, for a Super Tau Charm Facility – an accelerator that would build on the BEPCII and BESIII programme with datasets that are two orders of magnitude larger.