Tabletop experiment constrains neutrino size

The BeEST experiment

How big is a neutrino? Though the answer depends on the physical process that created it, knowledge of the size of neutrino wave packets is at present so wildly unconstrained that every measurement counts. New results from the Beryllium Electron capture in Superconducting Tunnel junctions (BeEST) experiment at TRIUMF, Canada, set new lower limits on the size of the neutrino’s wave packet in terrestrial experiments – though theorists are at odds over how to interpret the data.

Neutrinos are created as a mixture of mass eigenstates. Each eigenstate is a wave packet with a unique group velocity. If the wave packets are too narrow, they eventually stop overlapping as the wave evolves, and quantum interference is lost. If the wave packets are too broad, a single mass eigenstate is resolved by Heisenberg’s uncertainty principle, and quantum interference is also lost. No quantum interference means no neutrino oscillations.
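These two coherence conditions can be sketched quantitatively. The following is a standard textbook-style estimate, not spelled out in the article (symbols: σx is the wave-packet size, E the neutrino energy, Δm² the mass-squared splitting, L the baseline):

```latex
% Ultrarelativistic mass eigenstates have group velocities
v_i \approx c\left(1 - \frac{m_i^2 c^4}{2E^2}\right)
\;\Rightarrow\;
L_{\rm coh} \approx \frac{c\,\sigma_x}{\Delta v}
           = \frac{2E^2}{\Delta m^2 c^4}\,\sigma_x
\;\Rightarrow\;
\sigma_x \gtrsim \frac{\Delta m^2 c^4}{2E^2}\,L
\quad \text{(packets still overlap at baseline } L\text{)}
% Conversely, the packet's momentum spread must not resolve the
% momentum splitting between mass eigenstates:
\sigma_p \sim \frac{\hbar}{\sigma_x} \gtrsim \Delta p \approx \frac{\Delta m^2 c^3}{2E}
\;\Rightarrow\;
\sigma_x \lesssim \frac{2E\,\hbar c}{\Delta m^2 c^4}
\quad \text{(mass eigenstates not resolved)}
```

Because neutrinos are so light, Δm²c⁴ ≪ E², and the window between the two bounds is enormous – matching Akhmedov’s remark; for charged leptons the analogous window closes, which is why they do not oscillate.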

“Coherence conditions constrain the lengths of neutrino wave packets both from below and above,” explains theorist Evgeny Akhmedov of MPI-K Heidelberg. “For neutrinos, these constraints are compatible, and the allowed window is very large because neutrinos are very light. This also hints at an answer to the frequently asked question of why charged leptons don’t oscillate.”

The spatial extent of the neutrino wave packet has so far only been constrained to within 13 orders of magnitude by reactor-neutrino oscillations, say the BeEST team. If wave-packet sizes were at the experimental lower limit set by the world’s oscillation data, they could impact future oscillation experiments such as the Jiangmen Underground Neutrino Observatory (JUNO), currently under construction in China.

“This could have destroyed JUNO’s ability to probe the neutrino mass ordering,” says Akhmedov. “However, we expect the actual sizes to be at least six orders of magnitude larger than the lowest limit from the world’s oscillation data. We have no hope of probing them in terrestrial oscillation experiments, in my opinion, though the situation may be different for astrophysical and cosmological neutrinos.”

BeEST uses a novel method to constrain the size of the neutrino wave packet. The group creates electron neutrinos via electron capture on unstable 7Be nuclei produced at the TRIUMF–ISAC facility in Vancouver. In the final state there are only two products: the electron neutrino and a newly transmuted 7Li daughter atom that receives a tiny energy “kick” by emitting the neutrino. By embedding the 7Be isotopes in superconducting quantum sensors at 0.1 K, the collaboration can measure this low-energy recoil to high precision. Via the uncertainty principle, the team infers a limit on the spatial localisation of the entire final-state system of 6.2 pm – more than 1000 times larger than the nucleus itself.
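The uncertainty-principle step can be illustrated numerically. A minimal sketch, assuming the quoted 6.2 pm localisation saturates the Heisenberg relation Δx·Δp = ħ/2 (an illustration only, not the collaboration’s full analysis):

```python
# Momentum spread implied by a 6.2 pm spatial localisation,
# assuming the Heisenberg relation is saturated.
HBAR = 1.054571817e-34      # J s (CODATA)
E_CHARGE = 1.602176634e-19  # C
C_LIGHT = 2.99792458e8      # m/s

def momentum_spread_keV_c(delta_x_m):
    """Minimum momentum spread, in keV/c, for localisation delta_x_m."""
    delta_p = HBAR / (2.0 * delta_x_m)          # kg m/s
    return delta_p * C_LIGHT / E_CHARGE / 1e3   # keV/c

dp_keV_c = momentum_spread_keV_c(6.2e-12)  # about 16 keV/c
```

A localisation of 6.2 pm thus corresponds to a momentum spread of order 10 keV/c, tiny compared with the ~862 keV released in 7Be electron capture – which is why the recoil must be measured to such high precision.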

Consensus has not been reached on how to infer the new lower limit on the size of the neutrino wave packet, with the preprint quoting two lower limits in the vicinity of 10⁻¹¹ m and 10⁻⁸ m based on different theoretical assumptions. Although they differ dramatically, even the weaker limit improves upon all previous reactor oscillation data by more than an order of magnitude, and is enough to rule out decoherence effects as an explanation for sterile-neutrino anomalies, says the collaboration.

“I think the more stringent limit is correct,” says Akhmedov, who points out that this is only about 1.5 orders of magnitude lower than some theoretical predictions. “I am not an experimentalist and therefore cannot judge whether an improvement of 1.5 orders of magnitude can be achieved in the foreseeable future, but I very much hope that this is possible.”

Muons cooled and accelerated in Japan

In a world first, a research group working at the J-PARC laboratory in Tokai, Japan, has cooled and accelerated a beam of antimatter muons (µ+). Though muon cooling was first demonstrated by the Muon Ionisation Cooling Experiment in the UK in 2020 (CERN Courier March/April 2020 p7), this is the first time that the short-lived cousins of the electron have been accelerated after cooling – an essential step for applications in particle physics.

The cooling method is ingenious – and completely different to ionisation cooling, where muons are focused in absorbers to reduce their transverse momentum. Instead, µ+ are slowed to 0.002% of the speed of light in a thin silica-aerogel target, capturing atomic electrons to form muonium, an atom-like compound of an antimatter muon and an electron. Experimenters then ionise the muonium using a laser to create a near-monochromatic beam that is reaccelerated in radiofrequency (RF) cavities. The work builds on the acceleration of negative muonium ions – an antimatter muon bonded to two electrons – which the team demonstrated in 2017 (CERN Courier July/August 2018 p8).

Though the analysis is still to be finalised, with results due to be published soon, the cooling and acceleration effect is unmistakable. In accelerator physics, cooling is traditionally quantified by a reduction in beam emittance – an otherwise conserved quantity that reflects the volume occupied by the beam in the abstract space of orthogonal displacements and momenta. Estimates indicate a beam cooling effect of more than an order of magnitude, with the beam then accelerated from 25 meV to 100 keV. The main challenge is transmission: at present, only one antimatter muon emerges from the RF cavities for every 10 million that impact the aerogel. Muon decay is also a challenge, given that the muonium is nearly stationary in the laboratory frame, with time dilation barely extending the muon’s 2.2 μs lifetime. Roughly a third of the µ+ decay before exiting the J-PARC apparatus.
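The decay-loss figure can be checked with the rest-frame lifetime alone, since time dilation is negligible for nearly stationary muonium. A minimal sketch (the ~0.9 µs dwell time is a hypothetical input that reproduces the quoted fraction, not a number from the article):

```python
import math

TAU_MU = 2.197e-6  # s, muon lifetime at rest

def surviving_fraction(transit_time_s, gamma=1.0):
    """Fraction of muons surviving a given transit time; gamma ~ 1
    for nearly stationary muonium, so time dilation barely helps."""
    return math.exp(-transit_time_s / (gamma * TAU_MU))

# A hypothetical ~0.9 microsecond dwell time loses roughly a third
# of the muons, matching the figure quoted in the text.
frac = surviving_fraction(0.9e-6)
```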

The first application of this technology will be the muon g-2/EDM experiment at J-PARC, where data taking is due to start in 2028. The experiment will add valuable data points to measurements thought to have exceptional sensitivity to new physics (CERN Courier May/June 2021 p25). In the case of the anomalous magnetic moment (g-2) of the muon, theoretical showdowns later this year may either dissipate or reinforce intriguing hints of beyond-the-Standard-Model physics from the Muon g-2 experiment at Fermilab, potentially adding strong motivation to an independent test.

We are very impressed with the progress of our colleagues at J-PARC and congratulate them on their success

“Although our current focus is the muon g-2/EDM experiment, we are open to any possible applications of this technology in the future,” says spokesperson Tsutomu Mibe of KEK. “We are communicating with experts to understand if our technology is of any use in a muon collider, but note that our method cannot be adapted for negative muons.”

While proposals for a µ+µ+ or µ+e– collider exist, a µ+µ– collider remains the most strongly motivated machine. “Much of the physics interest in e+e– and µ+µ– colliders comes from the annihilations of the initial particles into a photon and/or a Z boson, or a Higgs boson in the case of µ+µ–,” says John Ellis of CERN/KCL. “These possibilities are absent for a µ+e– or µ+µ+ collider, making them less interesting in my opinion.” From an accelerator-physics perspective, it remains to be demonstrated that the technique can deliver the beam intensity needed for an energy-frontier collider – not least while keeping the emittance low.

“We are very impressed with the progress of our colleagues at J-PARC and congratulate them on their success,” says International Muon Collider study leader Daniel Schulte of CERN. “This will benefit the development and use of muon-beam technology. We are in contact to understand how we can collaborate.”

Sustainable accelerator project underway

Particle accelerators have become essential instruments for improving our health, environment, safety and high-tech capabilities, as well as for unlocking fundamental insights into physics, chemistry and biology and enabling scientific breakthroughs that improve our lives. Accelerating particles to higher energies will always require a large amount of energy. In a society where energy sustainability is critical, keeping energy consumption as low as reasonably possible is an unavoidable challenge for both research infrastructures (RIs) and industry, which collectively operate more than 40,000 accelerators.

Going green

Based on state-of-the-art technology, the portfolio of current and future accelerator-driven RIs in Europe could develop to consume up to 1% of Germany’s annual electricity demand. With the ambition to maintain the attractiveness and competitiveness of European RIs, and enable Europe’s Green Deal, the Innovate for Sustainable Accelerating Systems (iSAS) project has been approved by Horizon Europe. Its aim is to establish an enhanced collaboration in the field to broaden, expedite and amplify the development and impact of novel energy-saving technologies to accelerate particles.

In general terms, a particle accelerator has a system to create the particles to be accelerated, a system preparing beams with these particles, an accelerating system that effectively accelerates the particle beams, a magnet system to steer the beam, an experimental facility using the particles, and finally a beam dump. In linear accelerating structures, most of the electrical power taken from the grid to operate the accelerator is used by the accelerating system itself.

The core of an accelerating system is a series of cavities that can deliver a high-gradient electric field. For many modern accelerators, these cavities are superconducting and therefore cryogenically cooled to about 2 K. They are powered by radio-frequency (RF) generators that deliver the field at a specific frequency and thereby provide energy to the particle beams as they traverse. These superconducting RF (SRF) systems are the enabling technology for frontier accelerators, but are energy-intensive devices in which only a fraction of the power extracted from the grid is effectively transmitted to the accelerated particles. In addition, energy carried by the beam is radiated away in recirculating machines and is ultimately dumped and lost. As an example, the European XFEL’s superconducting RF system uses 5–6 MW for 0.1 MW of average beam power, a power conversion of less than 3%.
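The quoted efficiency follows directly from the numbers in the text:

```python
# Back-of-the-envelope check of the quoted <3% grid-to-beam conversion,
# using the European XFEL figures given in the text (5-6 MW grid power
# for 0.1 MW of average beam power).
def grid_to_beam_efficiency(beam_power_mw, grid_power_mw):
    """Fraction of grid power that ends up in the beam."""
    return beam_power_mw / grid_power_mw

eff_lo = grid_to_beam_efficiency(0.1, 6.0)  # grid power at the high end
eff_hi = grid_to_beam_efficiency(0.1, 5.0)  # grid power at the low end
```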

The objective of iSAS is to innovate those technologies that have been identified as being a common core of SRF accelerating systems and that have the largest leverage for energy savings with a view to minimising the intrinsic energy consumption in all phases of operation. In the landscape of accelerator-driven RIs, solutions are being developed to reuse the waste heat produced, develop energy-efficient magnets and RF power generators, and operate facilities on opportunistic schedules when energy is available on the grid. The iSAS project has a complementary focus on the energy efficiency of the SRF accelerating technologies themselves. This will contribute to the vital transition to sustain the tremendous 20th-century applications of accelerator technology in an energy-conscious 21st century.

Interconnected technologies

Building on a recently established European R&D roadmap for accelerator technology and on a collaboration between leading European research institutions and industry, several interconnected technologies will be developed, prototyped and tested, each enabling significant energy savings in its own right. The collection of energy-saving technologies will be developed with a portfolio of forthcoming applications in mind, and to explore energy-saving improvements in accelerator-driven RIs. The resulting technologies will then be coherently integrated into the parametric design of a new accelerating system, a linac SRF cryomodule, optimised to achieve high beam power with an energy consumption that is as low as reasonably possible. This new cryomodule design will enable Europe to develop and build future energy-sustainable accelerators and particle colliders.

iSAS has been approved by Horizon Europe to help develop novel energy-saving technologies to accelerate particles

On 15 and 16 April, the iSAS kick-off meeting was held at IJCLab (Orsay, France) with around 100 participants. Each of the working groups presented its R&D plans and, in all cases, concrete work has begun. To save energy in the RF power systems, novel fast-reacting tuners are being developed to compensate rapidly for detuning of the cavity’s frequency caused by mechanical vibrations, together with methods to integrate them into smart digital control systems. To save energy in the cryogenics, and building on the ongoing Horizon Europe I.FAST project, superconducting cavities with thin films of Nb3Sn are being further developed to operate with high performance at 4.2 K instead of 2 K, thereby reducing the grid power needed to operate the cryogenic system: roughly three times less cooling power is required to remove heat dissipated in a 4.2 K bath than in a 2 K bath. Finally, to save energy from the accelerated particle beam itself, the technology of energy-recovery linacs (ERLs) is being improved to operate efficiently with high-current beams by developing novel higher-order-mode dampers that significantly reduce heat loads in the cavities.
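The factor of three has a simple thermodynamic core. An ideal (Carnot) refrigerator rejecting heat at room temperature needs (T_warm − T_cold)/T_cold watts of work per watt extracted at T_cold; the sketch below assumes a 300 K rejection temperature (an assumption, not a number from the text). Real cryoplants, especially sub-atmospheric 2 K systems, fall further short of Carnot, which plausibly pushes the ideal factor of ~2 towards the quoted ~3:

```python
# Ideal-refrigerator sketch of the 4.2 K vs 2 K cooling-power argument.
def carnot_watts_per_watt(t_cold_k, t_warm_k=300.0):
    """Ideal grid watts required per watt of heat extracted at t_cold_k,
    rejecting the heat at t_warm_k (assumed room temperature)."""
    return (t_warm_k - t_cold_k) / t_cold_k

# Ratio of ideal grid power for a 2 K bath vs a 4.2 K bath.
ideal_ratio = carnot_watts_per_watt(2.0) / carnot_watts_per_watt(4.2)
```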

iSAS logo

To address the engineering challenges related to the integration of the new energy-saving technologies, an existing ESS cryovessel will be equipped with new cavities and novel dampers, and the resulting linac SRF cryomodule will be tested in operation in the PERLE accelerator at IJCLab (Orsay, France). PERLE is a growing international collaboration to demonstrate the performance of ERLs with high-power beams that would enable applications in future particle colliders. Its first phase is being implemented at IJCLab with the objective to have initial beams in 2028.

The timescale to innovate, prototype and test new accelerator technologies is inherently long, in some cases longer than the typical duration of R&D projects. It is therefore essential to continue to collaborate and enhance the R&D process so that energy-sustainable technologies can be implemented without delay, to avoid hampering the scientific and industrial progress enabled by accelerators. Accordingly, iSAS plans co-development with industrial partners to jointly achieve a technology readiness level that will be sufficient to enter the large-scale production phase of these new technologies.

Empowering industry

While several energy-saving technologies will be brought towards industrial readiness, with impact on current RIs, iSAS is also a pathfinder for sustainable future SRF particle accelerators and colliders. Through inter- and multidisciplinary research that delivers and combines various technologies, it is the long-term ambition of iSAS to reduce the energy footprint of SRF accelerators in future RIs by half, and by even more when the systems are integrated in ERLs. Accordingly, iSAS will help maintain Europe’s leadership for breakthroughs in fundamental sciences and enable high-energy collider technology to go beyond the current frontiers of energy and intensity in an energy-sustainable way. In parallel, the new sustainable technologies will empower and stimulate European industry to conceive a portfolio of new applications and take a leading role in, for example, the semiconductor, particle therapy, security and environmental sectors.

ATLAS turbocharges event simulation

ATLAS figure 1

As the harvest of data from the LHC experiments continues to increase, so does the required number of simulated collisions. This is a resource-intensive task as hundreds of particles must be tracked through complex detector geometries for each simulated physics collision – and Monte Carlo statistics must typically exceed experimental statistics by a factor of 10 or more, to minimise uncertainties when measured distributions are compared with theoretical predictions. To support data taking in Run 3 (2022–2025), the ATLAS collaboration therefore developed, evaluated and deployed a wide array of detailed optimisations to its detector-simulation software.

The production of simulated data begins with the generation of particles produced within the LHC’s proton–proton or heavy-ion collisions, followed by the simulation of their propagation through the detector and the modelling of the electronics signals from the active detection layers. Considerable computing resources are consumed when hadrons, photons and electrons enter the electromagnetic calorimeters and produce showers with many secondary particles whose trajectories and interactions with the detector material must be computed. The complex accordion geometry of the ATLAS electromagnetic calorimeter makes the Geant4 simulation of the shower development in the calorimeter system particularly compute-intensive, accounting for about 80% of the total simulation time for a typical collision event.

Since computing costs money and consumes electrical power, it is highly desirable to speed up the simulation of collision events without compromising accuracy. For example, considerable CPU resources were previously spent on the transportation of photons and neutrons; this has been mitigated by randomly removing 90% of the photons with energy below 0.5 MeV and neutrons below 2 MeV, and scaling up the energy deposited by the remaining 10% of low-energy particles. The simulation of photons in the finely segmented electromagnetic calorimeter took considerable time because the probabilities for each possible interaction process were calculated every time a photon crossed a material boundary. That calculation time has been greatly reduced by using a uniform geometry with no photon-transport boundaries and by determining the position of simulated interactions using the ratio of the cross sections in the various material layers. The combined effect of the optimisations brings an average speed gain of almost a factor of two.
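The remove-and-reweight trick is a standard Monte Carlo variance-reduction technique often called Russian roulette. A minimal sketch (not the ATLAS code; particle records are simplified to (energy, weight) pairs, and the names are illustrative):

```python
import random

def russian_roulette(particles, e_cut, survival_prob=0.1, rng=random):
    """Drop each particle below e_cut with probability 1 - survival_prob
    and scale the survivors' statistical weights by 1/survival_prob, so
    the expected weighted energy deposit is unchanged. Particles at or
    above e_cut are always kept unmodified."""
    kept = []
    for energy, weight in particles:
        if energy >= e_cut:
            kept.append((energy, weight))
        elif rng.random() < survival_prob:
            kept.append((energy, weight / survival_prob))
    return kept
```

On average only 10% of the sub-threshold particles are transported, at the cost of slightly larger statistical fluctuations per event – an acceptable trade when low-energy photons and neutrons dominate the CPU budget.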

ATLAS has also successfully used fast-simulation algorithms to leverage the available computational resources. Fast simulation aims at avoiding the compute-expensive Geant4 simulation of calorimeter showers by using parameterised models that are significantly faster and retain most of the physics performance of the more detailed simulation. However, one of the major limitations of the fast simulation employed by ATLAS during Run 2 was the insufficiently accurate modelling of physics observables such as the detailed description of the substructure of jets reconstructed with large-radius clustering algorithms.

AtlFast3 offers fast, high-precision physics simulations

For Run 3, ATLAS has developed a completely redesigned fast simulation toolkit, known as AtlFast3, which performs the simulation of the entire ATLAS detector. While the tracking systems continue to be simulated using Geant4, the energy response in the calorimeters is simulated using a hybrid approach that combines two new tools: FastCaloSim and FastCaloGAN.

FastCaloSim parametrises the longitudinal and lateral development of electromagnetic and hadronic showers, while the simulated energy response from FastCaloGAN is based on generative adversarial neural networks that are trained on pre-simulated Geant4 showers. AtlFast3 effectively combines the strengths of both approaches by selecting the most appropriate algorithm depending on the properties of the shower-initiating particles, tuned to optimise the performance of reconstructed observables, including those exploiting jet substructure. As an example, figure 1 shows that the hybrid AtlFast3 approach models the number of constituents of reconstructed jets as simulated with Geant4 very accurately.

With its significantly improved physics performance and a speedup between a factor of 3 (for Z → ee events) and 15 (for high-pT di-jet events), AtlFast3 will play a crucial role in delivering high-precision physics simulations of ATLAS for Run 3 and beyond, while meeting the collaboration’s budgetary compute constraints.

How to surf to high energies

A laser ionises rubidium vapour, turning it into plasma. A proton bunch plunges inside, evolving into millimetre-long microbunches. The microbunches pull the plasma’s electrons, forming wakes in the plasma, like a speedboat displacing water. Crests and troughs of the plasma’s electric field trail the proton microbunches at almost the speed of light. If injected at just the right moment, relativistic electrons surf on the accelerating phase of the field over a distance of metres, gaining energy up to 1000 times faster than can be achieved in conventional accelerators.

Plasma wakefield acceleration is a cutting-edge technology that promises to revolutionise the field of particle acceleration by paving the way for smaller and more cost-effective linear accelerators. The technique traces back to a seminal paper published in 1979 by Toshiki Tajima and John Dawson which laid the foundations for subsequent breakthroughs. At its core, the principle involves using a driver to generate wakefields in a plasma, upon which a witness beam surfs to undergo acceleration. Since the publication of the first paper, the field has demonstrated remarkable success in achieving large accelerating gradients.

Traditionally, only laser pulses and electron bunches have been used as drive beams. However, since 2016 the Advanced Wakefield Experiment (AWAKE) at CERN has used proton bunches from the Super Proton Synchrotron (SPS) as drive beams – an innovative approach with profound implications. Thanks to their high stored energy, proton bunches enable AWAKE to accelerate an electron bunch to energies relevant for high-energy physics in a single plasma, circumventing the need for the multiple accelerating stages that are required when using lasers or electron bunches.

Bridging the divide

Relevant to any accelerator concept based on plasma wakefields, AWAKE technology promises to bridge the gap between global developments at small scales and possible future electron–positron colliders. The experiment is therefore an integral component of the plasma roadmap of the European strategy for particle physics, which aims to advance the concept to a level of technological maturity that would allow its application to particle-physics experiments. An international collaboration of approximately 100 people across 22 institutes worldwide, AWAKE has already published more than 90 papers, many in high-impact journals, alongside significant efforts to train the next generation, culminating in the completion of over 28 doctoral theses to date.

A proton bunch train

In the experiment, a 400 GeV proton bunch from the SPS is sent into a 10 m-long plasma source containing rubidium vapour at a temperature of around 200 °C (see “Rubidium source” figure). A laser pulse accompanies the proton bunch, ionising the vapour and transforming it into a plasma.

To induce the necessary wakefields, the drive bunch length must be of the order of the plasma wavelength, which corresponds to the natural oscillation period of the plasma. However, the length of the SPS proton bunch is around 6 cm, significantly longer than the 1 mm plasma wavelength in AWAKE, and short wavelengths are required to reach large accelerating gradients.
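The quoted ~1 mm plasma wavelength follows from the cold-plasma oscillation frequency. A minimal sketch (the rubidium plasma density of 7 × 10^14 cm⁻³ is an assumed, typical AWAKE operating value, not a number stated in the article):

```python
import math

# Cold-plasma wavelength: lambda_p = 2*pi*c / omega_p, with
# omega_p = sqrt(n e^2 / (eps0 m_e)).
E_CHARGE = 1.602176634e-19   # C
EPS0 = 8.8541878128e-12      # F/m
M_E = 9.1093837015e-31       # kg
C_LIGHT = 2.99792458e8       # m/s

def plasma_wavelength_mm(n_per_cm3):
    """Plasma wavelength in mm for an electron density in cm^-3."""
    n_m3 = n_per_cm3 * 1e6
    omega_p = math.sqrt(n_m3 * E_CHARGE**2 / (EPS0 * M_E))
    return 2.0 * math.pi * C_LIGHT / omega_p * 1e3  # mm

lam_mm = plasma_wavelength_mm(7e14)  # about 1.3 mm
```

The result, of order a millimetre, is some 50 times shorter than the 6 cm SPS proton bunch – which is exactly why self-modulation into microbunches is needed.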

The solution is to take advantage of a beam-plasma instability, which transforms long particle bunches into microbunches with the period of the plasma through a process known as self-modulation. In other words, as the long proton bunch traverses the plasma, it can be coaxed into splitting into a train of shorter “microbunches”. The bunch train resonantly excites the plasma wave, like a pendulum or a child on a swing, being pushed with small kicks at its natural oscillation interval or resonant frequency. If applied at the right time, each kick increases the oscillation amplitude or height of the wave. When the amplitude is sufficiently high, a witness electron bunch from an external source is injected into the plasma wakefields, to ride the wakefields and gain energy.

AWAKE rubidium vapour source

The first phase of AWAKE (Run 1, from 2016 to 2018) served as a proof-of-concept demonstration of the acceleration scheme. First, it was shown that a plasma can be used as a compact device to self-modulate a highly relativistic and highly energetic proton bunch (see “Self-modulation” figure). Second, it was shown that the resulting bunch train resonantly excites strong wakefields. Third – the most direct demonstration – it was shown that externally injected electrons can be captured, focused and accelerated to GeV energies by the wakefields. The addition of a percent-level positive gradient in density along the plasma led to 20% boosts in the energy gained by the accelerated electrons.

Based on these proof-of-principle experimental results and expertise at CERN and in the collaboration, AWAKE developed a well-defined programme for Run 2, which launched in 2021 following Long Shutdown 2 and which will run for several more years. The goal is to achieve electron acceleration with GeV/m energy gain and beam quality corresponding to a normalised emittance of around 10 mm mrad and a relative energy spread of a few per cent. In parallel, scalable plasma sources are being developed that can be extended up to hundreds of metres in length (see “Helicon plasma source” and “Discharge source” figures). Once these goals are reached, the concepts of AWAKE could be used in particle-physics applications, such as electron beams with energies between 40 and 200 GeV impinging on a fixed target to search for new phenomena related to dark matter.

Controlled instability

The first Run 2 milestone, on track for completion by the end of the year, is to complete the self-modulator – the plasma that transforms the long proton bunch into a train of microbunches. The demonstration has been staged in two experimental phases.

The first phase was completed in 2022. The results prove that wakefields driven by a full proton bunch can have a reproducible and tunable timing. This is not at all a trivial demonstration given that the experiment is based on an instability!

Plasma wakefield acceleration

Techniques to tune the instability are similar to those used with free-electron lasers: provide a controlled initial signal for the instability to grow from and operate in the saturated regime, for example. In AWAKE, the self-modulation instability is initiated by the wakefields driven by an electron bunch placed ahead of the proton bunch. The wakefields from the electron bunch imprint themselves on the proton bunch right from the start, leading to a well defined bunch train. This electron bunch is distinct from the witness bunches, which are later accelerated.

The second experimental phase for the completion of the self-modulator is to demonstrate that high-amplitude wakefields can be maintained over long distances. Numerical simulations predict that self-modulation can be optimised by tailoring the plasma’s density profile. For example, introducing a step in the plasma density should lead to higher accelerating fields that can be maintained over long distances. First measurements are very encouraging, with density steps already leading to increased energy gains for externally injected electrons. Work is ongoing to globally optimise the self-modulator.

AWAKE technology promises to bridge the gap between global developments at small scales and possible future electron–positron colliders

The second experimental milestone of Run 2 will be the acceleration of an electron bunch while demonstrating its sustained beam quality. The experimental setup designed to reach this milestone includes two plasmas: a self-modulator that prepares the proton bunch train, and a second “accelerator plasma” into which an external electron bunch is injected (see “Modulation and acceleration” figure). To make space for the installation of the additional equipment, CERN will in 2025 and 2026 dismantle the CNGS (CERN Neutrinos to Gran Sasso) target area that is installed in a 100 m-long tunnel cavern downstream from the AWAKE experimental facility.

Accelerate ahead

Two enabling technologies are needed to achieve high-quality electron acceleration. The first is a source and transport line to inject the electron bunch on-axis into the accelerator plasma. A radio-frequency (RF) injector source was chosen because of the maturity of the technology, though the combination of S-band and X-band structures is novel, and forms a compact accelerator with possible medical applications. It is followed by a transport line that preserves the parameters of the 150 MeV 100 pC bunch, and allows for its tight focusing (5 to 10 µm) at the entrance of the accelerator plasma. External injection into plasma-based accelerators is challenging because of the high frequency (about 235 GHz in AWAKE) and thus small structure size (roughly 200 µm) at which they operate. The main goal is to demonstrate that the electron bunch can be accelerated to 4 to 10 GeV, with a relative energy spread of 5 to 8%, and emerge with approximately the same normalised emittance as at the entrance of the plasma (2–30 mm mrad).

Prototype discharge plasma source

For these experiments, rubidium vapour sources will be used for both the self-modulator and accelerator plasmas, as they provide the uniformity, tunability and reproducibility required for the acceleration process. However, the laser-ionisation process of the rubidium vapour does not scale to lengths beyond 20 m. The alternative enabling technology is therefore a plasma source whose length can be scaled to the 50 to 100 metres required for the bunch to reach 50–100 GeV energies. To achieve this, a laboratory to develop discharge and helicon plasma sources has been set up at CERN (see “Discharge source” figure). Multiple units can in principle be stacked to reach the desired plasma length. The challenge with such sources is to demonstrate that they can deliver the required plasma parameters, not just the required length.

The third and final experimental milestone for Run 2 will then be to replace the 10 m-long accelerator plasma with a longer source and achieve proportionally larger energy gains. The AWAKE acceleration concept will then essentially be mature to propose particle-physics experiments, for example with bunches of a billion or so 50 GeV electrons.

Engineering materials for big science

HL-LHC coils up close

The nature of CERN’s research often demands the development and testing of unusual and highly complex materials. A good example is the LHC beam screen that limits the energy transfer from the beam to the cold mass of the magnets, for which a new non-magnetic stainless steel had to be developed in the mid-1990s to meet the physical and mechanical requirements at cryogenic temperatures. The same is true of the external cylinder of the CMS solenoid magnet, for which a process enabling the production of 7 m-diameter high-strength seamless aluminium-alloy rings had to be identified and qualified. Another breakthrough at the LHC was the solution adopted for the end covers of the cold masses of the dipole magnets, for which 2500 hot-isostatic-pressed powder-metallurgy stainless-steel covers were produced – qualifying this innovative shaping solution for the first time for massive, fully reliable leak-tight operation at cryogenic temperatures.

Similar challenges apply today for the High-Luminosity LHC (HL-LHC), which is due to operate from 2029. For the HL-LHC radio-frequency crab cavities, which will tilt the beams at the collision points to maximise the luminosity, niobium and niobium-titanium alloy products have been carefully identified and qualified. Niobium additive-manufactured at CERN achieved a record purity and conductivity for this kind of product. For the new HL-LHC magnets, which are necessary to focus the beams more tightly at the collision points, detailed qualifications of the soundness of niobium-tin (Nb3Sn) coils have been critical, as has the development and qualification of methods to test the weld of the quadrupole magnet cold masses.

These and numerous other projects are the domain of the CERN materials, metrology and non-destructive testing (EN–MME–MM) section, whose mission is to provide materials-science services for accelerators and detectors across the whole CERN community, in close coordination with the mechanical design and production facilities of the EN-MME group. The interdisciplinary, expert-staffed section guarantees full life-cycle management of materials – from functional requirements to prototyping, series production, inspection and end-of-life – and includes the identification or development of material solutions, the specification and qualification of suppliers, the definition of manufacturing and inspection plans, and inspections of received materials and parts before and after their integration into the machines and experiments. This challenging mission requires advanced microscopic materials analysis, high-precision optical metrology, static and cyclic mechanical measurements, including at cryogenic temperatures, and, last but not least, state-of-the-art non-destructive testing techniques (see “Section facilities” figure).

The facilities of the EN–MME–MM section

High-field magnets

The future of particle accelerators is strongly linked to the development of high-field superconducting magnets that enable higher energies and luminosities to be attained. The HL-LHC will be the first operational facility to employ high-performance Nb3Sn accelerator magnets, surpassing the intrinsic performance limitations of the NbTi-based magnets used for the LHC. The fabrication of Nb3Sn magnets is a challenging process because the conductor is an extremely brittle intermetallic phase. While the difficulty of working with brittle compounds is reduced using the traditional wind-react-and-impregnate approach, uncertainties remain due to volume changes associated with phase transformations occurring during the reaction heat treatment necessary to form the Nb3Sn phase.

Needle in a haystack

To investigate the root causes of the performance limitation or degradation observed in early magnets, several HL-LHC dipole and quadrupole magnet coils were examined. This project has been one of the most complex failure analyses ever undertaken by the MM section, demanding that an innovative investigation methodology be identified and applied at several fabrication stages and after cool-down and powering. Internal shear and bending loads on unsupported superconducting wires, which can cause their dislocation as well as cracks in the aggregates of Nb3Sn filaments, were suspected to be the main cause of limitation or degradation. Like hunting for a needle in a massive haystack, the challenge was to find microscopic damage at the level of the filaments in the large volume of coils up to 7.2 m in length.

Dipole diagnostics

Starting in 2020 with 11 T magnet-coil ends, a sequence of mesoscale observations of whole coil sections was carried out non-destructively using innovative high-energy X-ray computed tomography (CT). This enabled the critical volumes to be identified and was followed up with a microscopic assessment of internal events, geometrical distortions and potential flaws using advanced microscopy. As a result, the MM section was able to unequivocally identify strands with transversely broken elements (see “Dipole diagnostics” and “Cracking niobium tin” figures). Techniques such as scanning electron microscopy (SEM) and focussed ion beam (FIB) were used to analyse damage to strands or sub-elements at particular localised positions as well as failure modes. In addition, a deep-etching technique allowed a decisive observation of completely broken filaments (see “HL-LHC coils up close” figure). Taken together, this comprehensive approach provided an in-depth view of the examined coils by identifying and characterising atypical features and imperfections in both the superconducting phase of the strands and the glass fibre/resin insulation system. It also clearly associated the quenches (a sudden loss of the superconducting state) experienced by the coils with physical events, namely broken superconducting filaments or damaged strands. Following this successful analysis of the CERN coil limitations, the MM section received several coils from non-conforming quadrupole magnets fabricated in the US within the Accelerator Upgrade Project collaboration, and successfully carried out the same type of investigations.

Cracking niobium tin

Effective recovery

This highly effective approach and key results on Nb3Sn accelerator magnets were made possible thanks to the wide experience gained with previous applications of CT techniques to the magnet system of the ITER fusion experiment, which employs the Nb3Sn conductor on a massive scale. The aim of such investigations is not only to understand what went wrong, no matter how difficult and complex that might be, but also to identify remedial actions. For the HL-LHC magnets, the MM section has contributed widely to the introduction of effective recovery measures, improved coil manufacturing and cold-mass assembly processes, and the production of magnets with reproducible behaviour and no sign of degradation. These results led to the conclusion that the root cause of the performance limitation of previous long CERN magnets has been identified and can now be overcome for future applications, as is the case for Nb3Sn quadrupole magnets.

Structural support

Investigating the massive HL-LHC coils required a high-energy (6 MeV) linac CT that was subcontracted to TEC Eurolab in Italy and Diondo GmbH in Germany, two of only a few companies in the world equipped with this technique. However, the MM section also has an X-ray CT facility with an energy of 225 keV, which provides sufficient penetration for less massive samples. One of the most recent of countless examples employing this technique concerns the staves for the future ATLAS inner tracker (ITk) for the HL-LHC upgrade. During 2023 a significant fraction of the ITk modules suffered from early high-voltage breakdowns, despite appearing to perform satisfactorily during earlier stages of quality control. A subset of these modules exhibited breakdowns following thermal cycling, with some failing during the cold phases of the cycle. Additionally, others experienced breakdowns after being loaded onto their supporting staves. High-resolution CT scans at CERN combined with other techniques confirmed the presence and propagation of cracks through the entire sensor thickness, and enabled the MM team to identify the gluing process between the carbon structure and the sensors as the root cause of the vulnerability, which is now being addressed by the ATLAS project team (see “ATLAS modules” figure). Also for the HL-LHC, the section is working on the internalisation of the beryllium vacuum-chamber fabrication technology required for the experiments.

ATLAS modules

While carrying out failure analyses of extremely high-tech components is the core business of the MM section, in some cases understanding the failure of the most basic objects can be paramount. This does not necessarily mean that the investigations are simpler. At 11 a.m. on 13 October 2022, a pipe supplying CERN with water burst under the main road near the French–Swiss border, which was closed until early afternoon. The damage was quickly repaired by the Swiss services, and the road re-opened. But it was critical to understand whether this was an isolated failure of an individual pipe that had been in service for 20 years, or whether there was a risk of bursts in other ducts of the same type.

The services of the MM section, provided via cooperation agreements with CERN, are in wide demand externally

The damaged portion of the duct, measuring 1.7 m in length and 0.5 m in diameter, is the largest sample ever brought to the MM facilities for root-cause analysis (see “Water pipe” figure). As such, it required most of the available techniques to be deployed. For the receiving inspections, visual and radiographic testing and high-precision optical dimensional metrology in a volume of almost 17 m3 were used. For microstructural examinations, CT, micro-optical and SEM observations of the samples surrounding the crack – including a post-resin burn-off test – were carried out. The cracking (one of the most common failure modes found in water and sewer pipes) turned out to be the result of bending forces due to local soil movement. This generated a flexural constraint between the supported ends of the failing section, consisting of a concrete base on one side and a connection sleeve to the next pipe section on the other. The change of boundary conditions may have been due to droughts during summer periods that altered the soil conditions. To the great relief of all, neither the composite material of the pipe nor its constituents was the main cause of the failure.

Beyond CERN

The services of the MM section, provided via cooperation agreements with CERN, are also in wide demand externally. ITER is a strong example. Since 2009, a major multi-year cooperation agreement has been in place specifically covering metallurgical and material testing for the construction of the ITER magnet and vacuum systems. The many results and achievements of this long-lasting cooperation include: the qualification of the high-strength stainless-steel jacket material for the conductor of the ITER central solenoid, including its cryogenic properties; the development and application of advanced examination techniques to assess the vacuum pressure impregnation process used in the correction coils and their critical welds, which are not inspectable with conventional techniques; and the assessment of a high-strength austenitic stainless steel for the precompression structure of the central solenoid, involving forgings featuring an unprecedented combination of size and aspect ratio. The section has also been entrusted by the ITER organisation with major failure analyses, such as the root-cause analysis of a heavy-gauge fastener of the toroidal-field gravity-support system and, more recently, the analysis of leakage events in the thermal-shield cooling pipes of the ITER magnet system. Several agreements are also in place via the CERN knowledge transfer group for the assessment of structural materials for a fusion project beyond ITER, and for a subcritical fission reactor project.

Failure analysis of a water pipe

Also not to be forgotten is the major involvement of CERN in the Einstein Telescope project, for example in assessing suitable materials and fabrication solutions for its vacuum system, one of the largest ultra-high-vacuum systems ever built. A three-year project that started in September 2022 aims to deliver the main technical design report for the Einstein Telescope beampipes, with CERN’s contribution structured in eight work packages spanning from design and materials choice to logistics, installation and surface treatments (CERN Courier September/October 2023 p45).

Beyond fundamental physics, the section is also working on the selection of materials for a future hydrogen economy, namely the definition of proper specifications and procedures for operation in a liquid-hydrogen environment. The watchmaking industry, which places high requirements on materials, also cooperates in this field, and the section expects requests for further collaboration projects from other fields.

It is quite true to say that materials are everywhere. The examples given here clearly show that in view of the ambitious goals of CERN, a highly interdisciplinary effort from materials and mechanical engineers is paramount to the proper selection and qualification of materials, parts and processes to enable the creation of the giant colliders and detectors that allow physicists to explore the fundamental constituents of the universe.

Next-generation triggers for HL-LHC and beyond

ATLAS and CMS events at 13.6 TeV

The LHC experiments have surpassed expectations in their ability to squeeze the most out of their large datasets, also demonstrating the wealth of scientific understanding to be gained from improvements to data-acquisition pipelines. Colliding proton bunches at a rate of 40 MHz, the LHC produces a huge quantity of data that must be filtered in real time to levels that are manageable for offline computing and ensuing physics analysis. When the High-Luminosity LHC (HL-LHC) enters operation from 2029, the data rates and event complexity will increase significantly further.
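As a rough illustration of the scale of this filtering challenge, the sketch below computes the raw data rate and the rejection factor a trigger system must achieve. The event size (~1 MB) and output rate (~1 kHz) are assumed round numbers for illustration, not figures quoted in the article.

```python
# Illustrative trigger arithmetic; event size and output rate are assumptions.
collision_rate_hz = 40e6   # 40 MHz bunch-crossing rate (quoted above)
event_size_bytes = 1e6     # assumed ~1 MB per event
output_rate_hz = 1e3       # assumed ~1 kHz written out for offline analysis

raw_tb_per_s = collision_rate_hz * event_size_bytes / 1e12  # bytes/s -> TB/s
reduction = collision_rate_hz / output_rate_hz              # rejection factor

print(f"raw: {raw_tb_per_s:.0f} TB/s, reduction needed: {reduction:.0f}x")
```

Even with these conservative assumptions, the trigger must reject all but roughly one in tens of thousands of events in real time, which is why the online selection step is so critical to the physics reach.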

To meet this challenge, the general-purpose LHC experiments ATLAS and CMS are preparing significant detector upgrades, which include improvements in the online filtering or trigger-selection processes. In view of the importance of this step, the collaborations seek to further enhance their trigger and analysis capabilities, and thus their scientific potential, beyond their currently projected scope.

Following a visit by a group of private donors, in 2023 CERN, in close collaboration with the ATLAS and CMS collaborations, submitted a proposal to the Eric and Wendy Schmidt Fund for Strategic Innovation, which resulted in the award of a $48 million grant. The donation laid the foundations of the Next Generation Triggers project, which kicked off in January 2024. The five-year project aims to accelerate novel computing, engineering and scientific ideas for the ATLAS and CMS upgrades, also taking advantage of advanced AI techniques, not only in large-scale data analysis and simulation but also embedded in front-end detector electronics. These include quantum-inspired algorithms to improve simulations, and heterogeneous computing architectures and new strategies to optimise the performance of GPU-accelerated experiment code. The project will also provide insight into detectors and data flows for future projects, such as experiments at the proposed Future Circular Collider, while the associated infrastructure will support the advancement of software and algorithms for simulations that are vital to the HL-LHC and future-collider physics programmes. Through the direct involvement of the CERN experimental physics, information technology and theory departments, it is expected that results from the project will bring benefits across the lab’s scientific programme.

The Next Generation Triggers project is broken down into four work packages: infrastructure, algorithms and theory (to improve machine learning-assisted simulation and data collection, develop common frameworks and tools, and better leverage available and new computing infrastructures and platforms); enhancing the ATLAS trigger and data acquisition (to focus on improved and accelerated filtering and exotic signature detection); rethinking the CMS real-time data processing (to extend the use of heterogeneous computing to the whole online reconstruction and to design a novel AI-powered real-time processing workflow to analyse every collision); and education programmes and outreach to engage the community, industry and academia in the ambitious goals of the project, foster and train computing skills in the next generation of high-energy physicists, and complement existing successful community programmes with multi-disciplinary subjects across physics, computing science and engineering.

“The Next Generation Triggers project builds upon and further enhances the ambitious trigger and data acquisition upgrades of the ATLAS and CMS experiments to unleash the full scientific potential of the HL-LHC,” says ATLAS spokesperson Andreas Hoecker.

“Its work packages also benefit other critical areas of the HL-LHC programme, and the results obtained will be valuable for future particle-physics experiments at the energy frontier,” adds Patricia McBride, CMS spokesperson.

CERN will have sole discretion over the implementation of the Next Generation Triggers scientific programme and how the project is delivered overall. In line with its Open Science Policy, CERN also pledges to release all IP generated as part of the project under appropriate open licences.

EIC steps towards construction

A schematic of the future Electron–Ion Collider

The Electron–Ion Collider (EIC), located at Brookhaven National Laboratory and being built in partnership with Jefferson Lab, has taken a step closer to construction. In April the US Department of Energy (DOE) approved “Critical Decision 3A”, which gives the formal go-ahead to purchase long-lead procurements for the facility.

The EIC will offer the unique ability to collide a beam of polarised high-energy electrons with polarised protons, polarised lightweight ions, or heavy ions. Its aim is to produce 3D snapshots or “nuclear femtography” of the inner structure of nucleons to gain a deeper understanding of how quarks and gluons give rise to properties such as spin and mass (CERN Courier October 2018 p31). The collider, which will make use of infrastructure currently used for the Relativistic Heavy Ion Collider and is costed at between $1.7 billion and $2.8 billion, is scheduled to enter construction in 2026 and to begin operations in the first half of the next decade.

By passing the latest DOE project milestone, the EIC project partners can now start ordering key components for the accelerator, detector and infrastructure. These include superconducting wires and other materials, cryogenic equipment, the experimental solenoid, lead-tungstate crystals and scintillating fibres for detectors, electrical substations and support buildings. “The EIC project can now move forward with the execution of contracts with industrial partners that will significantly reduce project technical and schedule risk,” said EIC project director Jim Yeck.

More than 1500 physicists from nearly 300 laboratories and institutes worldwide are members of the EIC user group. Earlier this year the DOE and the CNRS signed a statement of interest concerning the contribution of researchers in France, while the UK announced that it will invest £58.8 million to develop the necessary detector and accelerator technologies.

New subdetectors to extend ALICE’s reach

ALICE components

The LHC’s dedicated heavy-ion experiment, ALICE, is to be equipped with an upgraded inner tracking system and a new forward calorimeter to extend its physics reach. The upgrades have been approved for installation during the next long shutdown from 2026 to 2028.

With 10 m2 of active silicon and nearly 13 billion pixels, the current ALICE inner tracker, which has been in place since 2021, is the largest pixel detector ever built. It is also the first detector at the LHC to use monolithic active pixel sensors (MAPS) instead of the more traditional hybrid pixels and silicon microstrips. The new inner tracking system, ITS3, uses a novel stitching technology to construct MAPS of 50 µm thickness and up to 26 × 10 cm2 in area that can be bent around the beampipe in a truly cylindrical shape. The first layer will be placed just 2 mm from the beampipe and 19 mm from the interaction point, with a much lighter support structure that significantly reduces the material volume and therefore its effect on particle trajectories. Overall, the new system will boost the pointing resolution of the tracks by a factor of two compared to the present ITS detector, strongly enhancing measurements of thermal radiation emitted by the quark–gluon plasma and enabling insights into the interactions of charm and beauty quarks as they propagate through it.
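The figures quoted above for the current tracker (10 m2 of active silicon, nearly 13 billion pixels) imply a pixel pitch of roughly 28 µm, assuming square pixels covering the active area uniformly; a quick back-of-envelope check:

```python
import math

# Back-of-envelope pixel-pitch estimate from the figures quoted above,
# assuming square pixels of uniform size over the active area.
active_area_m2 = 10.0
n_pixels = 13e9
pitch_um = math.sqrt(active_area_m2 / n_pixels) * 1e6  # metres -> micrometres

print(f"pitch ~ {pitch_um:.0f} um")
```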

The new forward calorimeter, FoCal, is optimised for photon detection in the forward direction. It consists of a highly granular electromagnetic calorimeter, composed of 18 layers of 1 × 1 cm2 silicon-pad sensors paired with tungsten converter plates and two additional layers of 30 × 30 μm2 pixels, and a hadronic calorimeter made of copper capillary tubes and scintillating fibres. By measuring inclusive photons and their correlations with neutral mesons, as well as the production of jets and charmonia, FoCal will add new capabilities to explore the small Bjorken-x parton structure of nucleons and nuclei.
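The connection between forward instrumentation and small Bjorken-x can be seen from simple kinematics: for a particle of transverse momentum pT produced at pseudorapidity eta, the probed momentum fraction is roughly x ~ (2 pT / sqrt(s)) exp(-eta). The numbers below (pT = 5 GeV, sqrt(s) = 8.8 TeV, eta = 4.5) are assumed illustrative values, not figures from the article.

```python
import math

# Rough Bjorken-x estimate for a forward measurement; pT, sqrt(s) and
# eta are assumed illustrative values, not taken from the article.
pt_gev = 5.0          # assumed transverse momentum
sqrt_s_gev = 8800.0   # assumed centre-of-mass energy (~8.8 TeV)
eta = 4.5             # assumed forward pseudorapidity

x = (2 * pt_gev / sqrt_s_gev) * math.exp(-eta)
print(f"x ~ {x:.1e}")
```

This order-of-magnitude estimate (x of order 1e-5) illustrates why a calorimeter placed at large pseudorapidity is needed to reach the small-x regime.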

Technical design reports for the ITS3 and FoCal projects were endorsed by the relevant CERN review committees in March. The construction phase has now started, with the detectors due to be installed in early 2028 in order to be ready for data taking in 2029. The upgrades, in particular ITS3, are also an important step on the way to ALICE 3 – a major proposed upgrade of ALICE that, if approved, would enter operation in the mid-2030s.

Accelerator sustainability in focus

The world is facing a crisis of anthropogenic climate change, driven by excessive CO2 emissions over the past 150 years. In response, the United Nations has defined goals in a race towards net-zero carbon emissions. One of these goals is to ensure that all projects due to be completed in or after 2030 have net-zero-carbon operation, with a reduction in embodied carbon of at least 40% compared to current practice. At the same time, the European Union (EU), Japan and other nations have decided to become carbon neutral by around 2050.

These boundary conditions put large-scale science projects under pressure to reduce CO2 emissions during construction, operation and potentially decommissioning. For context: given the current French energy mix, CERN’s annual 1.3 TWh electricity consumption (which is mostly used for accelerator operation) corresponds to roughly 50 kt CO2e global warming potential (GWP), while recent estimates for the construction of tunnels for future colliders are in the multi-100 kt CO2e GWP range.
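The ~50 kt CO2e figure quoted above can be checked with simple arithmetic, assuming a grid carbon intensity of around 40 g CO2e/kWh for the French energy mix (an assumed, illustrative value, not one stated in the article):

```python
# Back-of-envelope check of the global warming potential quoted above.
annual_consumption_kwh = 1.3e9      # 1.3 TWh (from the article)
grid_intensity_g_per_kwh = 40.0     # assumed French-mix carbon intensity

# grams -> kilotonnes (1 kt = 1e9 g)
gwp_kt_co2e = annual_consumption_kwh * grid_intensity_g_per_kwh / 1e9

print(f"~{gwp_kt_co2e:.0f} kt CO2e per year")
```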

Green realisation

To discuss potential ways forward, a Workshop on Sustainability for Future Accelerators (WSFA2023) took place on 25–27 September in Morioka, Japan within the framework of the recently started EU project EAJADE (Europe–America–Japan Accelerator Development and Exchange). Around 50 international experts discussed a slew of topics ranging from life-cycle assessments (LCAs) of accelerator technologies with carbon-reduction potential to funding initiatives towards sustainable accelerator R&D, and local initiatives aimed at the “green” realisation of future colliders. With the workshop being held in Japan, the proposed International Linear Collider (ILC) figured prominently as a reference project – attracting considerable attention from local media.

The general context of discussions was set by Beate Heinemann, DESY director for particle physics, on behalf of the European Laboratory Directors Group (LDG). The LDG recently created a working group to assess the sustainability of accelerators, with a mandate to develop guidelines and a minimum set of key indicators pertaining to the methodology and scope of reporting of sustainability aspects for future high-energy physics projects. Since LCAs are becoming the main tool to estimate GWP, a number of project representatives discussed their take on sustainability and steps towards performing LCAs. Starting with the much-cited ARUP study on linear colliders published in 2023 (edms.cern.ch/document/2917948/1), there were presentations on the ESS in Sweden, the ISIS-II neutron and muon source in the UK, the CERN sustainability forum, the Future Circular Collider, the Cool Copper Collider and other proposed colliders. Also discussed were R&D items for sustainable technologies, including CERN’s High Efficiency Klystron Project, the ZEPTO permanent-magnet project, thin film-coated SRF cavities and others.

A second big block in the workshop agenda was devoted to the “greening” of future accelerators and potential local and general construction measures towards achieving this goal. The focus was on Japanese efforts around the ILC, but numerous results can be re-interpreted in a more general way. Presentations were given on the potential of concrete to turn from a massive carbon source into a carbon sink with net negative CO2e balance (a topic with huge industrial interest), on large-scale wooden construction (e.g. for experimental halls), and on the ILC connection with the agriculture, forestry and fisheries industries to reduce CO2 emissions and offset them by increasing CO2 absorption. The focus was on building an energy recycling society by the time the ILC would become operational.

What have we learnt on our way towards sustainable large-scale research infrastructures? First, that time might be our friend: energy mixes will include increasingly larger carbon-free components, making construction projects and operations more eco-friendly. Also, new and more sustainable technologies will be developed that help achieve global climate goals. Second, we as a community must consider the imprint our research leaves on the globe, along with as many indicators as possible. The GWP can be a beginning, but there are many other factors relating, for example, to rare-earth elements, toxicity and acidity. The LCA methodology provides the accelerator community with guidelines for the planning of more sustainable large-scale projects and needs to be further developed – including end-of-life, decommissioning and recycling steps – in an appropriate manner. Last but not least, it is clear that we need to be proactive in anticipating the changes happening in the energy markets and society with respect to sustainability-driven challenges at all levels.
