Adapting CLIC tech for FLASH therapy

Walter Wuensch

About 30–40% of people will develop cancer during their lifetimes. Surgery, chemotherapy, immunotherapy and radiotherapy (RT) are used to cure or manage the disease. But around a third of cancers are multi-resistant to all forms of therapy, defining a need for more efficient and better tolerated treatments. Technological advances in the past decade or so have transformed RT into a precise and powerful treatment for cancer patients. Nevertheless, the treatment of radiation-resistant tumours is complicated by the need to limit doses to surrounding normal tissue.

A paradigm-shifting technique called FLASH therapy, which is able to deliver doses of radiation in milliseconds instead of minutes as for conventional RT, is opening new avenues for more effective and less toxic RT. Pre-clinical studies have shown that the extremely short exposure time of FLASH therapy spares healthy tissue from the hazardous effect of radiation without reducing its efficacy on tumours.

First studied in the 1970s, it is only during the past few years that FLASH therapy has caught the attention of oncologists. The catalyst was a 2014 study carried out by researchers from Lausanne University Hospital (CHUV), Switzerland, and from the Institut Curie in Paris, which showed an outstanding differential FLASH effect between tumours and normal tissues in mice. The results were later confirmed by several other leading institutes. Then, in 2019, CHUV used FLASH to treat a multi-resistant skin cancer in a human patient, causing the tumour to completely disappear with nearly no side effects.

The consistency of pre-clinical data showing a striking protection of normal tissues with FLASH compared to conventional RT offers a new opportunity to improve cancer treatment, especially for multi-resistant tumours. The very short “radiation beam-on-time” of FLASH therapy could also eliminate the need for motion management, which is currently necessary when irradiating tumours that move with respiration. Furthermore, since FLASH therapy operates best with high single doses, it requires only one or two RT sessions as opposed to multiple sessions over a period of several weeks in the case of conventional RT. This promises to reduce oncology workloads and patient waiting lists, while improving treatment access in low-population density environments. Altogether, these advantages could turn FLASH therapy into a powerful new tool for cancer treatment, providing a better quality of life for patients.

The key requirements for CLIC correspond astonishingly well with the requirements for a FLASH facility

CERN and CHUV join forces

CHUV is undertaking a comprehensive research program to translate FLASH therapy to a clinical environment. No clinical prototype is currently available for treating patients with FLASH therapy, especially for deep-seated tumours. Such treatments require very high-energy beams (see p12) and face technological challenges that can currently be solved only by a very limited number of institutions worldwide. As the world’s largest particle-physics laboratory, CERN is one of them. In 2019, CHUV and CERN joined forces with the aim of building a high-energy, clinical FLASH facility.

The need to deliver a full treatment dose over a large area in a short period of time demands an accelerator that can produce a high-intensity beam. Amongst the current radiation tools available for RT – X-rays, electrons, protons and ions – electrons stand out for their unique combination of attributes. Electrons with an energy of around 100 MeV penetrate many tens of centimetres in tissue so have the potential to reach tumours deep inside the body. This is also true for the other radiation modalities but it is technically simpler to produce intense beams of electrons. For example, electron beams are routinely used to produce X-rays in imaging systems such as CT scanners and in industrial applications such as electron beam-welding machines. In addition, it is comparatively simple to accelerate electrons in linear accelerators and guide them using modest magnets. A FLASH-therapy facility based on 100 MeV-range electrons is therefore a highly compelling option.

Demonstrating the unexpected practical benefits of fundamental research, the emergence of FLASH therapy as a potentially major clinical advance coincides with the maturing of accelerator technology developed for the CLIC electron–positron collider. In a further coincidence, the focus of FLASH development has been at CHUV, in Lausanne, and CLIC development at CERN, in Geneva, just 60 km away. CLIC is one of the potential options for a post-LHC collider and the design of the facility, as well as the development of key technologies, has been underway for more than 20 years. A recent update of the design, now optimized for a 380 GeV initial-energy stage, and updated prototype testing were completed in 2018.

Despite the differences in scale and application, the key requirements for CLIC correspond astonishingly well with the requirements for a FLASH facility. First, CLIC requires high-luminosity collisions, for example to allow the study of rare interaction processes. This is achieved by colliding very high-intensity and precisely controlled beams: the average current during a pulse of CLIC is 1 A and the linac hardware is designed to allow two beams less than 1 nm in diameter to collide at the interaction point. High levels of current that are superbly controlled are also needed for FLASH to cover large tumours in short times. Second, CLIC requires a high accelerating gradient (72 MV/m in the initial stage) to achieve its required collision energy in a reasonably sized facility (11 km for a 380 GeV first stage). A FLASH facility using 100 MeV electrons based on an optimised implementation of the same technology requires an accelerator just a couple of metres long. Other system elements such as diagnostics, beam shaping and delivery as well as radiation shielding make the footprint of the full facility somewhat larger. Overall, however, the compact accelerator technology developed for CLIC gives the possibility of clinical facilities built within the confines of a typical hospital campus and integrated with existing oncology departments.
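The gradient-to-length relationship quoted above is simple enough to sketch. The figures below come from the text (72 MV/m, 380 GeV, 100 MeV); the active-length arithmetic is back-of-envelope and ignores injectors, beam delivery and other systems that enlarge the real footprint:

```python
# Rough linac-length arithmetic implied by the accelerating gradients quoted
# in the text. Active accelerating length only; real facilities are longer.

def active_length_m(energy_mev: float, gradient_mv_per_m: float) -> float:
    """Active accelerating length needed to reach a given beam energy."""
    return energy_mev / gradient_mv_per_m

# CLIC initial stage: 380 GeV in collision, shared between two linacs at 72 MV/m
clic_active = active_length_m(380_000, 72)   # ~5.3 km of active structure in total
# 100 MeV FLASH linac at the same gradient
flash_active = active_length_m(100, 72)      # ~1.4 m

print(f"CLIC active length  ~ {clic_active / 1000:.1f} km")
print(f"FLASH active length ~ {flash_active:.1f} m")
```

The gap between ~5.3 km of active structure and the 11 km site length is taken up by the two-linac layout and the other systems mentioned above.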

Over the decades, CLIC has invested significant resources into developing its high-current and high-gradient technology. Numerous high-power radio-frequency test stands have been built and operated, serving as prototypes for the radio-frequency system units that make up a linear accelerator. The high-current-beam test accelerator "CTF3" enabled beam-dynamics simulation codes to be benchmarked and the formation, manipulation and control of very intense electron beams to be demonstrated. Further beam-dynamics validations and relevant experiments have been carried out at different laboratories including ATF2 at KEK, FACET at SLAC and ATF at Argonne. CERN also operates the Linear Electron Accelerator for Research (CLEAR) facility, which can accelerate electrons up to 250 MeV, thus matching the energy requirements of FLASH radiotherapy. For the past several years, and beyond the collaboration between CERN and CHUV, the CLEAR facility has been involved in dosimetry studies for FLASH radiotherapy.

Towards a clinical facility

All of this accumulated experience and expertise is now being used to design and construct a FLASH facility. The collaboration between CERN and CHUV is a shining example of knowledge transfer, where technology developed for fundamental research is used to develop a therapeutic facility. While the technical aspects of the project have been defined via exchanges between medical researchers and accelerator experts, the CERN knowledge-transfer group and CHUV’s management have addressed contractual aspects and identified a strategy for intellectual property ownership. This global approach provides a clear roadmap for transforming the conceptual facility into a clinical reality. From the perspective of high-energy physics, the adoption of CLIC technology in commercially supplied medical facilities would significantly reduce technological risk and increase the industrial supplier base. 

An interdisciplinary team comprising medical doctors, medical physicists, radiation biologists and accelerator physicists and engineers was formed

The collaboration between CHUV and CERN was catalysed by a workshop on FLASH therapy hosted by CHUV in September 2018, when it was realised that an electron-beam facility based on CLIC technology offers the possibility for a high-performance clinical FLASH facility. An interdisciplinary team comprising medical doctors, medical physicists, radiation biologists and accelerator physicists and engineers was formed to study the possibilities in greater depth. In an intense exchange during the months following the workshop, where requirements and capabilities were brought together and balanced, a clear picture of the parameters of a clinical FLASH facility emerged. Subsequently, the team studied critical issues in detail, validating that such a facility is in fact feasible. It is now working towards the details of a baseline design, with parameters specified at the system level, and the implementation of entirely new perspectives that were triggered by the study. A conceptual design report for the facility will be finished by the end of 2020. CHUV is actively seeking funding for the facility, which would require approximately three years for construction through beam commissioning.

The basic accelerator elements of the 100 MeV-range FLASH facility that emerged from this design process consist of: a photo-injector electron source; a linac optimised for high-current transport and maximum radio-frequency-power to beam-energy-transfer efficiency; and a beam-delivery system which forms the beam shape for individual treatment and directs it towards the patient. In addition, accelerator and clinical instrumentation are being designed which must work together to provide the necessary level of precision and repeatability required for patient treatment. This latter issue is of particular criticality in FLASH treatment, which must be administered with all feedback and correction of delivered dose to clinical levels completed in substantially less than a second. The radiation field is one area where the requirements of CLIC and FLASH are quite different. In CLIC the beam is focused to a very small spot (roughly 150 nm wide and 3 nm high) for maximum luminosity, whereas in FLASH the beam must be expanded to cover a large area (up to 10 cm) of irregular cross section and with high levels of dose uniformity. Although this requires a very different implementation of the beam-delivery systems, both CLIC and FLASH are designed using the same beam-dynamics tools and design methodologies. 

Many challenges will have to be overcome, not least obtaining regulatory approval for such a novel system, but we are convinced that the fundamental ideas are sound and that the goal is within reach. A clinical FLASH facility based on CLIC technology is set to be an excellent example of the impact that developments made in the pursuit of fundamental science can have on society.

A unique period for computing, but will it last?

Monica Marinucci and Ivan Deloose

Twenty-five years ago in Rio de Janeiro, at the 8th International Conference on Computing in High-Energy and Nuclear Physics (CHEP-95), I presented a paper on behalf of my research team titled “The PC as Physics Computer for LHC”. We highlighted impressive improvements in price and performance compared to other solutions on offer. In the years that followed, the community started moving to PCs in a massive way, and today the PC remains unchallenged as the workhorse for high-energy physics (HEP) computing.

HEP-computing demands have always been greater than the available capacity. However, our community does not have the financial clout to dictate the way computing should evolve, demanding constant innovation and research in computing and IT to maintain progress. A few years before CHEP-95, RISC workstations and servers had started complementing the mainframes that had been acquired at high cost at the start-up of LEP in 1989. We thought we could do even better than RISC. The increased-energy LEP2 phase needed lots of simulation, and the same needs were already manifest for the LHC. These were the inspirations that led PC servers to start populating our computer centres – a move that was also helped by a fair amount of luck.

Fast change

HEP programs need good floating-point compute capabilities and early generations of the Intel x86 processors, such as the 486/487 chips, offered mediocre capabilities. The Pentium processors that emerged in the mid-1990s changed the scene significantly, and the competitive race between Intel and AMD was a major driver of continued hardware innovation.

Another strong tailwind came from the relentless efforts to shrink transistor sizes in line with Moore’s law, which saw processor speeds increase from 50/100 MHz to 2000/3000 MHz in little more than a decade. After 2006, when further clock-speed increases became impossible for thermal reasons, efforts moved to producing multi-core chips. However, HEP continued to profit. Since all physics events at colliders such as the LHC are independent of one another, it was sufficient to split a job into multiple jobs across all cores.
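Because collision events are statistically independent, the "split one job into many" strategy needs no inter-core communication at all. A minimal sketch, in which `process_event` is a stand-in for real reconstruction code:

```python
# Minimal sketch of event-level parallelism: each collision event is
# independent, so a job over N events can simply be partitioned across cores.
# process_event is a placeholder for real reconstruction software.
from multiprocessing import Pool

def process_event(event_id: int) -> int:
    # placeholder "reconstruction": any pure function of one event works here
    return event_id * event_id % 97

def run_serial(events):
    return [process_event(e) for e in events]

def run_parallel(events, cores=4):
    # identical physics output, spread over worker processes
    with Pool(cores) as pool:
        return pool.map(process_event, events)

if __name__ == "__main__":
    events = range(1000)
    assert run_serial(events) == run_parallel(events)
    print("parallel result matches serial result")
```

The same embarrassingly parallel structure is what lets grid sites around the world share LHC workloads without coordination beyond job scheduling.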

Sverre Jarp

The HEP community was also lucky with software. Back in 1995 we had chosen Windows/NT as the operating system, mainly because it supported multiprocessing, which significantly enhanced our price/performance. Physicists, however, insisted on Unix. In 1991, Linus Torvalds released Linux version 0.01 and it quickly gathered momentum as a worldwide open-source project. When release 2.0 appeared in 1996, multiprocessing support was included and the operating system was quickly adopted by our community.

Furthermore, HEP adopted the Grid concept to cope with the demands of the LHC. Thanks to projects such as Enabling Grids for E-science, we built the Worldwide LHC Computing Grid, which today handles more than two million tasks across one million PC cores every 24 hours. Although grid computing remained mainly amongst scientific users, the analogous concept of cloud computing had the same cementing effect across industry. Today, all the major cloud-computing providers overwhelmingly rely on PC servers.

In 1995 we had seen a glimmer, but we had no idea that the PC would remain an uncontested winner during a quarter of a century of scientific computing. The question is whether it will last for another quarter of a century.

The contenders

The end of CPU scaling, argued a recent report by the HEP Software Foundation, demands radical changes in computing and software to ensure the success of the LHC and other experiments into the 2020s and beyond. There are many contenders that would like to replace the x86 PC architecture. One is graphics processors, where Intel, AMD and Nvidia are all active. A wilder guess is quantum computing, while a more conservative guess would be processors similar to the x86 but based on other architectures, such as ARM or RISC-V.

The end of CPU scaling demands radical changes to ensure the success of the LHC and other high-energy physics experiments

During the PC project we collaborated with Hewlett-Packard, which had a division in Grenoble, not too far away. Such R&D collaborations have been vital to CERN and the community since the beginning and they remain so today. They allow us to get insight into forthcoming products and future plans, while our feedback can help to influence the products being planned. CERN openlab, which has been the focal point for such collaborations for two decades, coined the phrase early on: “You make it, we break it”. However, whatever the future holds, it is fair to assume that PCs will remain the workhorse for HEP computing for many years to come.

Neutrinos for peace

The PROSPECT neutrino detector

The first nuclear-weapons test shook the desert in New Mexico 75 years ago. Weeks later, Hiroshima and Nagasaki were obliterated. So far, these two Japanese cities have been the only ones to suffer such a fate. Neutrinos can help to ensure that no other city has to be added to this dreadful list.

At the height of the arms race between the US and the USSR, stockpiles of nuclear weapons exceeded 50,000 warheads, with the majority being thermonuclear designs vastly more destructive than the fission bombs used in World War II. Significant reductions in global nuclear stockpiles followed the end of the Cold War, but the US and Russia still have about 12,500 nuclear weapons in total, and the other seven nuclear-armed nations have about 1500. Today, the politics of non-proliferation is once again tense and unpredictable. New nuclear security challenges have appeared, often from unexpected actors, as a result of leadership changes on both sides of the table. Nuclear arms races and the dissolution of arms-control treaties have yet again become a real possibility. A regional nuclear war involving just 1% of the global arsenal would cause a massive loss of life, trigger climate effects leading to crop failures and jeopardise the food supply of a billion people. Until we achieve global disarmament, nuclear non-proliferation efforts and arms control are still the most effective tools for nuclear security.

Not a bang but a whimper

The story of the neutrino is closely tied to nuclear weapons. The first serious proposal to detect the particle hypothesised by Pauli, put forward by Clyde Cowan and Frederick Reines in the early 1950s, was to use a nuclear explosion as the source (see “Daring experiment” figure). Inverse beta decay, whereby an electron-antineutrino strikes a free proton and transforms it into a neutron and a positron, was to be the detection reaction. The proposal was approved in 1952 as an addition to an already planned atmospheric nuclear-weapons test. However, while preparing for this experiment, Cowan and Reines realised that by capturing the neutron on a cadmium nucleus, and observing the delayed coincidence between the positron and this neutron, they could use the lower, but steady flux of neutrinos from a nuclear reactor instead (see “First detection” figure). This technique is still used today, but with gadolinium or lithium in place of cadmium.
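The delayed-coincidence logic that Cowan and Reines introduced can be sketched in a few lines. This is a hypothetical toy selection, not any experiment's actual code; the hit kinds and the ~30 microsecond window are illustrative stand-ins for a prompt positron signal and a delayed neutron capture:

```python
# Toy sketch of delayed-coincidence selection: an inverse-beta-decay candidate
# is a prompt (positron-like) hit followed by a delayed (neutron-capture-like)
# hit within a coincidence window. Times in microseconds; the 30 us window is
# illustrative, not a real detector setting.

def find_ibd_candidates(hits, window_us=30.0):
    """hits: list of (time_us, kind) tuples, kind in {'prompt', 'capture'}."""
    hits = sorted(hits)
    pairs = []
    for i, (t, kind) in enumerate(hits):
        if kind != "prompt":
            continue
        for t2, kind2 in hits[i + 1:]:
            if t2 - t > window_us:
                break                     # outside the window: stop searching
            if kind2 == "capture":
                pairs.append((t, t2))     # prompt + delayed pair found
                break
    return pairs

hits = [(10.0, "prompt"), (18.0, "capture"),   # true coincidence
        (500.0, "capture"),                    # lone capture: rejected
        (900.0, "prompt")]                     # lone prompt: rejected
print(find_ibd_candidates(hits))  # [(10.0, 18.0)]
```

Requiring the pair, rather than either hit alone, is what suppresses the uncorrelated backgrounds that would otherwise swamp a reactor-neutrino signal.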

Proposal to discover particles using a nuclear explosion

The P reactor at the Savannah River site in South Carolina, which had been built and used to make plutonium and tritium for nuclear weapons, eventually hosted the successful experiment that first detected the neutrino in 1956. Experiments testing the properties of the neutrino, including oscillation searches, continued there until 1988, when the P reactor was shut down.

Neutrinos are not produced in nuclear fission itself, but by the beta decays of neutron-rich fission fragments – on average about six per fission. In a typical reactor fuelled by natural uranium or low-enriched uranium, the reactor starts out with uranium-235 as its only fissile fuel. During operation a significant number of neutrons are absorbed on uranium-238, which is far more abundant, leading to the formation of uranium-239, which after two beta decays becomes plutonium-239. Plutonium-239 eventually contributes to about 40% of the fissions, and hence energy production, in a commercial reactor. It is also the isotope used in nuclear weapons.
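The ~6 antineutrinos per fission quoted above make the reactor's total neutrino output a one-line estimate. A back-of-envelope sketch, assuming a round ~200 MeV released per fission (an assumed typical value, not a figure from the text):

```python
# Back-of-envelope antineutrino emission rate of a fission reactor,
# using ~6 antineutrinos per fission and an assumed ~200 MeV per fission.

MEV_TO_J = 1.602e-13
ENERGY_PER_FISSION_J = 200 * MEV_TO_J    # ~3.2e-11 J per fission (assumed)
NEUTRINOS_PER_FISSION = 6

def antineutrino_rate(thermal_power_w: float) -> float:
    """Antineutrinos emitted per second for a given thermal power."""
    fissions_per_s = thermal_power_w / ENERGY_PER_FISSION_J
    return fissions_per_s * NEUTRINOS_PER_FISSION

# A 20 MW research reactor (the down-rated IR-40 scale) vs a 3 GW power reactor
print(f"20 MW: {antineutrino_rate(20e6):.1e} antineutrinos/s")
print(f"3 GW:  {antineutrino_rate(3e9):.1e} antineutrinos/s")
```

Even a small research reactor emits of order 10^18 antineutrinos per second, which is why tonne-scale detectors a few tens of metres away can collect a usable signal despite the tiny interaction cross section.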

The dual-use nature of reactors is at the crux of nuclear non-proliferation. What distinguishes a plutonium-production reactor from a regular reactor producing electricity is whether it is operated in such a way that the plutonium can be taken out of the reactor core before it deteriorates and becomes difficult to use in weapons applications. A reactor with a low content of plutonium-239 makes more and higher energy neutrinos than one rich in plutonium-239.

Lev Mikaelyan and Alexander Borovoi, from the Kurchatov Institute in Moscow, realised that neutrino emissions can be used to infer the power and plutonium content of a reactor. In a series of trailblazing experiments at the Rovno nuclear power plant in the 1980s and early 1990s, their group demonstrated that a tonne-scale underground neutrino detector situated 10 to 20 metres from a reactor can indeed track its power and plutonium content.

The significant drawback of neutrino detectors in the 1980s was that they needed to be situated underground, beneath a substantial overburden of rock, to shield them from cosmic rays. This greatly limited potential deployment sites. There was a series of application-related experiments – notably the successful SONGS experiment conducted by researchers at Lawrence Livermore National Laboratory, which aimed to reduce cost and improve the robustness and remote operation of neutrino detectors – but all of these detectors still needed shielding.

From cadmium to gadolinium

Synergies with fundamental physics grew in the 1990s, when the evidence for neutrino oscillations was becoming impossible to ignore. With the range of potential oscillation frequencies narrowing, the Palo Verde and Chooz reactor experiments placed multi-tonne detectors about 1 km from nuclear reactors, and sought to measure the relatively small θ13 parameter of the neutrino mixing matrix, which expresses the mixing between electron neutrinos and the third neutrino mass eigenstate. Both experiments used large amounts of liquid organic scintillator doped with gadolinium. The goal was to tag antineutrino events by capturing the neutrons on gadolinium, rather than the cadmium used by Reines and Cowan. Gadolinium produces 8 MeV of gamma rays upon de-excitation after a neutron capture. As it has an enormous neutron-capture cross section, even small amounts greatly enhance an experiment’s ability to identify neutrons.

Delayed coincidence detection scheme

Eventually, neutrino oscillations became an accepted fact, redoubling the interest in measuring θ13. This resulted in three new experiments: Double Chooz in France, RENO in South Korea, and Daya Bay in China. Learning lessons from Palo Verde and Chooz, the experiments successfully measured θ13 more precisely than any other neutrino mixing parameter. A spin-off from the Double Chooz experiment was the Nucifer detector (see “Purpose driven” figure), which demonstrated the operation of a robust sub-tonne-scale detector designed with missions to monitor reactors in mind, in alignment with requirements formulated at a 2008 workshop held by the International Atomic Energy Agency (IAEA). However, Nucifer still needed a significant overburden.

In 2011, however, shortly before the experiments established that θ13 is not zero, fundamental research once again galvanised the development of detector technology for reactor monitoring. In the run-up to the Double Chooz experiment, a group at Saclay started to re-evaluate the predictions for reactor neutrino fluxes – then and now based on measurements at the Institut Laue-Langevin in the 1980s – and found to their surprise that the reactor flux prediction came out 6% higher than before. Given that all prior experiments were in agreement with the old flux predictions, neutrinos were missing. This “reactor-antineutrino anomaly” persists to this day. A sterile neutrino with a mass of about 1 eV would be a simple explanation. This mass range has been suggested by experiments with accelerator neutrinos, most notably LSND and MiniBooNE, though such a sterile neutrino would also imply that muon neutrinos oscillate into it – a prediction that experiments such as MINOS+ have failed to confirm.

To directly observe the high-frequency oscillations of an eV-scale sterile neutrino you need to get within about 10 m of the reactor. At this distance, backgrounds from the operation of the reactor are often non-negligible, and no overburden is possible – the same conditions a detector on a safeguards mission would encounter.
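The reason for the ~10 m requirement falls out of the standard two-flavour short-baseline formula. A sketch with illustrative parameter values (Δm² = 1 eV², sin²2θ = 0.1 are assumptions for the plot, not measured numbers):

```python
# Two-flavour electron-antineutrino survival probability at short baselines,
# P = 1 - sin^2(2theta) * sin^2(1.27 * dm2[eV^2] * L[m] / E[MeV]),
# showing how fast an eV-scale oscillation develops over a few metres.
# dm2 = 1 eV^2 and sin^2(2theta) = 0.1 are illustrative values only.
import math

def survival_prob(L_m, E_MeV, dm2_eV2=1.0, sin2_2theta=0.1):
    phase = 1.27 * dm2_eV2 * L_m / E_MeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

for L in (2, 4, 6, 8, 10):             # metres from the core
    print(f"L = {L:2d} m: P = {survival_prob(L, E_MeV=4.0):.3f}")
```

At a typical reactor-antineutrino energy of a few MeV the oscillation completes a full cycle within roughly 10 m, so a detector much further away sees only the averaged-out deficit, not the tell-tale wiggle.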

From gadolinium to lithium

Around half a dozen experimental groups are chasing sterile neutrinos using small detectors close to reactors. Some of the most advanced designs use fine spatial segmentation to reject backgrounds, and replace gadolinium with lithium-6 as the nucleus to capture and tag neutrons. Lithium has the advantage that upon neutron capture it produces an alpha particle and a triton rather than a handful of photons, resulting in a very well localised tag. In a small detector this improves event containment and thus efficiency, and also helps constrain event topology.

Following the lithium and finely segmented technical paths, the PROSPECT collaboration and the CHANDLER collaboration (see “Rapid deployment” figure), in which I participate, independently reported the detection of a neutrino spectrum with minimal overburden and high detection efficiency in 2018. This is a major milestone in making non-proliferation applications a reality, since it is the first demonstration of the technology needed for tonne-scale detectors capable of monitoring the plutonium content of a nuclear reactor that could be universally deployed without the need for special site preparation.

The story of the neutrino is closely tied to nuclear weapons

The main difference between the two detectors is that PROSPECT, which reported its near-final sterile neutrino limit at the Neutrino 2020 conference, uses a traditional approach with liquid scintillator, whereas CHANDLER, currently an R&D project, uses plastic scintillator. The use of plastic scintillator allows the deployment time-frame to be shortened to less than 24 hours. On the other hand, liquid scintillator allows the exploitation of pulse-shape discrimination to reject cosmic-ray neutron backgrounds, allowing PROSPECT to achieve a much better signal-to-background ratio than any plastic detector to date. Active R&D is seeking to improve topological reconstruction in plastic detectors and imbue them with pulse-shape discrimination. In addition, a number of safeguard-specific detector R&D experiments have successfully detected reactor neutrinos using plastic scintillator in conjunction with gadolinium. In the UK, the VIDARR collaboration has seen neutrinos from the Wylfa reactor, and in Japan the PANDA collaboration successfully operated a truck-mounted detector.

In parallel to detector development, studies are being undertaken to understand how reactor monitoring with neutrinos would impact nuclear security and support non-proliferation objectives. Two very relevant situations being studied are the 2015 Iran Deal – the Joint Comprehensive Plan of Action (JCPOA) – and verification concepts for a future agreement with North Korea.

Nuclear diplomacy

One of the sticking points in negotiating the 2015 Iran deal was the future of the IR-40 reactor, which was being constructed at Arak, an industrial city in central Iran. The IR-40 was planned to be a 40 MW reactor fuelled by natural uranium and moderated with heavy water, with a stated purpose of isotope production for medical and scientific use. The choice of fuel and moderator is interesting, as it meshes with Iranian capabilities and would serve the stated purpose well and be cost effective, since no uranium enrichment is needed. Equally, however, if one were to design a plutonium-production reactor for a nascent weapons programme, this combination would be one of the top choices: it does not require uranium enrichment, and with the stated reactor power would result in the annual production of about 10 kg of rather pure plutonium-239. This matches the critical mass of a bare plutonium-239 sphere, and it is known that as little as 4 kg can be used to make an effective nuclear explosive. Within the JCPOA it was eventually agreed that the IR-40 could be redesigned, down-rated in power to 20 MW and the new core fuelled with 3.7% enriched fuel, reducing the annual plutonium production by a factor of six.

A spin off from Double Chooz

A 10 to 20 tonne neutrino detector 20 m from the reactor would be able to measure its plutonium content with a precision of 1 to 2 kg. This would be particularly relevant in the so-called N-th month scenario, which models a potential crisis in Iran based on events in North Korea in June 1994. During the 1994 crisis, which risked precipitating war with the US, the nuclear reactor at Yongbyon was shut down, and enough spent fuel rods removed to make several bombs. IAEA protocols were sternly tested. The organisation’s conventional safeguards for operating reactors consist of containment and surveillance – seals, for example, to prevent the unnoticed opening of the reactor, and cameras to record the movement of fuel, most crucially during reactor shutdowns. In the N-th month scenario, the IR-40 reactor, in its pre-JCPOA configuration (40 MW, rather than the renegotiated power of 20 MW), runs under full safeguards for N–1 months. In month N, a planned reactor shutdown takes place. At this point the reactor would contain 8 kg of weapons-grade plutonium. For unspecified reasons the safeguards are then interrupted. In month N+1, the reactor is restarted and full safeguards are restored. The question is: are the 8 kg of plutonium still in the reactor core, or has the core been replaced with fresh fuel and the 8 kg of plutonium illicitly diverted?

The disruption of safeguards could either be due to equipment failure – a more frequent event than one might assume – or due to events in the political realm ranging from a minor unpleasantness to a full-throttle dash for a nuclear weapon. Distinguishing the two scenarios would be a matter of utmost urgency. According to an analysis including realistic backgrounds extrapolated from the PROSPECT results, this could be done in 8 to 12 weeks with a neutrino detector.

Neutrino detectors could be effective in addressing the safeguard challenges presented by advanced reactors

No conventional non-neutrino technologies can match this performance without shutting the reactor down and sampling a significant fraction of the highly radioactive fuel. The conventional approach would be extremely disruptive to reactor operations and would put inspectors and plant operators at risk of radiation exposure. Even if the host country were to agree in principle, developing a safe plan and having all sides agree on its feasibility would take months at the very least, creating dangerous ambiguity in the interim and giving hardliners on both sides time to push for an escalation of the crisis. The conventional approach would also be significantly more expensive than a neutrino detector.

New negotiating gambit

The June 1994 crisis at Yongbyon still overshadows negotiations with North Korea, since, as far as North Korea is concerned, it discredited the IAEA. Both during the crisis, and subsequently, international attempts at non-proliferation failed to prevent North Korea from acquiring nuclear weapons – its first nuclear-weapons test took place in 2006 – or even to constrain its progress towards a small-scale operational nuclear force. New approaches are therefore needed, and recent attempts by the US to achieve progress on this issue prompted an international group of about 20 neutrino experts from Europe, the US, Russia, South Korea, China and Japan to develop specific deployment scenarios for neutrino detectors at the Yongbyon nuclear complex.

The main concern is the 5 MWe reactor, which, though named for its electrical power, has a thermal power of 20 MW. This gas-cooled graphite-moderated reactor, fuelled with natural uranium, has been the source of all of North Korea’s plutonium. The specifics of this reactor, and in particular its fuel cladding, which makes prolonged wet-storage of irradiated fuel impossible, represent such a proliferation risk that anything but a monitored shutdown prior to a complete dismantling appears inappropriate. To safeguard against the regime reneging on such a deal, were it to be agreed, a relatively modest tonne-scale neutrino detector right outside the reactor building could detect a powering up of this reactor within a day.

The MiniCHANDLER detector

North Korea is also constructing the Experimental Light Water Reactor at Yongbyon. A 150 MW water-moderated reactor running with low-enriched fuel, this reactor would not be particularly well suited to plutonium production. Its design is not dissimilar to much larger reactors used throughout the world to produce electricity, and it could help address the perennial lack of electricity that has limited the development and growth of the country’s economy. North Korea may wish to operate it indefinitely. A larger, 10 tonne neutrino detector could detect any irregularities during its refuelling – a tell-tale sign of a non-civilian use of the reactor – on a timescale of three months, which is within the goals set by the IAEA.

In a different scenario, wherein the goal would be to monitor a total shutdown of all reactors at Yongbyon, it would be feasible to bury a Daya-Bay-style 50 tonne single volume detector under the Yak-san, a mountain about 2 km outside of the perimeter of the nuclear installations (see “A different scenario” figure). The cost and deployment timescale would be more onerous than in the other scenarios.

In the case of longer distances between reactor and detector, detector masses must increase to compensate for the inverse-square reduction in the reactor-neutrino flux. As cosmic-ray backgrounds remain constant, the detectors must be deployed deep underground, beneath an overburden of several hundred metres of rock. To this end, the UK’s Science and Technology Facilities Council, the UK Atomic Weapons Establishment and the US Department of Energy are funding the WATCHMAN collaboration to pursue the construction of a multi-kilotonne water-Cherenkov detector at the Boulby mine, 20 km from two reactors in Hartlepool, in the UK. The goal is to demonstrate the ability to monitor the operational status of the reactors, which have a combined power of 3000 MW. In a use-case context this would translate to excluding the operation of an undeclared 10 to 20 MW reactor within a radius of a few kilometres, but no safeguards scenario has emerged where this would give a unique advantage.
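The scaling argument can be sketched numerically. Assuming the detected event rate goes as thermal power × detector mass / distance², and taking the tonne-scale detector described earlier at an assumed ~20 m stand-off from the 20 MW reactor as the reference point (the stand-off distance is an illustrative assumption, not a figure from the article):

```python
# Hedged sketch: required detector mass under the scaling
#   rate ~ P_thermal * mass / distance**2,
# holding the event rate fixed. Reference point (1 t at ~20 m from a
# 20 MW reactor) is an illustrative assumption.

def required_mass(ref_mass_t, ref_dist_m, ref_power_mw, dist_m, power_mw):
    """Detector mass (tonnes) at dist_m from a power_mw reactor giving
    the same event rate as the reference configuration."""
    return ref_mass_t * (dist_m / ref_dist_m) ** 2 * (ref_power_mw / power_mw)

# At the WATCHMAN-like range of 20 km from a 3000 MW plant, the naive
# scaling already lands in the multi-kilotonne regime:
mass_t = required_mass(1.0, 20.0, 20.0, 20_000.0, 3000.0)
print(round(mass_t))  # ~6667 t
```

The inverse-square factor dominates: the hundredfold increase in reactor power only partially offsets the thousandfold increase in distance.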

Inverse-square scaling eventually breaks down around 100 km, as at that distance the backgrounds caused by civilian reactors far outshine any undeclared small reactor almost anywhere in the northern hemisphere. Small signals also prevent the use of neutrino detectors for nuclear-explosion monitoring, or to confirm the origin of a suspicious seismic event as being nuclear, as conventional technologies are more feasible than the very large detectors that would be needed. A more promising future application of neutrino-detector technology is to meet the new challenges posed by advanced nuclear-reactor designs.

Advanced safeguards

The current safeguards regime relies on two key assumptions: that fuel comes in large, indivisible and individually identifiable units called “fuel assemblies”, and that power reactors need to be refuelled frequently. Most advanced reactor designs break at least one of these assumptions. Fuel may come in thousands of small pebbles or be molten, and the coolant may not be transparent, in contrast to current designs, where water is used as moderator, coolant and storage medium in the first years after discharge. Either way, counting and identification of the fuel by serial number may be impossible. And unlike current power reactors, which are refuelled on a 12-to-18-month cycle, allowing in-core fuel to be verified as well, advanced reactors may be refuelled only once in their lifetime.

Three 20 tonne neutrino detectors

Neutrino detectors would not be hampered by any of these novel features. Detailed simulations indicate that they could be effective in addressing the safeguard challenges presented by advanced reactors. Crucially, they would work in a very similar fashion for any of the new reactor designs.

In 2019 the US Department of Energy chartered and funded a study (which I co-chair) with the goal of determining the utility of the unique capabilities offered by neutrino detectors for nuclear security and energy applications. This study includes investigators from US national laboratories and academia more broadly, and will engage and interview nuclear security and policy experts within the Department of Energy, the State Department, NGOs, academia, and international agencies such as the IAEA. The results are expected early in 2021. They should provide a good understanding of where neutrinos can play a role in current and future monitoring and verification agreements, and may help to guide neutrino detectors towards their first real-world applications.

The idea of using neutrinos to monitor reactors has been around for about 40 years. Only very recently, however, as a result of a surge of interest in sterile neutrinos, has detector technology become available that would be practical in real-world scenarios such as the JCPOA or a new North Korean nuclear agreement. The most likely initial application will be near-field reactor monitoring with detectors inside the fence of the monitored facility as part of a regional nuclear deal. Such detectors will not be a panacea to all verification and monitoring needs, and can only be effective if there is a sincere political will on both sides, but they do offer more room for creative diplomacy, and a technology that is robust against the kinds of political failures which have derailed past agreements. 

CLIC lights the way for FLASH therapy

High-gradient accelerating structure

Technology developed for the proposed Compact Linear Collider (CLIC) at CERN is poised to make a novel cancer radiotherapy facility a reality. Building on recently revived research from the 1970s, oncologists believe that ultrafast bursts of electrons damage tumours more than healthy tissue. This “FLASH effect” could be realised by using high-gradient accelerator technology from CLIC to create a new facility at Switzerland’s Lausanne University Hospital (CHUV).

Traditional radiotherapy scans photon beams from multiple angles to focus a radiation dose on tumours inside the body. More recently, hadron therapy has offered a further treatment modality: by tuning the energy of a beam of protons or ions so that they stop in the tumour, the particles deposit most of the radiation dose there (the so-called Bragg peak), while sparing the surrounding healthy tissue by comparison. Both of these treatments deliver small doses of radiation to a patient over an extended period, whereas FLASH radiotherapy is thought to require a maximum of three doses, all lasting less than 100 ms.
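The difference in delivery time translates into an enormous difference in dose rate. A back-of-envelope comparison (the 10 Gy fraction and five-minute beam-on time are illustrative assumptions, not figures from the article):

```python
# Hedged back-of-envelope: the same physical dose delivered in under
# 100 ms instead of minutes raises the dose *rate* by three to four
# orders of magnitude. Dose and times are illustrative values.
dose_gy = 10.0                      # illustrative single fraction
conventional_rate = dose_gy / 300   # ~5 min beam-on time -> Gy/s
flash_rate = dose_gy / 0.1          # <100 ms burst -> Gy/s

print(conventional_rate)                # ~0.03 Gy/s
print(flash_rate)                       # 100 Gy/s
print(flash_rate / conventional_rate)   # ~3000x higher dose rate
```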

Look again

When the FLASH effect was first studied in the 1970s, it was assumed that all tissues suffer less damage when a dose is delivered ultrafast, regardless of whether they are healthy or tumorous. In 2014, however, CHUV researchers published a study in which 200 mice were given a single dose of 4.5 MeV electrons at a conventional therapy dose-rate, while others were given an equivalent dose at the much faster FLASH-therapy rate. The results showed that while the normal tissue was damaged significantly less by the ultrafast bursts, the damage to the tumour was the same at both dose rates. In 2019, CHUV applied the first FLASH treatment to a cancer patient, finding similarly positive results: a 3.5 cm diameter skin tumour completely disappeared using electrons from a 5.6 MeV linear accelerator, “with nearly no side effects”. The challenge now is to reach deeper tumours.

Now, using high-gradient “X-band” radio-frequency cavity technology developed for CLIC, CHUV has teamed up with CERN to develop a facility that can produce electron beams with energies around 100 MeV, in order to reach tumour depths of up to 20 cm. The idea came about three years ago, when it was realised that CLIC technology was almost a perfect match for what CHUV was looking for: a compact accelerator that uses X-band technology to reach high energies over a short distance, and that delivers the high beam current needed to treat a larger tumour volume in a single ultrafast burst.

“CLIC has the ability to accelerate a large amount of charge to get enough luminosity for physics studies,” explains Walter Wuensch of CERN, who heads the FLASH project at CERN. “People tend to focus on the accelerating gradient, but as important, or arguably more important, is the ability to control high-current, low-emittance beams.”

It really looks like it has the potential to be an important complement to existing radiation therapies

The first phase of the collaboration is nearing completion, with a conceptual design report, funded by CHUV, being prepared jointly by CERN and CHUV. The development and construction of the first facility, which would be housed at CHUV, is expected to cost around €25 million, and CHUV aims to complete the facility within three years.

“The intention of CERN and the team is to be heavily involved in the process of getting the facility built and operating,” states Wuensch. “It really looks like it has the potential to be an important complement to existing radiation therapies.”

Cancer therapies have taken advantage of particle accelerators for many decades, with proton radiotherapy entering the scene in the 1990s. The CERN-based Proton-Ion Medical Machine Study, spawned by the TERA Foundation, resulted in the National Centre for Cancer Hadron Therapy (CNAO) in Italy and MedAustron in Austria, which have made significant progress in the field of proton and ion therapy. FLASH radiotherapy would add high-energy electrons to the growing set of particle-therapy modalities.

TESLA’s high-gradient march

Superconducting RF cavities

Energetic beams of charged particles are essential for high-energy physics research, as well as for studies of nuclear structure and dynamics, and deciphering complex molecular structures. In principle, generating such beams is simple: provide an electric field for acceleration and a magnetic field for bending particle trajectories. In practice, however, the task becomes increasingly challenging as the desired particle energy goes up. Very high electric fields are required to attain the highest energy beams within practical real-estate constraints.

The most efficient way to generate the very high electric fields required to transport a beam in a vacuum environment is to build up a resonant excitation of radio waves inside a metallic cavity. There is something of an art to shaping such cavities to “get the best bang for the buck” for a particular application. The radio-frequency (RF) fields are inherently time-varying, and bunches of charged particles need to arrive with the right timing if they are to see only forward-accelerating electric fields. The desired very high resonant electric fields (e.g. 5–40 MV/m) require very high currents in the cavity walls. These currents are simply not sustainable for long durations using even the best normal-conducting materials, as they would melt from resistive heating.
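The timing condition can be sketched with the usual energy-gain relation ΔE = qE₀L·cos φ: a bunch arriving at a phase φ away from the RF crest sees only the projection of the field. A minimal illustration (the gradient, structure length and phases are made-up values, not from the article):

```python
import math

# Hedged sketch of the RF timing condition: a particle crossing a
# structure of effective length L at gradient E0 gains
#   dE = q * E0 * L * cos(phi),
# where phi is the arrival phase relative to the RF crest. Numbers
# below are illustrative, not from the article.

def energy_gain_mev(e0_mv_per_m, length_m, phase_deg):
    """Energy gain in MeV per unit charge for a given arrival phase."""
    return e0_mv_per_m * length_m * math.cos(math.radians(phase_deg))

# A 1 m structure at 25 MV/m, on crest vs 60 degrees off crest:
print(energy_gain_mev(25.0, 1.0, 0.0))   # 25 MeV
print(energy_gain_mev(25.0, 1.0, 60.0))  # 12.5 MeV
```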

Superconducting materials, on the other hand, can support sustainable high-accelerating gradients with an affordable electricity bill. Early pioneering work demonstrating the first beam-acceleration using superconducting radio-frequency (SRF) cavities took place in the late 1960s and early 1970s at Stanford, Caltech, the University of Wuppertal and Karlsruhe. The potential for real utility was clear, but techniques and material refinements were needed. Several individual laboratories began to take up the challenge for their own research needs. Solutions were developed for electron acceleration at CESR, HERA, TRISTAN, LEP II and CEBAF, while heavy-ion SRF acceleration solutions were developed at Stony Brook, ATLAS, ALPI and others. The community of SRF accelerator physicists was small but the lessons learned were consistently shared and documented. By the early 1990s, SRF technology had matured such that complex large-scale systems were credible and the variety of designs and applications began to blossom.

The TESLA springboard

In 2020, the TESLA Technology Collaboration (TTC) celebrates 30 years of collaborative efforts on SRF technologies. The TTC grew out of the first international TESLA (TeV Energy Superconducting Linear Accelerator) workshop, which was held at Cornell University in July 1990. Its aim was to define the parameters for a superconducting linear collider for high-energy physics operating in the TeV region and to explore how to increase the gradients and lower the costs of the accelerating structures. It was clear from the beginning that progress would require a large international collaboration, and the Cornell meeting set in motion a series of successes that are ongoing to this day – including FLASH and the European XFEL at DESY. The collaboration also led to proposals for several large SRF-based research facilities including SNS, LCLS-II, ESS, PIP-II and SHINE, as well as a growing number of smaller facilities around the world.

Accelerating gradients above 40 MV/m are now attainable with niobium

At the time of the first TESLA collaboration meeting, the state-of-the-art in accelerating gradients for electrons was around 5 MV/m in the operating SRF systems of TRISTAN at KEK, HERA at DESY, LEP-II at CERN and CEBAF at Jefferson Lab (JLab), which were then under construction. Many participants in this meeting agreed to push for a five-fold increase in the design accelerating gradient to 25 MV/m to meet the dream goal for TESLA at a centre-of-mass energy of 1 TeV. The initial focus of the collaboration was centred on the design, construction and commissioning of a technological demonstrator, the TESLA Test Facility (TTF) at DESY. In 2004, SRF was selected as the basis for an International Linear Collider (ILC) design and, shortly afterwards, the TESLA collaboration was re-formed as the TESLA Technology Collaboration with a scope beyond the original motivation of high-energy physics. The TTC, with its incredible worldwide collaboration spirit, has had a major role in the growth of the SRF community, facilitating numerous important contributions over the past 30 years.

30 years of gradient march

Conceptually, the objective of simply providing “nice clean” niobium surfaces on RF structures seems pretty straightforward. Important subtleties begin to emerge, however, as one considers that the high RF-surface currents required to support magnetic fields up to ~100 mT flow only in the top 100 nm of the niobium surface, which must offer routine surface resistances at the nano-ohm level over areas of around 1 m². Achieving blemish-free, contamination-free surfaces that present excellent crystal lattice structure even in this thin surface layer is far from easy.
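The stakes of nano-ohm surface resistance can be made concrete with the standard expression for the RF power dissipated per unit area of cavity wall, P = ½R\_sH². A hedged estimate using the typical field level quoted above (the surface-resistance value is an illustrative assumption):

```python
import math

# Hedged estimate of RF power dissipated in the cavity wall per unit
# area, P = 0.5 * Rs * H^2 (standard surface-resistance formula).
# The 10 nano-ohm value is illustrative, not from the article.
MU0 = 4e-7 * math.pi       # vacuum permeability, H/m

rs_ohm = 10e-9             # surface resistance, ~nano-ohm level
b_peak_t = 0.1             # ~100 mT peak surface magnetic field
h_peak = b_peak_t / MU0    # surface magnetic field in A/m

p_per_m2 = 0.5 * rs_ohm * h_peak**2
print(p_per_m2)            # ~32 W per m^2 at the field peak
```

Even at nano-ohm resistance, watts of heat appear at 2 K, where every watt removed costs roughly a kilowatt of refrigeration power at room temperature; any degradation of the top 100 nm multiplies that bill directly.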

The march of progress in cavity gradient for linacs and the many representative applications over the past 50 years (see figure “Gradient growth”) are due to breakthroughs in three main areas: material purity, fabrication and processing techniques. The TTC had a major impact on each of these areas.

RF linac accelerating gradient achievements

With some notable exceptions, bulk niobium cavities fabricated from sheet stock material have been the standard, even though the required metallurgical processes present challenges. Cycles of electron-beam vacuum refining, rolling and intermediate anneals are provided by only a few international vendors. Pushing up the purity of deliverable material required a concerted effort, resulting in the avoidance of foreign-material inclusions, which can be deadly to performance when uncovered in the final step of surface processing. The figure-of-merit for purity is the ratio of room-temperature to cryogenic normal-conducting resistivity – the residual resistance ratio, RRR. The common cavity-grade niobium material specification has thus come to be known as high-RRR grade.
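As a concrete illustration of this figure of merit (the numbers below are typical textbook values and a common specification, not figures from the article), the RRR simply measures how much resistivity remains at cryogenic temperatures once phonon scattering has frozen out, leaving only impurities and lattice defects:

```python
# Residual resistance ratio: RRR = rho(room T) / rho(normal state, ~4.2 K).
# Illustrative values only -- not taken from the article.
rho_room = 14.4e-8      # approximate room-temperature resistivity of Nb, ohm*m
rrr = 300               # a common "high-RRR" cavity-grade specification
rho_residual = rho_room / rrr
print(rho_residual)     # ~4.8e-10 ohm*m, set by impurities and defects
```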

A later pursuit of pure niobium is the so-called “large grain” or “direct-from-ingot” material. Rather than insist on a controlled ~30 µm grain-size distribution (grains being microcrystals in the structure), this material uses sheet slices cut directly from large ingots having much larger, but arbitrarily sized, grains. Although not yet widely used, this material has produced the highest gradient TESLA-style cavities to date – 45 MV/m with a quality factor Q0 > 10¹⁰. Here again, though the topic was initiated at JLab, this fruitful work was accomplished via worldwide international collaborations.

As niobium is a refractory metal that promptly cloaks itself with about 4 nm of dielectric oxide, welding niobium components has to be performed by vacuum electron beam welding. Collaborative efforts in Europe, North America and Asia refined the parameters required to yield consistent niobium welds. The community gradually realised that extreme cleanliness is required in the surface-weld preparation, since even microscopic foreign material will be vaporised during the weld process, leaving behind small voids that become performance-limiting defects.

Having the best niobium is not sufficient, however. Superconductors have inherent critical magnetic field limitations, or equivalently local surface-current density limitations. Because the current flow is so shallow, local magnetic field enhancements induced by microscopic topography translate into gradient-limiting quench effects. Etching of fabricated surfaces has routinely required a combination of hydrofluoric and nitric acids, buffered with phosphoric acid. This exothermic etching process inherently yields step-edge faceting at grain boundaries, which in turn creates local, even nanoscopic, field enhancements, anomalous losses and quenches as the mean surface field is increased. A progression of international efforts at KEK, DESY, CEA-Saclay and JLab eliminated this problem through the development of electro-polishing techniques. Following a deeper understanding of the underlying electrochemistry, accelerating gradients above 40 MV/m are now attainable with niobium.

Another vexing problem that TTC member institutions helped to solve was the “Q-drop” observed in the region of high surface magnetic field. Present explanations point to subtle migration of near-surface oxygen deeper into the lattice, where it inhibits the subsequent formation of lossy nanohydrides on cool-down. Avoiding these nanohydrides, whose proximity-effect superconductivity breaks down in the Q-drop regime, is required to sustain accelerating gradients above 25 MV/m in some structures.

Cleaning up

TTC members have also shared analyses and best practices in cleaning and cleanroom techniques, which have evolved dramatically during the past 30 years. This has helped to beat down the most common challenge for developers and users of SRF accelerating cavities: particulate-induced field emission, whereby very high peak surface electric fields can turn even micron-scale foreign material into parasitic electron field emission sources, with resulting cryogenic and radiation burdens. Extended interior final rinsing with high-pressure ultra-pure water prior to cavity assembly has become standard practice, while preparation and assembly of all beamline vacuum hardware under ISO 4 cleanroom conditions is necessary to maintain these clean surfaces for accelerator operations.

ESS elliptical section

The most recent transformation has come with the recognition that interstitial doping of the niobium surface with nitrogen can reduce the SRF surface resistance much more than was dreamed possible, cutting the cryogenic heat load that must be cooled away. While still the subject of materials research, this new capability was rapidly adopted into the specification for LCLS-II cavities and is also being considered for an ILC. The effort started in the US and quickly propagated internationally via the TTC, for example in cavity tests at the European Spallation Source (see “Vertical test” image). Earlier this year, Q-values of 3–4 × 10¹⁰ at 2 K and 30 MV/m were reported in TESLA-style cavities – representing tremendous progress, but with much optimisation still to be carried out.

One of the main goals of the TTC has been to bridge the gap between state-of-the-art R&D on laboratory prototypes and actual accelerator components in operating facilities, with the clear long-term objective to enable superconducting technology for a TeV-scale linear collider. This objective demanded a staged approach and intense work on the development of all the many peripherals and subcomponents. The collaboration embraced a joint effort between the initial partners to develop the TTF at DESY, which aimed to demonstrate reliable operation of an electron superconducting linac at gradients above 15 MV/m in “vector sum” control – whereby many cavities are fed by a single high-power RF source to improve cost effectiveness. In 1993 the collaboration finalised a 1.3 GHz cavity design that is still the baseline of large projects like the European XFEL, LCLS-II and SHINE, and nearly all L-band-based facilities.

Towards a linear collider

An intense collaborative effort started for the development of all peripheral components, for example power couplers, high-order mode dampers, digital low-level RF systems and cryomodules with unprecedented heat load performances. Several of these components were designed by TTC partners in an open collaborative and competitive effort, and a number of them can be found in existing projects around the world. The tight requirements imposed by the scale of a linear collider required an integrated design of the accelerating modules, containing the cavities and their peripheral components, which led to the concept of the “TESLA style” cryomodules, variants of which provide the building blocks of the linacs in TTF, European XFEL, LCLS-II and SHINE.

Half-wave resonator string assembly

The success of the TTF, which delivered its first beam in 1997, led it to become the driver for a next-generation light source at DESY, the VUV-FEL, which produced first light in 2005 and later became the FLASH facility. The European XFEL built on this strong heritage, its large scale demanding a new level of design consolidation and industrialisation. It is remarkable to note that the total number of such TESLA-style cavities installed or to be installed in presently approved accelerators is more than 1800. Were a 250 GeV ILC to go ahead in Japan, approximately 8000 such units would be required. (Note that an alternative proposal for a high-energy linear collider, the Compact Linear Collider, relies on a novel two-beam acceleration scheme that does not require SRF cavities.)

Since the partners collaborating on the early TESLA goal of a linear collider were also involved in other national and international projects for a variety of applications and domains, the first decade of the 21st century saw the TTC broaden its reach. For example, we started including reports from other projects, most notably the US Spallation Neutron Source, and gradually opened to the community working on low-beta ion and proton superconducting cavities, such as the half-wave resonator string collaboratively developed at Argonne National Lab and now destined for use in PIP-II at Fermilab (see “Low-beta cavities” image). TTC meetings include topical sessions with industries to discuss how to shorten the path from development to production. Recently, the TTC has also begun to facilitate collaborative exchanges on alternative SRF materials to bulk niobium, such as Nb3Sn and even hybrid multilayer films, for potential accelerator applications.

Sustaining success

The mission of the TTC is to advance SRF technology R&D and related accelerator studies across a broad diversity of scientific applications, and to provide a bridge for open communication and sharing of ideas, developments and testing across associated projects. The TTC supports and encourages the free and open exchange of scientific and technical knowledge, engineering designs and equipment, grounded in cooperative work on SRF accelerator technology by research groups at member laboratories and test facilities. The current TTC membership consists of 60 laboratories and institutes in 12 countries across Europe, North America and Asia. Since progress in cavity performance and related SRF technologies is so rapid, the major TTC meetings have been frequent.

Distribution of superconducting particle accelerators

Particle accelerators using SRF technologies have been applied widely, from small facilities for medical applications up to large-scale projects for particle physics, nuclear physics, neutron sources and free-electron lasers (see “Global view” figure). Five large-scale (> 100 cavities) SRF projects are currently under construction in three regions: ESS in Europe, FRIB and LCLS-II in the US, and SHINE (China) and RAON (Korea) in Asia. Close international collaboration will continue to support progress in these and future projects, including SRF thin-film technology relevant for a possible future circular electron–positron collider. Perhaps the next wave of SRF technology will be the maturation of economical small-scale applications with high multiplicity and international standards. As an ultimate huge future SRF project, realising an ILC will indeed require sustained broad international collaboration.

The open and free-exchange model that for 30 years has enabled the TTC to make broad progress in SRF technology is a major contribution to science diplomacy efforts on a worldwide scale. We celebrate the many creative and collaborative efforts that have served the international community well via the TESLA Technology Collaboration.

Spiralling into the femtoscale

Radio-frequency quadrupole

Nuclear physics is as wide-ranging and relevant today as ever before in the century-long history of the subject. Researchers study exotic systems from hydrogen-7 to the heaviest nuclides at the boundaries of the nuclear landscape. By constraining the nuclear equation of state using heavy-ion collisions, they peer inside stars in controlled laboratory tests. By studying weak nuclear processes such as beta decays, they can even probe the Standard Model of particle physics. And this is not to mention numerous applications in accelerator-based atomic and condensed-matter physics, radiobiology and industry. These nuclear-physics research areas are just a selection of the diverse work done at the Grand Accélérateur National d’Ions Lourds (GANIL), in Caen, France.

GANIL has been operating since 1983, initially using four cyclotrons, with a fifth, the Cyclotron pour Ions de Moyenne Energie (CIME), added in 2001. The latter is used to reaccelerate short-lived nuclei produced using beams from the other cyclotrons – the Système de Production d’Ions Radioactifs en Ligne (SPIRAL1) facility. The beams produced by these cyclotrons feed eight beamlines equipped with specialised instrumentation. Parallel operation allows three experiments to run simultaneously, thereby optimising the available beam time. These facilities provide both high-intensity stable-ion beams, from carbon-12 to uranium-238, and lower intensity radioactive-ion beams of short-lived nuclei, with lifetimes from milliseconds to seconds, such as helium-6, helium-8, silicon-42 and nickel-68. Coupled with advanced detectors, all these beams allow nuclei to be explored in terms of excitation energy, angular momentum and isospin.

The new SPIRAL2 facility, which is currently being commissioned, will take this work into the next decade and beyond. The most recent step forward is the beam commissioning of a new superconducting linac – a major upgrade to the existing infrastructure. Its maximum beam intensity of 5 mA, or 3 × 10¹⁶ particles per second, is more than two orders of magnitude higher than at the previous facility. The new beams and state-of-the-art detectors will allow physicists to explore phenomena at the femtoscale right up to the astrophysical scale.
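The quoted intensity follows directly from the beam current. As a quick sanity check (assuming singly charged particles):

```python
# Sanity check of the quoted figure: beam current divided by the charge
# per particle gives the particle rate (singly charged particles assumed).
E_CHARGE = 1.602176634e-19   # elementary charge, C (exact SI value)

current_a = 5e-3             # 5 mA beam current
rate = current_a / E_CHARGE
print(f"{rate:.2e}")         # ~3.1e16 particles per second
```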

Landmark facility

SPIRAL2 was approved in 2005. It now joins a roster of cutting-edge European nuclear-physics-research facilities which also features the Facility for Antiproton and Ion Research (FAIR), in Darmstadt, Germany, ISOLDE and nTOF at CERN, and the Joint Institute for Nuclear Research (JINR) in Russia. Due to their importance in the European nuclear-physics roadmap, SPIRAL2 and FAIR are both now recognised as European Strategy Forum on Research Infrastructures (ESFRI) Landmark projects, alongside 11 other facilities, including accelerator complexes such as the European X-Ray Free-Electron Laser, and telescopes such as the Square Kilometre Array.

Construction began in 2011. The project was planned in two phases: the construction of a linac for very-high-intensity stable beams, and the associated experimental halls (see “High intensity” figure); and infrastructure for the reacceleration of short-lived fission fragments, produced using deuteron beams on a uranium target through one of the GANIL cyclotrons. Though the second phase is currently on hold, SPIRAL2’s new superconducting linac is now in a first phase of commissioning.

Superconducting linac and experimental halls

Most linacs are optimised for a beam with specific characteristics, which is supplied time and again by an injector. The particle species, velocity profile of the particles being accelerated and beam intensity all tend to be fixed. By tuning the phase of the electric fields in the accelerating structures, charged particles surf on the radio-frequency waves in the cavities with optimal efficiency in a single pass. Though this is the case for most large projects, such as Linac4 at CERN, the Spallation Neutron Source (SNS) in the US and the European Spallation Source in Sweden, SPIRAL2’s linac (see “Multitasking” figure) has been designed for a wide range of ions, energies and intensities.

The multifaceted physics criteria called for an original design featuring a compact multi-cryostat structure for the superconducting cavities, which was developed in collaboration with fellow French national organisations CEA and CNRS. Though the 19 cryomodules are comparable in number to the 23 employed by the larger and more powerful SNS accelerator, the new SPIRAL2 linac has far fewer accelerating gaps. On the other hand, compared to normal-conducting cavities such as those used by Linac4, the power consumption of the superconducting structures at SPIRAL2 is significantly lower, and the linac conforms to additional constraints on the cryostat’s design, operation and cleanliness. The choice of superconducting rather than room-temperature cavities is ultimately linked not only to the need for higher beam intensities and energies, but also to the potential for the larger apertures needed to reduce beam losses.

SPIRAL2 joins a roster of cutting-edge European nuclear-physics-research facilities

Beams are produced using two specialised ion sources. At 200 kW in continuous-wave (CW) mode, the beam power is high enough to make a hole in the vacuum chamber in less than 35 µs, placing additional severe restrictions on the beam dynamics. The operation of high beam intensities, up to 5 mA, also causes space-charge effects that need to be controlled to avoid a beam halo which could activate accelerator components and generate neutrons – a greater difficulty in the case of deuteron beams.
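The 35 µs figure can be made plausible with a rough energy budget (the heat capacity, temperature rise and spot mass below are round illustrative values, not figures from the article):

```python
# Hedged back-of-envelope for the 35 microsecond figure: at 200 kW the
# beam dumps ~7 J in that time. Concentrated on a small spot, that is
# already enough to heat milligrams of steel to its melting point.
# Material numbers are rough illustrative values.
beam_power_w = 200e3
t_s = 35e-6
energy_j = beam_power_w * t_s        # energy deposited in 35 us

c_steel = 500.0      # J/(kg K), rough specific heat of steel
dT = 1500.0          # K, roughly room temperature to melting
spot_mass_kg = energy_j / (c_steel * dT)

print(energy_j)            # ~7 J
print(spot_mass_kg * 1e6)  # ~9 mg brought to melting temperature
```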

For human safety and ease of technical maintenance, beam losses need to be kept below 1 W/m. Here, the SPIRAL2 design has synergies with several other high-power accelerators, leading to improvements in the design of quarter-wave resonator cavities. These are used at heavy-ion accelerators such as the Facility for Rare Isotope Beams in the US and the Rare Isotope Science Project in Korea; for producing radioactive-ion beams and improving beam dynamics at intense-light particle accelerators worldwide; for producing neutrons at the International Fusion Materials Irradiation Facility, the ESS, the Myrrha Multi-purpose Hybrid Research Reactor for High-tech Applications, and the SNS; and for a large range of studies relating to materials properties and the generation of nuclear power.

Beam commissioning

Initial commissioning of the linac began by sending beams from the injector to a dedicated system with various diagnostic elements. The injector was successfully commissioned with a range of CW beams, including a 5 mA proton beam, a 2 mA alpha-particle beam, a 0.8 mA oxygen-ion beam and a 25 µA argon-ion beam. In each case, almost 100% transmission was achieved through the radio-frequency quadrupoles. Components of the linac were installed, the cryomodules cooled to liquid-helium temperatures (4.5 K), and the mechanical stability required to operate the 26 superconducting cavities at their design specifications demonstrated.

Superconducting cryomodules

As GANIL is a nuclear installation, the injection of beams into the linac required permission from the French nuclear-safety authority. Following a rigorous six-year authorisation process, commissioning through the linac began in July 2019. An additional prerequisite was that a large number of safety systems be validated and put into operation. The key commissioning step completed so far is the demonstration of the cavity performance at 8 MV/m – a competitive electric field well above the required 6.5 MV/m. The first beam was injected into the linac in late October 2019. The cavities were tuned, and a low-intensity 200 µA beam of protons was accelerated to the design value of 33 MeV and sent to a first test experiment in the Neutrons For Science (NFS) area. A team from the Nuclear Physics Institute in Prague irradiated copper and iron targets, and the reaction products were transported by a fast automatic system to a point 40 m away, where their characteristic γ-decay was measured. Precise measurements of such cross-sections are important in order to benchmark safety codes required for the operation of nuclear reactors.

SPIRAL2 is now moving towards its design power by gradually increasing the proton beam current and subsequently the duty cycle of the beam – the ratio of pulse duration to the period of the waveform. A similar procedure with alpha particles and deuteron beams will then follow. Physics programmes will begin in autumn next year.
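The arithmetic behind this ramp-up is simple: average beam power is the product of beam current, energy gain per charge and duty cycle. A minimal sketch – the 5 mA and 33 MeV values are from the text, while the commissioning current and duty cycle used below are illustrative assumptions:

```python
# Hedged sketch of how average beam power scales during the SPIRAL2 ramp-up.
# For singly charged protons, current (A) x energy (eV) gives power in watts.

def average_beam_power(current_a, energy_ev, duty_cycle):
    """Average power = current x energy per charge x duty cycle (protons, q = 1)."""
    return current_a * energy_ev * duty_cycle

# Design current of 5 mA at 33 MeV in CW mode (duty cycle = 1):
p_cw = average_beam_power(5e-3, 33e6, 1.0)
print(f"{p_cw / 1e3:.0f} kW")  # 165 kW

# Illustrative early-commissioning point: 200 uA at a 1% duty cycle
p_low = average_beam_power(200e-6, 33e6, 0.01)
print(f"{p_low:.0f} W")  # 66 W
```

The same relation explains why the duty cycle and the current are raised gradually: each factor multiplies directly into the power deposited if the beam is lost.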

Future physics

With the new superconducting linac, SPIRAL2 will provide intense beams from protons to nickel – up to 14.5 MeV/A for heavy ions – and continuous and quasi-mono-energetic beams of neutrons up to 40 MeV. With state-of-the-art instrumentation such as the Super Separator Spectrometer (S3), the charged-particle beams will allow the study of very rare events in the intense background of the unreacted beam, with a signal-to-background fraction of 1 in 10¹³. The charged-particle beams will also characterise exotic nuclei with properties very different from those found in nature. This will address questions related to heavy and super-heavy element/isotope synthesis at the extreme boundaries of the periodic table, and the properties of nuclei such as tin-100, which have the same number of neutrons and protons – a far cry from naturally existing isotopes such as tin-112 and tin-124. Here, ground-state properties such as the mass of nuclei must be measured with a precision of one part in 10⁹ – a level of precision equivalent to observing the addition of a pea to the weight of an Airbus A380. SPIRAL2’s low-energy experimental hall for the disintegration, excitation and storage of radioactive ions (DESIR), which is currently under construction, will further facilitate detailed studies of the ground-state properties of exotic nuclei fed both by S3 and SPIRAL1, the existing upgraded reaccelerated exotic-beams facility. The commissioning of S3 is expected in 2023 and experiments in DESIR in 2025. In parallel, a continuous improvement in the SPIRAL2 facility will begin with the integration of a new injector to substantially increase the intensity of heavy-ion beams.
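As a rough sanity check of the analogy, the mass ratio can be computed directly; the masses below are assumptions chosen for illustration (a pea of about 0.5 g, an A380 maximum take-off weight of about 575 t):

```python
# Back-of-envelope check of the "pea on an A380" analogy for a
# one-part-in-10^9 mass measurement. Both masses are assumed values.
pea_kg = 0.5e-3    # ~0.5 g, assumed
a380_kg = 575e3    # ~575 t maximum take-off weight, assumed
ratio = pea_kg / a380_kg
print(f"{ratio:.1e}")  # ~8.7e-10, i.e. of order one part in 10^9
```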

Properties must be measured with a level of precision equivalent to observing the addition of a pea to the weight of an Airbus A380

Thanks to its very high neutron flux – up to two orders of magnitude higher, in the energy range between 1 and 40 MeV, than at facilities like LANSCE at Los Alamos, nTOF at CERN and GELINA in Belgium – SPIRAL2 is also well suited for applications such as the transmutation of nuclear waste in accelerator-driven systems, the design of present and next-generation nuclear reactors, and the effect of neutrons on materials and biological systems. Light-ion beams from the linac, including alpha particles and lithium-6 and lithium-7 impinging on lead and bismuth targets, will also be used to investigate more efficient methods for the production of certain radioisotopes for cancer therapy.

Developments at SPIRAL2 are quickly moving forwards. In September, the control of the full emittance and space-charge effects was demonstrated – a crucial step to reach the design performance of the linac – and a first neutron beam was produced at NFS, using proton beams. The future looks bright. With the new SPIRAL2 superconducting linac now supplementing the existing cyclotrons, GANIL provides an intensity and variety of beams that is unmatched in a single laboratory, making it a uniquely multi-disciplinary facility in the world today.

Electron makeover proposed for the SPS

eSPS

CERN’s Super Proton Synchrotron (SPS) could be upgraded to accelerate electrons as well as protons. A 173-page conceptual design report posted on arXiv on 15 September describes the installation of a high-energy electron accelerator that could be used for accelerator R&D, dark-sector physics and electro-nuclear measurements crucial for future neutrino experiments. The “eSPS”, proposed in 2018 by Torsten Åkesson of Lund University and colleagues at CERN, would marry technology developed for the Compact Linear Collider (CLIC) and the Future Circular Collider (FCC), and could also provide a step towards a potential electron-positron Higgs factory. The facility could be made operational in about five years and would run in parallel with, and without interference to, the next run of the LHC, Run 4, write the authors.

The SPS is one of CERN’s longest-running accelerators, commissioned in June 1976 at an energy of 400 GeV and serving numerous fixed-target experiments ever since. It was later converted into a proton-antiproton collider, which was used to discover the W and Z bosons in 1983. Then, in addition to its fixed-target programme, the SPS became part of the injection chain for LEP, and most recently, has been used to accelerate protons for the LHC.

The changeover time for using the SPS as a proton accelerator to an electron accelerator is estimated to be around ten minutes

Electrons would be injected into the SPS at an energy of 3.5 GeV by a new compact high-gradient linac based on CLIC’s X-band radio-frequency (RF) cavity technology, which would fill the circular machine with 200 ns-duration pulses at a rate of 100 Hz. An additional 800 MHz superconducting RF system, similar to what is needed for FCC-ee, would then accelerate the electron beam from 3.5 GeV to an extraction energy up to 18 GeV. The changeover time for using the SPS as a proton accelerator to an electron accelerator is estimated to be around ten minutes.

Serving experiments

The requirements of the primary electron beam to be delivered by the eSPS were determined by the needs of the proposed Light Dark Matter eXperiment (LDMX), which would use missing-momentum techniques to explore potential couplings between hidden-sector particles and electrons in uncharted regions. The experiment could be housed in a new experimental area (see figure). The beam directly from the linac could also serve two experimental areas for a broad range of accelerator R&D; for example, it could provide multi-GeV drive beam bunches and electron witness bunches for plasma wakefield acceleration.

In a second phase, the facility could be geared to deliver positron witness bunches, which would make it a “complete facility” for plasma wakefield collider studies. Such a programme would naturally build on the work done by the AWAKE collaboration, which uses protons as a drive beam, and significantly broaden plasma wakefield R&D at CERN in line with priorities set out by the recent update of the European strategy for particle physics. Positron production would be a crucial element for any future Higgs factory, and would also allow studies of the Low EMittance Muon Accelerator (LEMMA) – a novel scheme for obtaining a low-emittance muon beam for a muon collider by colliding a high-energy positron beam with electrons in a fixed-target configuration at the centre-of-mass energy required to create muon pairs.
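The positron energy this scheme requires can be estimated with elementary kinematics (this derivation is a back-of-envelope sketch, not from the report). For a positron beam of energy E hitting electrons at rest, the invariant mass squared is s = 2m_e² + 2m_e·E, and muon-pair production needs √s ≥ 2m_μ:

```python
# Threshold positron energy for e+ e- -> mu+ mu- on fixed-target electrons.
# Masses in GeV (standard PDG values); solving sqrt(s) = 2*m_mu for E.
m_e = 0.000511   # electron mass, GeV
m_mu = 0.105658  # muon mass, GeV

E_threshold = ((2 * m_mu) ** 2 - 2 * m_e ** 2) / (2 * m_e)
print(f"{E_threshold:.1f} GeV")  # ~43.7 GeV
```

This is why LEMMA-type schemes call for a positron beam of roughly 45 GeV – just above threshold, so the muon pairs emerge with very small transverse momentum and hence low emittance.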

The eSPS proposal came about as a result of work in CERN’s Physics Beyond Colliders study group, and an Expression of Interest that was submitted to the SPS Committee in September 2018.

CERN and quantum technologies

AEGIS experiment

Quantum technologies, which exploit inherent phenomena of quantum mechanics such as superposition and entanglement, have the potential to transform science and society over the next five to 10 years. This is sometimes described as the second quantum revolution, following the first that included the introduction of devices such as lasers and transistors over the past half century. Quantum technologies (QTs) require resources that are not mainstream today. During the past couple of years, dedicated support for R&D in QTs has become part of national and international research agendas, with several major initiatives underway worldwide. The time had come for CERN to engage more formally with such activities.

Following a first workshop on quantum computing in high-energy physics organised by CERN openlab in November 2018, best-effort initiatives, events and joint pilot projects have been set up at CERN to explore the interest of the community in quantum technologies (in particular quantum computing), as well as possible synergies with other research fields. In June, CERN management announced the CERN quantum technology initiative. CERN is in the unique position of having in one place the diverse set of skills and technologies – including software, computing and data science, theory, sensors, cryogenics, electronics and material science – necessary for a multidisciplinary endeavour like QT. CERN also has compelling use cases that create ideal conditions to compare classic and quantum approaches to certain applications, and has a rich network of academic and industry relations working in unique collaborations such as CERN openlab.

Alberto Di Meglio

Today, QT is organised into four main domains. One is computing, where quantum phenomena such as superposition are used to speed up certain classes of computational problems beyond the limits achievable with classical systems. A second is quantum sensing and metrology, which exploits the high sensitivity of coherent quantum systems to design new classes of precision detectors and measurement devices. The third is quantum communication, whereby single or entangled photons and their quantum states are used to implement secure communication protocols across fibre-optic networks, or quantum memory devices able to store quantum states. The fourth domain is quantum theory, simulation and information processing, where well-controlled quantum systems are used to simulate or reproduce the behaviour of different, less accessible, many-body quantum phenomena, and relations between quantum phenomena and gravitation can be explored – a topic at the heart of CERN’s theoretical research programme. There is much overlap between these four domains; for example, quantum sensors and networks can be brought together to create potentially very precise, large-scale detector systems.

Over the next three years, the quantum technology initiative will assess the potential impact of QTs on CERN and high-energy physics on the timescale of the HL-LHC and beyond. After establishing governance and operational instruments, the initiative will work to define concrete R&D objectives in the four main QT areas by the end of this year. It will also develop an international education and training programme in collaboration with leading experts, universities and industry, and identify mechanisms for knowledge sharing within the CERN Member States, the high-energy physics community, other scientific research communities and society at large. Graduate students will be selected in time for the first projects to begin in early 2021.

Joint initiatives

A number of joint collaborations are already being created across the high-energy physics community and CERN is involved in several pilot investigation projects with leading academic and research centres. On the industry side, through CERN openlab, CERN is already collaborating on quantum-related technologies with CQC, Google, IBM and Intel. The CERN quantum technology initiative will continue to forge links with industry and collaborate with the main national quantum initiatives worldwide.

Quantum technologies have the potential to transform science and society over the next five to 10 years

By taking part in this rapidly growing field, CERN not only has much to offer, but also stands to benefit directly from it. For example, QTs have strong potential in supporting the design of new sophisticated types of detectors, or in tackling the computing workloads of the physics experiments more efficiently. The CERN quantum technology initiative, by helping structure and coordinate activities with our community and the many international public and private initiatives, is a vital step to prepare for this exciting future.

ESS under construction

Aerial view of the ESS

Just a few years after the discovery of the neutron by James Chadwick in 1932, investigations into the properties of neutrons by Fermi and others revealed the strong energy dependence of the neutron’s interactions with matter. This knowledge enabled the development of sustainable neutron production by fission, opening the era of atomic energy. The first nuclear-fission reactors in the 1940s were also equipped with the capacity for materials irradiation, and some provided low-energy (thermal) neutron beams of sufficient intensity for studies of atomic and molecular structure. Despite the high cost of investment in nuclear-research reactors, neutron science flourished to become a mainstay among large-scale facilities for materials research around the world.

The electrical neutrality of neutrons allows them to probe deep into matter in a non-destructive manner, where they scatter off atomic nuclei to reveal important information about atomic and molecular structure and dynamics. Neutrons also carry a magnetic moment. This property, combined with their absence of electric charge, makes neutrons uniquely sensitive to magnetism at an atomic level. On the downside, the absence of electric charge means that neutron-scattering cross-sections are much weaker than they are for X-rays and electrons, making neutron flux a limiting factor in the power of this method for scientific research.

ESS site layout

Throughout the 1950s and 1960s, incremental advances in the power of nuclear-research reactors and improvements in moderator design provided increasing fluxes of thermal neutrons. In Europe these developments culminated in the construction of the 57 MW high-flux reactor (HFR) at the Institut Laue-Langevin (ILL) in Grenoble, France, with a compact core containing 9 kg of highly enriched uranium enabling neutron beams with energies from around 50 μeV to 500 meV. When the HFR came into operation in 1972, however, it was clear that nuclear-fission reactors were already approaching their limit in terms of steady-state neutron flux (roughly 1.5 × 10¹⁵ neutrons per cm² per second).

Spallation has long been hailed as the method with the potential to push through to far greater neutron fluxes

In an effort to maintain pace with advances in other methods for materials research, such as synchrotron X-ray facilities and electron microscopy, accelerator-based neutron sources were established in the 1980s in the US (IPNS and LANSCE), Japan (KENS) and the UK (ISIS). Spallation has long been hailed as the method with the potential to push through to far greater neutron fluxes, and hence to provide a basis for continued growth of neutron science. However, after nearly 50 years of operation, and with 10 more modern medium- to high-flux neutron sources (including five spallation sources) in operation around the world, the HFR is still the benchmark source for neutron-beam research. Of the spallation sources, the most powerful (SNS at Oak Ridge National Laboratory in the US and J-PARC in Japan) have now been in operation for more than a decade. SNS has reached its design power of 1.4 MW, and J-PARC is planning for tests at 1 MW. At these power levels the sources are competitive with ILL for leading-edge research. It has long been known that the establishment of a new high-flux spallation neutron facility is needed if European science is to avoid a severe shortage in access to neutron science in the coming years (CERN Courier May/June 2020 p49).

Unprecedented performance

The European Spallation Source (ESS), with a budget of €1.8 billion (2013 figures), is a next-generation high-flux neutron source that is currently entering its final construction phase. Fed by a 5 MW proton linac, and fitted with the most compact neutron moderator and matched neutron-transport systems, the facility at full power is predicted to deliver neutron beams more than two orders of magnitude brighter than those of the HFR.

Target station monolith

The idea for the ESS was advanced in the early 1990s. The decision in 2009 to locate it in Lund, Sweden, led to the establishment of an organisation to build and operate the facility (ESS AB) in 2010. Ground-breaking took place in 2014, and today construction is in full swing, with first science expected in 2023 and full user operation in 2026. The ESS is organised as a European Research Infrastructure Consortium (ERIC) and at present has 13 member states: Czech Republic, Denmark, Estonia, France, Germany, Hungary, Italy, Norway, Poland, Spain, Sweden, Switzerland and the UK. Sweden and Denmark are the host countries, providing nearly half of the budget for the construction phase. Around 70% of the funding from the non-host countries is in the form of in-kind contributions, meaning that the countries are delivering components, personnel or other support services to the facility rather than cash.

The unprecedented brightness of ESS neutrons will enable smaller samples, faster measurements and more complex experiments than is possible at existing neutron sources. This will inevitably lead to discoveries across a wide range of scientific disciplines, from condensed-matter physics, solid-state chemistry and materials sciences, to life sciences, medicine and cultural heritage. A wide range of industrial applications in polymer science and engineering are also anticipated, while new avenues in fundamental physics will be opened (see “Fundamental physics at the ESS” panel).

Fundamental physics at the ESS

The ESS will offer a multitude of opportunities for fundamental physics with neutrons, neutrinos and potentially other secondary particles from additional target stations. While neutron brightness and pulse time structure are key parameters for neutron scattering (the main focus of ESS experiments), the total intensity is more important for many fundamental-physics experiments.

A cold neutron-beam facility for particle physics called ANNI is proposed to allow precision measurements of the beta decay, hadronic weak interactions and electromagnetic properties of the neutron. ANNI will improve the accuracy of measurements of neutron beta decay by an order of magnitude. Experiments will probe a broad range of new-physics models at mass scales from 1 to 100 TeV, far beyond the threshold of direct particle production at accelerators, and resolve the tiny effects of hadronic weak interactions, enabling quantitative tests of the non-perturbative limit of quantum chromodynamics.

Another collaboration is proposing a two-stage experiment at the ESS to search for baryon-number violation. The first stage, HIBEAM, will look for evidence of sterile neutrinos. As a second stage, NNBAR could be installed at the large beam port to search for oscillations between neutrons and anti-neutrons. Observing such a transition would show that baryon number is violated by two units and that matter containing neutrons is unstable, potentially shedding light on the observed baryon asymmetry of the universe.

A design study, financed through the European Commission’s Horizon 2020 programme, is also under way for the ESS Neutrino Super Beam (ESSνSB) project. This ambitious project would see an accumulator ring and a separate neutrino target added to the ESS facility, with the aim of sending neutrinos to a large underground detector in mid-Sweden, 400–500 km from the ESS. Here, the neutrinos would be detected at their second oscillation maximum, giving the highest sensitivity for discovery and/or measurement of the leptonic CP-violating phase. The accumulator ring and the resulting short proton pulses needed by ESSνSB would also open up other kinds of fundamental physics, new perspectives in neutron scattering, and the possibility of muon storage rings.

Finally, a proposal has been submitted to ESS concerning coherent neutrino–nucleus scattering (CEνNS). The high proton beam power together with the 2 GeV proton energy will provide a 10 times higher neutrino flux from the spallation target than previously obtained for CEνNS. Measured for the first time by the COHERENT collaboration in 2017 at ORNL’s Spallation Neutron Source, CEνNS offers a new way to probe the properties of the neutrino including searches for sterile neutrinos and a neutrino magnetic moment, and could help reduce the mass of neutrino detectors.

From the start, the ESS has been driven by the neutron-scattering community, with strong involvement from all the leading neutron-science facilities around Europe. To maximise its scientific potential, a reference set of 22 instrument concepts was developed from which 15 instruments covering a wide range of applications were selected for construction. The suite includes three diffractometers for hard-matter structure determination, a diffractometer for macromolecular crystallography, two small-angle scattering instruments for the study of large-scale structures, two reflectometers for the study of surfaces and interfaces, five spectrometers for the study of atomic and molecular dynamics over an energy range from a few μeV to several hundred meV, a diffractometer for engineering studies and a neutron imaging station (see “ESS layout” figure). Given that the ESS target system has the capacity for two neutron moderators and that the beam extraction system allows viewing of each moderator by up to 42 beam ports, there is the potential for many more neutron instruments without major investment in the basic infrastructure. The ESS source also has a unique time structure, with far longer pulses than existing pulsed sources, and an innovative bi-spectral neutron moderator, which allows a high degree of flexibility in the choice of neutron energy.

Accelerator and target

Most of the existing spallation neutron sources use a linear accelerator to accelerate protons to high energies. The particles are stored in an accumulator ring and are then extracted in a short pulse (typically a few microseconds in length) to a heavy-metal spallation target such as tungsten or mercury, which have a high neutron yield. A notable exception is SINQ at PSI, which uses a cyclotron that produces a continuous beam.

A section of the cryogenic system

ESS has a linear accelerator but no accumulator ring, and it will thus have far longer proton pulses of 2.86 ms. This characteristic, combined with the 14 Hz repetition rate of the ESS accelerator, is a key advantage of the ESS for studies of condensed matter, because it allows good energy resolution and broad dynamic range. The result is a source with unprecedented flexibility to be optimised for studies from condensed-matter physics and solid-state chemistry, to polymers and the biological sciences with applications to medical research, industrial materials and cultural heritage. The ESS concept is also of major benefit for experiments in fundamental physics, where the total integrated flux is a main figure of merit.

The high neutron flux at ESS is possible because it will be driven by the world’s most powerful particle accelerator, in terms of MW of beam on target. It will have a proton beam of 62.5 mA accelerated to 2 GeV, with most of the energy gain coming from superconducting radio-frequency cavities cooled to 2 K. Together with its long pulse structure, this gives 5 MW average power and 125 MW of peak power. For proton energies around a few GeV, the neutron production is nearly proportional to the beam power, so the ratio between beam current and beam energy is to a large extent the result of a cost optimisation, while the pulse structure is set by requirements from neutron science.
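The quoted figures follow directly from the beam parameters. A quick sketch of the arithmetic:

```python
# ESS beam-power arithmetic from the parameters quoted in the text:
# 62.5 mA of protons at 2 GeV, 2.86 ms pulses at a 14 Hz repetition rate.
current = 62.5e-3       # A
energy = 2e9            # eV (A x eV gives W for singly charged protons)
pulse_length = 2.86e-3  # s
rep_rate = 14           # Hz

peak_power = current * energy             # W, during the pulse
duty_cycle = pulse_length * rep_rate      # fraction of time the beam is on
average_power = peak_power * duty_cycle   # W

print(f"peak power    {peak_power / 1e6:.0f} MW")    # 125 MW
print(f"duty cycle    {duty_cycle * 100:.1f} %")     # ~4.0 %
print(f"average power {average_power / 1e6:.1f} MW") # ~5.0 MW
```

The ~4% duty cycle is what reconciles the 125 MW peak with the 5 MW average quoted in the text.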

Linac installation

The neutrons are produced by spallation when the high-energy protons hit the rotating tungsten target. The 2.5 m-diameter target wheel consists of 36 sectors of tungsten blocks inside a stainless-steel disk. It is cooled by helium gas and rotates at approximately 0.4 Hz, such that successive beam pulses hit adjacent sectors, allowing adequate heat dissipation and limiting radiation damage. The neutrons enter moderator–reflector systems above or below the target wheel. The unique ESS “butterfly” moderator design consists of interpenetrating vessels of water and parahydrogen, and allows viewing of either or both vessels from a 120°-wide array of beam ports on either side. The moderator is only 3 cm high, ensuring the highest possible brightness. Each instrument is thus fed by an intense mix of thermal (room-temperature) and cold (20 K) neutrons that is optimised to its scientific requirements. The neutrons are transported to the instruments through neutron-reflecting guides that are up to 165 m long. Because of the weak scattering cross-sections, neutron optics are challenging, and the technology for transporting neutrons is correspondingly sophisticated. The guides consist of optically flat glass or metal channels coated with many thin alternating layers of nickel and titanium, in a sequence designed to enhance the critical angle for reflection. The optical properties of the guides allow for broad-spectrum focusing to maximise intensity for varying sample sizes, typically in the range from a few mm³ to several cm³.
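The wheel's rotation rate can be cross-checked against the pulse rate: at roughly 0.4 Hz, a 36-sector wheel turns by about one 10° sector between successive 14 Hz pulses. A small illustrative sketch (in practice the rotation is synchronised to the beam; these numbers only show the orders of magnitude):

```python
# Illustrative cross-check of the target-wheel timing quoted in the text:
# 36 sectors, ~0.4 Hz rotation, 14 Hz beam pulses.
sectors = 36
wheel_freq = 0.4  # Hz, approximate rotation rate
pulse_rate = 14   # Hz, beam repetition rate

sector_angle = 360 / sectors                       # 10 degrees per sector
advance_per_pulse = 360 * wheel_freq / pulse_rate  # wheel rotation between pulses
print(f"{advance_per_pulse:.1f} deg between pulses "
      f"vs {sector_angle:.0f} deg per sector")     # ~10.3 deg vs 10 deg
```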

Under construction

Construction of the ESS has been growing in intensity since it began in 2014. The infrastructure work was organised differently from that of other large-scale scientific research facilities: a partnering collaboration agreement was set up with the main contractor (Skanska), with separate agreements on the design and target cost settled at the beginning of each stage of construction, making it a shared interest to build the facility within budget and on schedule.

Every year, up to 3000 researchers from all over the world are expected to carry out around 1000 experiments

Today, all the accelerator buildings have been handed over from the contractor to ESS. The ion source, where the protons are produced from hydrogen gas, was delivered from INFN in Catania at the end of 2017. After installation, testing and commissioning to nominal beam parameters, the ion source was inaugurated by the Swedish king and the Italian president in November 2018. Since then, the radio-frequency quadrupole and other accelerator components have been put into position in the accelerator tunnel, and the first prototype cryomodule has been cooled to 2 K. There is intense installation activity in the accelerator, where 5 km of radio-frequency waveguides are being mounted, 6000 cooling-water-pipe welds performed and 25,000 cables pulled. The target building is under construction, and has reached its full height of 31 m. The large target vacuum vessel is due to arrive from in-kind partner ESS Bilbao in Spain later this year, and the target wheel in early 2021.

The handover of buildings for the neutron instruments started in September 2019, with the hall of the long instruments along with the buildings housing associated laboratories and workshops. While basic infrastructure such as the neutron bunker and radiation shielding for the neutron guides are provided by ESS in Lund, European partner laboratories are heavily involved in the design and construction of the neutron instruments and the sample-environment equipment. ESS has developed its own detector and chopper technologies for the neutron instruments, and these are being deployed for a number of the instruments currently under construction. In parallel, the ESS Data Management and Software Centre, located in Copenhagen, Denmark, is managing the development of instrument control, data management and visualisation and analysis systems. During full operation, the ESS will produce scientific data at a rate of around 10 PB per year, while the complexity of the data-handling requirements for the different instruments and the need for real-time visualisation and processing add additional challenges.
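The quoted data volume implies a substantial sustained throughput. A back-of-envelope conversion (assuming 1 PB = 10¹⁵ bytes and perfectly continuous operation, which real, burstier running will not match):

```python
# Converting the quoted 10 PB/year of scientific data into an average rate.
bytes_per_year = 10e15                      # 10 PB, assuming 1 PB = 1e15 bytes
seconds_per_year = 365.25 * 24 * 3600
rate = bytes_per_year / seconds_per_year    # bytes per second
print(f"{rate / 1e6:.0f} MB/s")             # ~317 MB/s sustained
```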

A linac warm unit

The major upcoming milestones for the ESS project are beam-on-target, when first neutrons are produced, and first-science, when the first neutron-scattering experiments take place. According to current schedules, these milestones will be reached in October 2022 and July 2023, respectively. Although the beam power at the first-science milestone is expected to be around 100 kW, performance simulations indicate that the results of the first experiments will still have a high impact on the user community. The initiation of an open user programme, with three or more of the neutron instruments in operation, is expected in 2024, with further instruments becoming available in 2025. When the construction phase ends in late 2025, ESS is expected to be operating at 2 MW, and all 15 neutron instruments will be in operation or ready for hot-commissioning.

The ESS has been funded to provide a service to the scientific community for leading-edge research into materials properties. Every year, up to 3000 researchers from all over the world are expected to carry out around 1000 experiments there. Innovation in the design of the accelerator, the target system and its moderators, and in the key neutron technologies of the neutron instruments (neutron guides, detectors and choppers), ensure that the ESS will establish itself at the vanguard of scientific discovery and development well into the 21st century. Furthermore, provision has been made for the expansion of the ESS to provide a platform for leading-edge research into fundamental physics and as yet unidentified fields of research.

Muon-collider study initiated

A new international design study for a future muon collider began in July, following the recommendations of the 2020 update of the European strategy for particle physics (CERN Courier July/August 2020 p7). Initiated by the Large European Laboratory Directors Group, which exists to maximise co-operation in the planning, preparation and execution of future projects, the study will initially be hosted at CERN, and carried out in collaboration with international partners. Institutes can join by expressing their intent to collaborate via a Memorandum of Understanding. The goal of the study is to evaluate the feasibility of both the accelerator and its physics experiments (CERN Courier May/June 2020 p41). CERN’s Daniel Schulte has been appointed as interim project leader.
