HL-LHC civil engineering reaches completion


After five years of arduous and continuous activity, the main civil-engineering works for the High-Luminosity LHC project (HL-LHC) are on track to be completed by the end of the year. Approved in June 2016 and due to enter operation in 2029, the HL-LHC is a major upgrade that will extend the LHC’s discovery potential significantly. It relies on several innovative and challenging technologies, including new superconducting quadrupole magnets, compact crab cavities to rotate the beams at the collision points, and 80 m-long high-power superconducting links, among many others.

These new LHC accelerator components will be mostly integrated at Point 1 and Point 5 of the ring, where the two general-purpose detectors ATLAS and CMS are located, respectively. As such, the HL-LHC requires new, large civil-engineering structures at each site to house the services, technical infrastructure and accelerator equipment required to power, control and cool the machine’s new long-straight sections.

Connections

At each Point, the underground structures consist of a vertical shaft (80 m deep and 10 m in diameter) leading to a service cavern (16 m in diameter and 46 m long). A power-converter gallery (5.6 m in diameter and 300 m long), two service galleries (3.1 m in diameter and 54 m long), two radio-frequency galleries (5.8 m in diameter and 68 m long), as well as two short safety galleries, complete the underground layout. The connection to the LHC tunnel will be made via 12 vertical cores (1 m in diameter and 7 m deep), which will be drilled later and completed during Long Shutdown 3 after the removal of the existing LHC long-straight sections.

The surface structures consist of five buildings. Three are constructed from reinforced concrete to house noisy equipment such as helium compressors, cooling towers, water pumps, chillers and ventilation units. The other two buildings have steel-frame structures to house electrical distribution cabinets, a helium refrigerator cold-box and the shaft access system. The buildings are interconnected via buried technical galleries.

The HL-LHC civil-engineering project is based on four main contracts. Two consultancy service contracts are dedicated to the design and construction administration: Setec tpi-CSD-Rocksoil (ORIGIN) at Point 1 and Lombardi-Artelia-Pini (LAP) at Point 5. Two supply contracts are dedicated to the construction of both the underground and surface structures: Marti Tunnelbau – Marti Österreich – Marti Deutschland (JVMM) at Point 1 and Implenia Schweiz – Baresel – Implenia Construction (CIB) at Point 5.

In total, 92,000 m³ of spoil was excavated from the underground structures, while 30,000 m³ of concrete and 5000 tonnes of reinforcement steel were used to construct them. At Point 5, based on the experience of civil engineering for the CMS shaft, groundwater infiltration was expected to complicate the excavation of the HL-LHC shaft. A different execution methodology and a dry summer in 2018 made the task easier, although the discovery of unexpected hydrocarbon layers (not seen during the CMS works) added difficulties in the management of the polluted spoil. At Point 1, the expected quantity of hydrocarbon-polluted spoil was managed accordingly. The construction of the surface structures, meanwhile, required 6 km of anchor piles, 15,000 m³ of concrete, 1400 tonnes of reinforcement steel and 700 tonnes of steel frames.

Opportunities

“The two sites generated 120 jobs on average from 2018 to 2021, solely for companies in charge of civil-engineering construction,” says Luz Anastasia Lopez-Hernandez, head of the project-portfolio management group of the site and civil-engineering department.

Special care was taken to limit worksite nuisance for CERN’s neighbours. Truck wheels were systematically washed before leaving the worksites, and temporary buildings were erected on top of the shaft heads to limit the noise impact of the excavation work. The only complaint received during the construction period related to light pollution at Point 5, after which it was decided to limit worksite lighting at night to the minimum compatible with worker safety. As the excavation of the two shafts started in 2018 in parallel with LHC operation, special care was taken to limit vibration levels by using electrically driven road-header excavators.

The COVID-19 pandemic, which, among other things, required the two worksites to be closed for several weeks in 2020, caused a delay of one-to-two months with respect to the initial construction schedule. The Russian Federation’s invasion of Ukraine also impacted activities this year by delaying some deliveries.

“The next step is to equip these new structures with their technical infrastructures before the next long shutdown, which will be dedicated to the installation of the accelerator equipment,” says Laurent Tavian, work-package leader of the HL-LHC infrastructure, logistics and civil engineering.

Taking plasma accelerators to market

In 1997, physics undergraduate Manuel Hegelich attended a lecture by a visiting professor that would change the course of his career. A new generation of ultra-short-pulse lasers had opened the possibility to accelerate particles to high energies using high-power lasers, a concept first developed in the late 1970s. “It completely captured my passion,” says Hegelich. “I understood the incredible promise for research and industrial advancement if we could make this technology accessible to the masses.” 

Twenty-five years later, Hegelich founded TAU Systems to do just that. In September the US-based firm secured a $15 million investment to build a commercial laser-driven particle accelerator. The target application is X-ray free-electron lasers (XFELs), only a handful of which exist worldwide due to the need for large radio-frequency linacs to accelerate electrons. Laser-driven acceleration could drastically reduce the size and cost of XFELs, says Hegelich, and offers many other applications such as medical imaging. 

Beam time

“As a commercial customer it is difficult to get time on the European XFEL at DESY or the LCLS at SLAC, but these are absolutely fantastic machines that show you biological and chemical interactions that you can’t see in any other way,” he explains. “TAU Systems’ business model is two-pronged: we will offer beam time, data acquisition and analysis as a full-service supplier as well as complete laser-driven accelerators and XFEL systems for sale to, among others, pharma and biotech, battery and solar technology, and other material-science-driven markets.”  

Laser-driven accelerators begin by firing an intense laser pulse at a gas target to excite plasma waves, upon which charged particles can “surf” and gain energy. Researchers worldwide have been pursuing the idea for more than two decades, demonstrating impressive accelerating gradients. CERN’s AWAKE experiment, meanwhile, is exploring the use of proton-driven plasmas that would enable even greater gradients. The challenge is to be able to extract a stable and reliable beam that is useful for applications.

Hegelich began studying the interaction between ultra-intense electromagnetic fields and matter during his PhD at Ludwig Maximilian University in Munich. In 2002 he moved to Los Alamos National Laboratory, where he ended up leading its laser-acceleration group. A decade later, the University of Texas at Austin invited him to head up a group there. Hegelich has been on unpaid leave of absence since last year to focus on his company, which currently numbers 14 employees and rising. “We have got to a point where we think we can make a product rather than an experiment,” he explains.

The breakthrough was to inject the gas target with nanoparticles with the right properties at the right time, so as to seed the wakefield sooner and thus enable a larger portion of the wave to be exploited. The resulting electron beam contains so much charge that it drives its own wave, capable of accelerating electrons to 10 GeV over a distance of just 10 cm, explains Hegelich. “The whole community has been chasing 10 GeV for a very long time, because if you ever wanted to build a big collider, or drive an XFEL, you’d need to put together 10 GeV acceleration stages. While gains were theorised, we saw something that was so much more powerful than what we were hoping for. Sometimes it’s better to be lucky than to be good!”

Hegelich says he was also lucky to attract an investor, German internet entrepreneur Lukasz Gadowski, so soon after he started looking last summer. “This is hardware development: it takes a lot of capital just to get going. Lukasz and I met by accident when I was consulting on a totally different topic. He has invested $15 million and is very interested in the technical side.” 

TAU Systems (the name comes from the symbol used for the laser pulse duration) aims to offer its first products for sale in 2024, have an XFEL service centre operational by 2026 and start selling full XFEL systems by 2027. Improving beam stability will remain the short-term focus, says Hegelich. “At Texas we have a laser system that shoots once per hour or so, with no feedback loop, so sometimes you get a great shot and most of the time you don’t. But we have done some experiments in other regimes with smaller lasers, and other groups have done remarkable work here and shown that it is possible to run for three days straight. Now that we have this company, I can hire actual engineers and programmers – a luxury I simply didn’t have as a university professor.”

He also doesn’t rule out more fundamental applications such as high-energy physics. “I am not going to say that we will replace a collider with a laser, although if things take off and if there is a multibillion-dollar project, then you never know.”

IPAC back in full force

IPAC’22

The 13th International Particle Accelerator Conference (IPAC’22), which took place in Bangkok from 12 to 17 June, marked the return to an in-person event after two years of online editions due to the COVID-19 pandemic. Hosted by the Synchrotron Light Research Institute, it was the first time that Thailand had hosted an IPAC conference, with around 800 scientists, engineers, technicians, students and industrial partners from 37 countries in attendance. The atmosphere was understandably electric: energy and enthusiasm filled the rooms as delegates had the chance to meet colleagues and friends from around the world.

The conference began with a blessing from Princess Maha Chakri Sirindhorn, who attended the two opening plenary sessions. The scientific programme included excellent invited and contributed talks, as well as outstanding posters, highlighting scientific achievements worldwide. Among them were the precise measurement of the muon’s anomalous magnetic dipole moment (g-2) at Fermilab, and the analysis at synchrotron light sources of soil samples obtained from near-Earth asteroid 162173 Ryugu by the Hayabusa2 space mission, which gave a glimpse into the origin of the Solar System.

In total, 88 invited and contributed talks on a wide array of accelerator-related topics were presented. These covered updates on new collider projects such as the Electron-Ion Collider (EIC), proposed colliders (FCC, ILC and CEPC), upgrade plans for existing facilities such as BEPCII and SuperKEKB, and new photon-source projects such as NanoTerasu and Siam Photon Source II. A talk about the power efficiency of accelerators drew a lot of attention given increasing global concern about sustainability. Accelerator-based radiotherapy continued to be the main topic in the accelerator-application category, with a special focus on designing an affordable and low-maintenance linac for deployment in low- and middle-income countries and other challenging environments (CERN Courier January/February 2022 p30).

Raffaella Geometrante (KYMA) hosted a popular industry session on accelerator technology. Completely revamped from past editions, its aim was to substantially improve the dynamics between laboratories and industry, while also addressing other topics on accelerator innovations and disruptive technologies. 

An engaging outreach talk “Looking into the past with photons” highlighted how synchrotron radiation has become an indispensable tool in archaeological and paleontological research, enabling investigations of the relationship between past civilisations in different corners of the world. A reception held during an evening boat cruise along the Chao Phraya River took participants past majestic palaces and historic temples against a backdrop of traditional Thai music and performances.

IPAC’22 was a successful and memorable conference, seen as a symbol of our return to normal scientific activities and face-to-face interaction. It was also one of the most difficult IPAC conferences to organise, with pandemic-related restrictions prohibiting or impeding participation from several regions, particularly China and Taiwan, as the world begins to recover from the worst public-health crisis in a century. It was noted in the opening session that many breakthroughs in combating the coronavirus pandemic were achieved with the use of particle accelerators: the molecular structure of the virus, essential information for subsequent rational drug design, was solved at synchrotron light sources.

A word from FCC Week

FCC Week

More than 500 participants from over 30 countries attended the annual meeting of the Future Circular Collider (FCC) collaboration, which is pursuing a feasibility study for a visionary post-LHC research infrastructure at CERN. Organised as a hybrid event at Sorbonne University in Paris from 30 May to 3 June, the event demonstrated the significant recent progress en route to the completion of the feasibility study in 2025, and the technological and scientific opportunities on offer.

In their welcome talks, Ursula Bassler (CNRS) and Philippe Chomaz (CEA), chair of the FCC collaboration board, stressed France’s long-standing participation in CERN and reaffirmed the support of French physicists and laboratories in the different areas of the FCC project. CERN Director-General Fabiola Gianotti noted that the electron–positron stage, FCC-ee, could begin operations within a few years of the end of the HL-LHC – a crucial step in keeping the community engaged across different generations – while the full FCC programme would offer 100 years of trailblazing physics at both the energy and intensity frontiers. Beyond its outstanding scientific case, FCC requires coordinated R&D in many domains, such as instrumentation and engineering, raising opportunities for young generations to contribute with fresh ideas. These messages echoed those in other opening talks, in particular by Jean-Eric Paquet, director for research and innovation at the European Commission, who highlighted FCC’s role as a world-scale research infrastructure that will allow Europe to maintain its leadership in fundamental research.

A new era

Ten years after the discovery of the Higgs boson, the ATLAS and CMS collaborations continue to establish its properties and interactions with other particles. The discovery of the Higgs boson completes the Standard Model but leaves many questions unanswered; a new era of exploration has opened that requires a blend of large leaps in precision, sensitivity and eventually energy. Theorist Christophe Grojean (DESY) described how the diverse FCC research programme (CERN Courier May/June 2022 p23) offers an extensive set of measurements at the electroweak scale, the widest exploratory potential for new physics, and the potential to address outstanding questions such as the nature of dark matter and the origin of the cosmic matter–antimatter asymmetry.

In recent months, teams from CERN have worked closely with external consultants and CERN’s host states to develop a new FCC layout and placement scenario (CERN Courier May/June 2022 p27). Key elements include the effective use of the European electricity grid and the launch of heat-recovery projects serving cooling, agriculture and industrial uses, as well as cutting-edge data connections to rural areas. Parallel sessions at FCC Week focused on the design of FCC-ee, which offers a high-luminosity Higgs and electroweak factory. Tor Raubenheimer (SLAC) showed it to be the most efficient lepton collider for energies up to the top-quark mass threshold and highlighted its complementarity to a future FCC-hh. Profiting from the FCC-ee’s high technological readiness, ongoing R&D efforts aim to maximise the efficiency and performance while optimising its environmental impact and operational costs. Many sessions were dedicated to detector development, where the breadth of new results showed that the FCC-ee is much more than a scaled-up version of LEP. It would offer unprecedented precision on Higgs couplings, electroweak and flavour variables, the top-quark mass, and the strong coupling constant, with ample discovery potential for feebly interacting particles. Participants also heard about the FCC-ee’s unique ability in ultra-precise centre-of-mass energy measurements, and the need for new beam-stabilisation and feedback systems.

High-temperature superconductor (HTS) magnets are among key FCC-ee technologies under consideration for improved energy efficiency, also offering significant potential societal impact. They could be deployed in the FCC-ee final-focus sections, around the positron-production target, and even in the collider arcs. Another major focus is ensuring that the 92 km-circumference machine’s arc cells are effective, reliable and easy to maintain, with a complete arc half-cell mockup planned to be constructed by 2025. The exploration of existing and alternative technologies for FCC-ee is supported by two recently approved projects: the Swiss accelerator R&D programme CHART, and the EU-funded FCCIS design study. The online software requirements for FCC-ee are dominated by an expected physics event rate of ~200 kHz when running at the Z pole. Trigger and data acquisition systems sustaining comparable data rates are already being developed for the HL-LHC, serving as powerful starting points for FCC-ee.

Looking to the future

Finally, participants reviewed ongoing activities toward FCC-hh, an energy-frontier 100 TeV proton–proton collider to follow FCC-ee by exploiting the same infrastructure. FCC-hh studies complement those for FCC-ee, including the organisation of CERN’s high-field magnet R&D programme and the work of the FCC global conductor-development programme. In addition, alternative HTS technologies that could reach higher magnetic fields and higher energies while reducing energy consumption are being explored for FCC’s energy-frontier stage. The challenges of building and operating this new infrastructure and the benefits that can be expected for society and European industry were also discussed during a public event under the auspices of the French Physical Society.

The FCC programme builds on the large, stable global community that has existed for more than 30 years at CERN and in other laboratories worldwide. The results presented during FCC Week 2022 and ongoing R&D activities will inspire generations of students to learn and grow. Participants from diverse fields and the high number of junior researchers who joined the meeting underline the attractiveness of the project. Robust international participation and long-term commitment to deliver ambitious projects are key for the next steps in the FCC feasibility study.

Tracing molecules at the vacuum frontier

Thermal-radiation calculation

In particle accelerators, large vacuum systems guarantee that the beams travel as freely as possible. Despite a density one 25-trillionth that of Earth’s atmosphere, however, a tiny concentration of gas molecules remains. These pose a problem: their collisions with accelerated particles reduce the beam lifetime and induce instabilities. It is therefore vital, from the early design stage, to plan efficient vacuum systems and predict residual pressure profiles.

Surprisingly, it is almost impossible to find commercial software that can carry out the underlying vacuum calculations. Since the background pressure in accelerators (of the order of 10⁻⁹ to 10⁻¹² mbar) is so low, molecules rarely collide with one another, so the results of codes based on computational fluid dynamics are not valid. Although workarounds exist (solving vacuum equations analytically, modelling a vacuum system as an electrical circuit, or taking advantage of similarities between ultra-high vacuum and thermal radiation), a CERN-developed simulator called Molflow, for molecular flow, has become the de facto industry standard for ultra-high-vacuum simulations.
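
One of those workarounds, the electrical-circuit analogy, is simple enough to sketch: discretise a pipe into segments, treat the conductance between segments like an electrical conductance, outgassing like a current source and pumps like resistors to ground, then solve the resulting linear system for the pressures. The sketch below (all parameter values illustrative, not taken from any real machine) does this with the standard tridiagonal Thomas algorithm.

```python
# Electrical-circuit analogy for a uniformly outgassing tube pumped at
# both ends. Node balance per segment: conductance flows in, outgassing
# in, pumping out -- a tridiagonal linear system, like a resistor ladder.

def pressure_profile(n_seg, c, q, s):
    """Pressure [mbar] in each segment of an outgassing tube.

    n_seg -- number of tube segments
    c     -- conductance between adjacent segments [l/s]
    q     -- outgassing into each segment [mbar*l/s]
    s     -- pumping speed at both ends [l/s]
    """
    # Diagonal: conductances to existing neighbours, plus pumps at ends.
    diag = [c * ((i > 0) + (i < n_seg - 1)) for i in range(n_seg)]
    diag[0] += s
    diag[-1] += s
    rhs = [q] * n_seg          # gas sources

    # Thomas algorithm; both off-diagonals are -c.
    cp = [0.0] * n_seg         # modified upper diagonal
    dp = [0.0] * n_seg         # modified right-hand side
    cp[0] = -c / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n_seg):
        denom = diag[i] + c * cp[i - 1]
        cp[i] = -c / denom
        dp[i] = (rhs[i] + c * dp[i - 1]) / denom
    p = [0.0] * n_seg
    p[-1] = dp[-1]
    for i in range(n_seg - 2, -1, -1):
        p[i] = dp[i] - cp[i] * p[i + 1]
    return p

profile = pressure_profile(11, 10.0, 1e-9, 100.0)
```

The result is the expected parabola-like profile, peaking mid-tube, with the pumps removing exactly the total outgassed load.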

Instead of trying to analytically solve the surprisingly difficult gas behaviour over a large system in one step, Molflow is based on the so-called test-particle Monte Carlo method. In a nutshell: if the geometry is known, a single test particle is created at a gas source and “bounced” through the system until it reaches a pump. Then, repeating this millions of times, with each bounce happening in a random direction, just like in the real world, the program can calculate the hit-density anywhere, from which the pressure is obtained.
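
A deliberately stripped-down illustration of that recipe, assuming a toy one-dimensional tube rather than Molflow's full 3D ray tracing: each wall bounce moves the molecule one segment towards either end with equal probability, a molecule leaving through the far end counts as pumped, and one leaving through the entrance returns to the gas source.

```python
import random

# Toy test-particle Monte Carlo: molecules enter a tube of n_segments,
# random-walk between wall bounces, and are absorbed at the pump end.
# Counting hits per segment gives a relative hit density, from which
# the real code derives pressure.

def trace_molecules(n_molecules, n_segments, seed=42):
    rng = random.Random(seed)      # fixed seed for reproducibility
    hits = [0] * n_segments
    pumped = 0
    for _ in range(n_molecules):
        x = 0                      # molecule enters at the source end
        while True:
            hits[x] += 1           # record the wall hit
            x += rng.choice((-1, 1))   # random bounce direction
            if x < 0:              # back out through the entrance
                break
            if x >= n_segments:    # reached the pump: absorbed
                pumped += 1
                break
    return hits, pumped / n_molecules

hits, transmission = trace_molecules(20000, 10)
```

Even this crude model reproduces the qualitative behaviour: the hit density falls off towards the pump, and only a small fraction of molecules make it through the tube.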

The idea for Molflow emerged in 1988 when the author (RK) visited CERN to discuss the design of the Elettra light source with CERN vacuum experts (see “From CERN to Elettra, ESRF, ITER and back” panel). Back then, few people could have foreseen the numerous applications outside particle physics that it would have. Today, Molflow is used in applications ranging from chip manufacturing to the exploration of the Martian surface, with more than 1000 users worldwide and many more downloads from the dedicated website.

Molflow in space 

While at CERN we naturally associate ultra-high vacuum with particle accelerators, there is another domain where operating pressures are extremely low: space. In 2017, after first meeting at a conference, a group from German satellite manufacturer OHB visited the CERN vacuum group, interested to see our chemistry lab and the cleaning process applied to vacuum components. We also demoed Molflow for vacuum simulations. It turned out that they were actively looking for a modelling tool that could simulate specific molecular-contamination transport phenomena for their satellites, since the industrial code they were using had very limited capabilities and was not open-source. 

A high-quality, clean mirror for a space telescope, for example, must spend up to two weeks encapsulated in the closed fairing from launch until it is deployed in orbit. During this time, without careful prediction and mitigation, certain volatile compounds (such as adhesive used on heating elements) present within the spacecraft can find their way to and become deposited on optical elements, reducing their reflectivity and performance. It is therefore necessary to calculate the probability that molecules migrate from a certain location, through several bounces, and end up on optical components. Whereas this is straightforward when all simulation parameters are static, adding chemical processes and molecule accumulation on surfaces required custom development. Even though Molflow could not handle these processes “out of the box”, the OHB team was able to use it as a basis that could be built on, saving the effort of creating the graphical user interface and the ray-tracing parts from scratch. With the help of CERN’s knowledge-transfer team, a collaboration was established with the Technical University of Munich: a “fork” in the code was created; new physical processes specific to their application were added; and the code was also adapted to run on computer clusters. The work was made publicly available in 2018, when Molflow became open source.

From CERN to Elettra, ESRF, ITER and back

Molflow simulation

Molflow emerged in 1988 during a visit to CERN from its original author (RK), who was working at the Elettra light source in Trieste at the time. CERN vacuum expert Alberto Pace showed him a computer code written in Fortran that enabled the trajectories of particles to be calculated, via a technique called ray tracing. On returning to Trieste, and realising that the CERN code couldn’t be run there due to hardware and software incompatibilities, RK decided to rewrite it from scratch. Three years later the code was formally released. Once more, credit must be given to CERN for having been the birthplace of new ideas for other laboratories to develop their own applications.

Molflow was originally written in Turbo Pascal, had (black and white) graphics, and visualised geometries in 3D – even allowing basic geometry editing and pressure plots. While today such features are found in every simulator, at the time the code stood out and was used in the design of several accelerator facilities, including the Diamond Light Source, Spallation Neutron Source, Elettra, Alba and others – as well as for the analysis of a gas-jet experiment for the PANDA experiment at GSI Darmstadt. That said, the early code had its limitations. For example, the upper limit of user memory (640 kB for MS-DOS) significantly limited the number of polygons used to describe the geometry, and it was single-processor. 

In 2007 the original code was given a new lease of life at the European Synchrotron Radiation Facility in Grenoble, where RK had moved as head of the vacuum group. Ported to C++, multi-processor capability was added, which is particularly suitable for Monte Carlo calculations: if you have eight CPU cores, for example, you can trace eight molecules at the same time. OpenGL (Open Graphics Library) acceleration made the visualisation very fast even for large structures, allowing the usual camera controls of CAD editors to be added. Between 2009 and 2011 Molflow was used at ITER, again following its original author, for the design and analysis of vacuum components for the international tokamak project.
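
Monte Carlo tracing parallelises trivially, which is why that port paid off: each worker gets its own share of the test particles and its own random seed, and the per-worker counters are merged at the end. The sketch below uses a Python thread pool purely to show the chunk-and-merge structure; Molflow itself dedicates one worker per core, and the per-particle work here is a stand-in, not a real trace.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def worker(n_particles, seed):
    """Trace a chunk of test particles; here each 'trace' is reduced to
    a coin flip standing in for a full bounce sequence ending at a pump."""
    rng = random.Random(seed)
    pumped = 0
    for _ in range(n_particles):
        if rng.random() < 0.1:     # stand-in capture probability
            pumped += 1
    return pumped

def run_parallel(total, n_workers=8):
    chunk = total // n_workers
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        # Distinct seeds keep the workers' random streams independent.
        return sum(pool.map(worker, [chunk] * n_workers, range(n_workers)))

pumped = run_parallel(80000)
```

Because the workers never communicate mid-run, eight cores really do give close to eight times the throughput in a compiled implementation.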

In 2012 the project was resumed at CERN, where RK had arrived the previous year. From here, the focus was on expanding the physics and applications: ray-tracing terms like “hit density” and “capture probability” were replaced with real-world quantities such as pressure and pumping speed. To publish the code within the group, a website was created with downloads, tutorial videos and a user forum. Later that year, a sister code “Synrad” for synchrotron-radiation calculations, also written in Trieste in the 1990s, was ported to the modern environment. The two codes could, for the first time, be used as a package: first, a synchrotron-radiation simulation could determine where light hits a vacuum chamber, then the results could be imported to a subsequent vacuum simulation to trace the gas desorbed from the chamber walls. This is the so-called photon-stimulated desorption effect, which is a major hindrance to many accelerators, including the LHC.

Molflow and Synrad have been downloaded more than 1000 times in the past year alone, and anonymous user metrics hint at around 500 users who launch it at least once per month. The code is used by far the most in China, followed by the US, Germany and Japan. Switzerland, including users at CERN, places only fifth. Since 2018, the roughly 35,000-line code has been available open-source and, although originally written for Windows, it is now available for other operating systems, including the new ARM-based Macs and several versions of Linux.

One year later, the Contamination Control Engineering (CCE) team from NASA’s Jet Propulsion Laboratory (JPL) in California reached out to CERN in the context of its three-stage Mars 2020 mission. The Mars 2020 Perseverance Rover, built to search for signs of ancient microbial life, successfully landed on the Martian surface in February 2021 and has collected and cached samples in sealed tubes. A second mission plans to retrieve the cache canister and launch it into Mars orbit, while a third would locate and capture the orbital sample and return it to Earth. Each spacecraft experiences and contributes to its own contamination environment through thruster operations, material outgassing and other processes. JPL’s CCE team performs the identification, quantification and mitigation of such contaminants, from the concept-generation to the end-of-mission phase. Key to this effort is the computational physics modelling of contaminant transport from materials outgassing, venting, leakage and thruster plume effects.

Contamination consists of two types: molecular (thin-film deposition effects) and particulate (producing obscuration, optical scatter, erosion or mechanical damage). Both can lead to degradation of optical properties and spurious chemical composition measurements. As more sensitive space missions are proposed and built – particularly those that aim to detect life – understanding and controlling outgassing properties requires novel approaches to operating thermal vacuum chambers. 

Just like accelerator components, most spacecraft hardware undergoes long-duration vacuum baking at relatively high temperatures to reduce outgassing. Outgassing rates are verified with quartz crystal microbalances (QCMs), rather than vacuum gauges as used at CERN. These probes measure the resonance frequency of oscillation, which is affected by the accumulation of adsorbed molecules, and are very sensitive: a 1 ng deposition on 1 cm2 of surface de-tunes the resonance frequency by 2 Hz. By performing free-molecular transport simulations in the vacuum-chamber test environment, measurements by the QCMs can be translated to outgassing rates of the sources, which are located some distance from the probes. For these calculations, JPL currently uses both Monte Carlo schemes (via Molflow) and “view factor matrix” calculations (through in-house solvers). During one successful Molflow application (see “Molflow in space” image, top) a vacuum chamber with a heated inner shroud was simulated, and optimisation of the chamber geometry resulted in a factor-40 increase of transmission to the QCMs over the baseline configuration. 
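
That quoted sensitivity, 2 Hz per ng/cm², turns a measured frequency drift directly into a deposition rate, and a simulated transmission factor then gives the source outgassing rate. In the sketch below only the 2 Hz per ng/cm² figure comes from the text; the 36 Hz drift and the 1% transmission are invented illustrative values.

```python
# QCM arithmetic: frequency drift -> areal deposition rate -> source
# outgassing rate, given a transmission factor from a molecular-flow
# simulation.

HZ_PER_NG_CM2 = 2.0    # sensitivity quoted in the text

def deposition_rate(df_hz, dt_s):
    """Areal deposition rate [ng/cm^2 per s] from a frequency drift."""
    return (df_hz / HZ_PER_NG_CM2) / dt_s

def source_outgassing_rate(df_hz, dt_s, transmission):
    """Back out the source rate from what the crystal actually collects."""
    return deposition_rate(df_hz, dt_s) / transmission

# A 36 Hz drift over one hour, with 1% of outgassed molecules reaching
# the crystal (both numbers illustrative):
rate = source_outgassing_rate(36.0, 3600.0, 0.01)
```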

From SPHEREx to LISA

Another JPL project involving free molecular-flow simulations is the future near-infrared space observatory SPHEREx (Spectro-Photometer for the History of the Universe and Ices Explorer). This instrument has cryogenically cooled optical surfaces that may condense molecules in vacuum and are thus prone to significant performance degradation from the accumulation of contaminants, including water. Even when taking as much care as possible during the design and preparation of the systems, some elements, such as water, cannot be entirely removed from a spacecraft and will desorb from materials persistently. It is therefore vital to know where and how much contamination will accumulate. For SPHEREx, water outgassing, molecular transport and adsorption were modelled using Molflow against internal thermal predictions, enabling a decontamination strategy to keep its optics free from performance-degrading accumulation (see “Molflow in space” image, left). Molflow has also complemented other NASA JPL codes to estimate the return flux (whereby gas particles desorbing from a spacecraft return to it after collisions with a planet’s atmosphere) during a series of planned fly-bys around Jupiter’s moon Europa. For such exospheric sampling missions, it is important to distinguish the actual collected sample from return-flux contaminants that originated from the spacecraft but ended up being collected due to atmospheric rebounds.

Vacuum-chamber simulation for NASA

It is the ability to import large, complex geometries (through a triangulated file format called STL, used in 3D printing and supported by most CAD software) that makes Molflow usable for JPL’s molecular-transport problems. In fact, the JPL team “boosted” our codes with external post-processing: instead of relying on the built-in visualisation, they parsed the output file format to extract pressure data on individual facets (polygons representing surface cells), and sometimes even changed input parameters programmatically – once again working directly on Molflow’s own file format. They also made a few feature requests, such as adding histograms of how many times molecules bounce before adsorption, or of the total distance or time they travel before being adsorbed on the surfaces. These were straightforward to implement, and because JPL’s scientific interests also matched those of CERN users, such additions are now available to everyone in the public versions of the code. Similar requests have come from experiments employing short-lived radioactive beams, such as those generated at CERN’s ISOLDE beamlines. Last year, despite COVID-related restrictions, the JPL team managed to visit CERN. During the visit, which included a tour of the site and the chemistry laboratory, the team gave a seminar to our vacuum group about contamination control at JPL, and we presented the outlook for Molflow developments.
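The principle behind such codes is test-particle Monte Carlo ray tracing with diffuse (cosine-law) re-emission from surfaces. As a minimal, self-contained sketch of that idea – and of the bounce histograms mentioned above – the following estimates the transmission probability of a cylindrical tube in free-molecular flow. All names and parameters are our own illustrations, not Molflow's API.

```python
import math, random
from collections import Counter

def _cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def _unit(v):
    n = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0]/n, v[1]/n, v[2]/n)

def cosine_direction(normal):
    """Diffuse (cosine-law) emission direction about a unit surface normal."""
    u1, u2 = random.random(), random.random()
    ct, st = math.sqrt(u1), math.sqrt(1.0 - u1)   # cos(theta) = sqrt(U)
    phi = 2.0 * math.pi * u2
    ref = (0.0, 0.0, 1.0) if abs(normal[2]) < 0.9 else (1.0, 0.0, 0.0)
    t1 = _unit(_cross(normal, ref))
    t2 = _cross(normal, t1)
    return tuple(st*math.cos(phi)*t1[i] + st*math.sin(phi)*t2[i] + ct*normal[i]
                 for i in range(3))

def tube_transmission(radius, length, n_particles=20000, seed=1):
    """Monte Carlo transmission probability of a tube, plus a histogram of
    how many wall bounces each transmitted molecule made."""
    random.seed(seed)
    hist, transmitted = Counter(), 0
    for _ in range(n_particles):
        r, a = radius*math.sqrt(random.random()), 2.0*math.pi*random.random()
        x, y, z = r*math.cos(a), r*math.sin(a), 0.0   # uniform over entrance disk
        d = cosine_direction((0.0, 0.0, 1.0))
        bounces = 0
        while True:
            # distance to the cylinder wall along d: |(x,y) + t(dx,dy)| = radius
            A, B = d[0]**2 + d[1]**2, x*d[0] + y*d[1]
            C = x*x + y*y - radius*radius
            t_wall = math.inf
            if A > 1e-12:
                disc = B*B - A*C
                if disc > 0.0:
                    t = (-B + math.sqrt(disc)) / A
                    if t > 1e-9:
                        t_wall = t
            t_exit = (length - z)/d[2] if d[2] > 0 else math.inf
            t_back = -z/d[2] if d[2] < 0 else math.inf
            t = min(t_wall, t_exit, t_back)
            x, y, z = x + t*d[0], y + t*d[1], z + t*d[2]
            if t == t_exit:                # escaped through the far opening
                transmitted += 1
                hist[bounces] += 1
                break
            if t == t_back:                # back-scattered out of the entrance
                break
            rho = math.hypot(x, y)         # diffuse re-emission from the wall
            d = cosine_direction((-x/rho, -y/rho, 0.0))
            bounces += 1
    return transmitted / n_particles, hist
```

For a tube with length equal to its radius, the result should approach the classic Clausing factor of about 0.67, with the zero-bounce bin of the histogram counting line-of-sight molecules.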

Our latest space-related collaboration, started in 2021, concerns the European Space Agency’s LISA mission, a future gravitational-wave interferometer in space (see CERN Courier September/October 2022 p51). Molflow is being used to analyse data from the recently completed LISA Pathfinder mission, which explored the feasibility of keeping two test masses in gravitational free-fall and using them as inertial sensors by measuring their motion with extreme precision. Because the satellite’s sides have different temperatures, and because the gas sources are distributed asymmetrically around the masses, the outgassing differs between the two sides. Moreover, the gas molecules that reach the test mass are slightly faster on one side than the other, resulting in a net force and torque on the mass of the order of femtonewtons. When such precise inertial measurements are required, this phenomenon has to be quantified, along with other microscopic forces, such as the Brownian noise resulting from the random bounces of molecules on the test mass. To this end, Molflow is currently being modified to add molecular force calculations for LISA, along with relevant physical quantities such as the resulting noise and torque.
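A back-of-envelope kinetic-theory estimate shows how such tiny forces arise. In free-molecular flow, a wall feels an incident normal momentum flux of p/2, plus a recoil term from molecules re-emitted diffusely at the wall temperature; an asymmetry in gas density or temperature between two opposite faces then leaves a net force. The model and all numbers below are illustrative assumptions, not LISA's actual parameters or Molflow's implementation.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def face_pressure(n, t_gas, t_mass):
    """Effective pressure (Pa) on a wall in free-molecular flow: incident
    normal momentum flux p/2, plus (p/2)*sqrt(T_mass/T_gas) carried by
    molecules re-emitted diffusely at the wall temperature."""
    p = n * K_B * t_gas
    return 0.5 * p * (1.0 + math.sqrt(t_mass / t_gas))

def net_force(area, n1, t1, n2, t2, t_mass):
    """Net force (N) on a test mass whose two opposite faces see gas of
    different density and temperature, (n1, t1) and (n2, t2)."""
    return area * (face_pressure(n1, t1, t_mass) - face_pressure(n2, t2, t_mass))

# Illustrative numbers: 46 mm cubic test mass, ~1e-9 Pa residual gas,
# a 0.5 K temperature and 0.1% density asymmetry between the two sides
area = 0.046 ** 2
n = 1e-9 / (K_B * 293.0)          # number density from p = n * k_B * T
f = net_force(area, 1.001 * n, 293.5, n, 293.0, 293.0)  # of order femtonewtons
```

Note the equilibrium sanity check: with equal gas and wall temperatures on both sides the net force vanishes and each face simply feels the static pressure p.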

Sky’s the limit 

High-energy applications

Molflow has proven to be a versatile and effective computational tool for characterising free-molecular flow, having been adopted in space exploration and the aerospace sector, and it promises to continue to intertwine different fields of science in unexpected ways. Thanks to the ever-growing gaming industry, which uses ray tracing to render photorealistic scenes with multiple light sources, consumer-grade graphics cards have supported ray tracing in hardware since 2019. Although intended for gaming, they are programmable for generic purposes, including scientific applications. Simulating on graphics-processing units (GPUs) is much faster than on traditional CPUs, but it is also less precise: in the vacuum world, tiny imprecisions in the geometry can result in “leaks”, where some simulated particles cross internal walls. If this issue can be overcome, the speedup potential is huge: in-house testing carried out recently at CERN by PhD candidate Pascal Bahr demonstrated speedup factors of up to 300 on entry-level Nvidia graphics cards.

Our latest space-related collaboration concerns the European Space Agency’s LISA mission

Another planned Molflow feature is to include surface processes that change the simulation parameters dynamically. For example, some getter films gradually lose their pumping ability as they saturate with gas molecules. This saturation depends on the pumping speed itself, resulting in two parameters (pumping speed and molecular surface saturation) that depend on each other. The way around this is to perform the simulation in iterative time steps, which is straightforward to add but raises many numerical problems.
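The coupled update described above can be sketched as a simple explicit time-march: in each step the pumping speed is computed from the current saturation, the chamber pressure is updated from the gas balance, and the pumped throughput in turn advances the saturation. The model (a linear drop of pumping speed with saturation) and all numbers are illustrative assumptions, not Molflow's planned implementation.

```python
# Iterative time-stepping of a getter (NEG) film that saturates as it pumps.

def simulate_getter(q_outgas, s0, capacity, volume, dt, t_end):
    """March pressure p(t) and saturation theta(t) forward in time.
    q_outgas : gas load into the chamber (Pa*m^3/s)
    s0       : unsaturated pumping speed (m^3/s)
    capacity : total quantity of gas the film can absorb (Pa*m^3)
    volume   : chamber volume (m^3)
    Returns a list of (t, p, theta) samples."""
    p, theta, t = 0.0, 0.0, 0.0
    history = []
    while t < t_end:
        s = s0 * (1.0 - theta)              # pumping speed drops as the film saturates
        p += (q_outgas - s * p) / volume * dt   # gas balance in the chamber
        theta = min(1.0, theta + s * p * dt / capacity)  # pumped gas fills the film
        t += dt
        history.append((t, p, theta))
    return history

# Example: the pressure first settles near q/s0, then creeps up as theta grows
history = simulate_getter(1e-8, 0.1, 1e-6, 0.05, 0.1, 50.0)
```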

Finally, a much-requested feature is automation. The most recent versions of the code already allow scripting, that is, running batch jobs with physics parameters changed step-by-step between executions. Extending these automation capabilities, and adding export formats that allow easier post-processing with common tools (Matlab, Excel and popular Python libraries), would significantly increase usability. If the additions of GPU ray tracing and iterative simulations are successful, the resulting – much faster and more versatile – Molflow code will remain an important tool for predicting and optimising the complex vacuum systems of future colliders.

CERN and Canon demonstrate efficient klystron

E37117 klystron

The radio-frequency (RF) cavities that accelerate charged particles in machines like the LHC are powered by devices called klystrons. These electro-vacuum tubes, which amplify RF signals by converting an initial velocity modulation of a stream of electrons into an intensity modulation, produce RF power in a wide frequency range (from several hundred MHz to tens of GHz) and can be used in pulsed or continuous-wave mode to deliver RF power from hundreds of kW to hundreds of MW. The close connection between klystron performance and the power consumption of an accelerator has driven researchers at CERN to develop more efficient devices for current and future colliders. 

The efficiency of a klystron is the ratio between the RF power it generates and the electrical power delivered to it from the grid. Experience with many thousands of such devices over the past seven decades has established that at low frequency and moderate RF power levels (as required by the LHC), klystrons can deliver an efficiency of 60–65%; for pulsed, high-frequency, high-peak-power devices, efficiencies are about 40–45%. The efficiency of RF power production is a key element of an accelerator’s overall efficiency. Taking the proposed future electron–positron collider FCC-ee as an example, increasing klystron efficiency from 65 to 80% could save as much as 1 TWh of electrical power over a 10-year period. In addition, the reduced demands on electrical-power capacity, cooling and ventilation may further reduce the original investment cost.
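The scale of the saving follows directly from the efficiency definition: delivering the same RF power at a higher efficiency draws less wall-plug power. The sketch below reproduces the order of magnitude of the FCC-ee figure; the assumed RF power and running hours are our own illustrative inputs, not official FCC-ee parameters.

```python
# Wall-plug energy saved when klystron efficiency rises from eta_low to
# eta_high while delivering the same RF power.

def wall_plug_saving_twh(p_rf_mw, eta_low, eta_high, hours):
    """Energy saved (TWh) delivering p_rf_mw of RF power for `hours` hours.
    Wall-plug power is P_RF / eta, so the saving is P_RF*(1/eta_low - 1/eta_high)."""
    saved_mw = p_rf_mw * (1.0 / eta_low - 1.0 / eta_high)
    return saved_mw * hours / 1e6   # MWh -> TWh

# Assumed: ~100 MW of RF demand, 5000 h/year of operation for 10 years
saving = wall_plug_saving_twh(100.0, 0.65, 0.80, 5000 * 10)  # ≈ 1.4 TWh
```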

In 2013 the development of high-efficiency klystrons started at CERN within the Compact Linear Collider study as a means to reduce the global energy consumed by the proposed collider. Thanks to strong support from management, this evolved into a project within the CERN RF group. A small team of five people at CERN and Lancaster University, led by Igor Syratchev, developed accurate computer tools for klystron simulation and in-depth analysis of the beam dynamics, and used them to evaluate the effects that limit klystron efficiency. The team then proposed novel technological solutions (including new bunching methods and higher-order harmonic cavities) that can improve klystron efficiency by 10–30% compared with commercial analogues. These new technologies were applied to develop new high-efficiency klystrons for use in the high-luminosity LHC (HL-LHC), FCC-ee and the CERN X-band high-gradient facilities, as well as in medical and industrial accelerators. Some of the new tube designs are now undergoing prototyping in close collaboration between CERN and industry.

The first commercial prototype of a high-efficiency 8 MW X-band klystron developed at CERN was built and tested by Canon Electron Tubes and Devices in July this year. Delivering the expected power level with an efficiency of 53.3%, measured at the company’s factory in Japan, it is the first demonstration of the technological solutions developed at CERN, showing an efficiency increase of more than 10 percentage points compared with commercially available devices. In terms of RF power production, this translates into an overall increase of 25% for the same wall-plug power as the model currently working at CERN’s X-band facility. Later this year the klystron will arrive at CERN and replace Canon’s conventional 6 MW tube. The next project in progress aims to fabricate a high-efficiency version of the LHC klystron, which, if successful, could be used in the HL-LHC.

“These results give us confidence for the coming high-efficiency version of the LHC klystrons and for the development of FCC-ee,” says RF group leader Frank Gerigk. “It is also an excellent demonstration of the powerful collaboration between CERN and industry.” 

Exploring a laser-hybrid accelerator for radiotherapy

LhARA

A multidisciplinary team in the UK has received seed funding to investigate the feasibility of a new facility for ion-therapy research based on novel accelerator, instrumentation and computing technologies. At the core of the facility would be a laser-hybrid accelerator dubbed LhARA: a high-power pulsed laser striking a thin foil target would create a large flux of protons or ions, which are captured using strong-focusing electron–plasma lenses and then accelerated rapidly in a fixed-field alternating-gradient accelerator. Such a device, says the team, offers enormous clinical potential by providing more flexible, compact and cost-effective multi-ion sources.

High-energy X-rays are by far the most common radiotherapy tool, but recent decades have seen a growth in particle-beam radiotherapy. In contrast to X-rays, proton and ion beams can be manipulated to deliver radiation doses more precisely than conventional radiotherapy, sparing surrounding healthy tissue. Unfortunately, ion-treatment facilities remain few in number because they require large synchrotrons to accelerate the ions. The Proton-Ion Medical Machine Study undertaken at CERN during the late 1990s, for example, underpinned the CNAO (Italy) and MedAustron (Austria) treatment centres that helped propel Europe to the forefront of the field – work that is now being continued by CERN’s Next Ion Medical Machine Study (CERN Courier July/August 2021 p23).

“LhARA will greatly accelerate our understanding of how protons and ions interact and are effective in killing cancer cells, while simultaneously giving us experience in running a novel beam,” says LhARA biological science programme manager Jason Parsons of the University of Liverpool. “Together, the technology and the science will help us make a big step forward in optimising radiotherapy treatments for cancer patients.” 

A small number of laboratories in Europe already work on laser-driven sources for biomedical applications. The LhARA collaboration, which comprises physicists, biologists, clinicians and engineers, aims to build on this work to demonstrate the feasibility of capturing and manipulating the flux created in the laser–target interaction to provide a beam that can be accelerated rapidly to the desired energy. The laser-driven source offers the opportunity to capture intense, nanosecond-long pulses of protons and ions at an energy of 15 MeV, says the team – two orders of magnitude greater than that of conventional sources – allowing the space-charge limit on the instantaneous dose to be evaded.

In July, UK Research and Innovation granted £2 million over the next two years to deliver a conceptual design report for an Ion Therapy Research Facility (ITRF) centred around LhARA. The first goal is to demonstrate the feasibility of the laser-hybrid approach in a facility dedicated to biological research, after which the team will work with national and international partnerships to develop the clinical technique. While the programme carries significant technical risk, says LhARA co-spokesperson Kenneth Long from Imperial College London/STFC, it is justified by the high level of potential reward: “The multidisciplinary approach of the LhARA collaboration will place the ITRF at the forefront of the field, partnering with industry to pave the way for significantly enhanced access to state-of-the-art particle-beam therapy.”

First light beckons at SLAC’s LCLS-II

The LCLS undulator hall

An ambitious upgrade of the US’s flagship X-ray free-electron-laser facility – the Linac Coherent Light Source (LCLS) at SLAC in California – is nearing completion. Set for “first light” early next year, LCLS-II will deliver X-ray laser beams that are 10,000 times brighter than LCLS at repetition rates of up to a million pulses per second – generating more X-ray pulses in just a few hours than the current laser has delivered through the course of its 12-year operational lifetime. The cutting-edge physics of the new facility – underpinned by a cryogenically cooled superconducting radio-frequency (SRF) linac – will enable the two beams from LCLS and LCLS-II to work in tandem. This, in turn, will help researchers observe rare events that happen during chemical reactions and study delicate biological molecules at the atomic scale in their natural environments, as well as potentially shed light on exotic quantum phenomena with applications in next-generation quantum computing and communications systems. 

Successful delivery of the LCLS-II linac was made possible by a multi-centre collaborative effort involving US national and university laboratories. Following the 2014 decision to pursue an SRF-based machine, the collaboration carried out the design, assembly, testing, transportation and installation of a string of 37 SRF cryomodules (most of them more than 12 m long) in the SLAC tunnel. All told, this major undertaking necessitated the construction of forty 1.3 GHz SRF cryomodules (five of them spares) and three 3.9 GHz cryomodules (one spare) – with delivery of approximately one cryomodule per month from February 2019 until December 2020 allowing completion of the LCLS-II linac installation on schedule by November 2021.

This industrial-scale programme of works was shaped by a strategic commitment, early on in the LCLS-II design phase, to transfer, and ultimately iterate, the established SRF capabilities of the European XFEL in Hamburg into the core technology platform used for the LCLS-II SRF cryomodules. Put simply: it would not have been possible to complete the LCLS-II project, within cost and on schedule, without the sustained cooperation of the European XFEL consortium – in particular, colleagues at DESY, CEA Saclay and several other European laboratories as well as KEK – that generously shared their experiences and know-how. 

Better together 

These days, large-scale accelerator or detector projects are very much a collective endeavour. Not only is the sprawling scope of such projects beyond a single organisation, but the risks of overspend and slippage can greatly increase with a “do-it-on-your-own” strategy. When the LCLS-II project opted for an SRF technology pathway in 2014 to maximise laser performance, the logical next step was to build a broad-based coalition with other US Department of Energy (DOE) national laboratories and universities. In this case, SLAC, Fermilab, Jefferson Lab (JLab) and Cornell University contributed expertise for cryomodule production, while Argonne National Laboratory and Lawrence Berkeley National Laboratory managed delivery of the undulators and photoinjector for the project. For sure, the start-up time for LCLS-II would have increased significantly without this joint effort, extending the overall project by several years.

Superconducting accelerator

Each partner brought something unique to the LCLS-II collaboration. While SLAC was still a relative newcomer to SRF technologies, the lab had a management team that was familiar with building large-scale accelerators (following successful delivery of the LCLS). The priority for SLAC was therefore to scale up its small nucleus of SRF experts by recruiting experienced SRF technologists and engineers to the staff team. In contrast, the JLab team brought an established track record in the production of SRF cryomodules, having built its own machine, the Continuous Electron Beam Accelerator Facility (CEBAF), as well as cryomodules for the Spallation Neutron Source (SNS) linac at Oak Ridge National Laboratory in Tennessee. Cornell, too, came with a rich history in SRF R&D – capabilities that, in turn, helped to solidify the SRF cavity-preparation process for LCLS-II.

Finally, Fermilab had, at the time, recently built two cutting-edge cryomodules of the same style as that chosen for LCLS-II. To fabricate these modules, Fermilab worked closely with the team at DESY to set up the same type of production infrastructure used on the European XFEL. From that perspective, the required tooling and fixtures were all ready to go for the LCLS-II project. While Fermilab was the “designer of record” for the SRF cryomodule, with primary responsibility for delivering a working design to meet LCLS-II requirements, the realisation of an optimised technology platform was a team effort involving SRF experts from across the collaboration.

Collective problems, collective solutions 

While the European XFEL provided the template for the LCLS-II SRF cryomodule design, several key elements of the LCLS-II approach subsequently evolved to align with the continuous-wave (CW) operation requirements and the specifics of the SLAC tunnel. Success in tackling these technical challenges – across design, assembly, testing and transportation of the cryomodules – is testament to the strength of the LCLS-II collaboration and the collective efforts of the participating teams in the US and Europe.

Challenges are inevitable when developing new facilities at the limits of known technology

For one, the thermal performance specification of the SRF cavities exceeded the state-of-the-art and required development and industrialisation of the concept of nitrogen doping (a process in which SRF cavities are heat-treated in a nitrogen atmosphere to increase their cryogenic efficiency and, in turn, lower the overall operating costs of the linac). The nitrogen-doping technique was invented at Fermilab in 2012 but, prior to LCLS-II construction, had been used only in an R&D setting.

The priority was clear: to transfer the nitrogen-doping capability to LCLS-II’s industry partners, so that the cavity manufacturers could perform the necessary materials processing before final helium-vessel jacketing. During this knowledge transfer, it was found that nitrogen-doped cavities are particularly sensitive to the base niobium sheet material – something the collaboration only realised once the cavity vendors were in full production. This resulted in a number of changes to the heat-treatment temperature, depending on which material supplier was used and the specific properties of the niobium sheet deployed in different production runs. JLab, for its part, held the contract for the cavities and pulled out all the stops to ensure success.

SRF cryomodules

At the same time, the conversion from pulsed to CW operation necessitated a faster cooldown cycle for the SRF cavities, requiring several changes to the internal piping, a larger exhaust chimney on the helium vessel, as well as the addition of two new cryogenic valves per cryomodule. Also significant is the 0.5% slope in the longitudinal floor of the existing SLAC tunnel, which dictated careful attention to liquid-helium management in the cryomodules (with a separate two-phase line and liquid-level probes at both ends of every module). 

However, the biggest setback during LCLS-II construction involved the loss of beamline vacuum during cryomodule transport. Specifically, two cryomodules had their beamlines vented and required complete disassembly and rebuilding – resulting in a five-month moratorium on the shipping of completed cryomodules in the second half of 2019. It turned out that a small change to a coupler flange, thought at the time to be inconsequential, left the cold coupler assembly susceptible to resonances excited by transport. The result was a bellows tear that vented the beamline. Unfortunately, initial “road tests” with a similar, though not exactly identical, prototype cryomodule had not revealed this behaviour.

Such challenges are inevitable when developing new facilities at the limits of known technology. In the end, the problem was successfully addressed using the diverse talents of the collaboration to brainstorm solutions, with the available access ports allowing an elastomer wedge to be inserted to secure the vulnerable section. A key take-away here is the need for future projects to perform thorough transport analysis, verify the transport loads using mock-ups or dummy devices, and install adequate instrumentation to ensure granular data analysis before long-distance transport of mission-critical components. 

The last cryomodule from Fermilab

Upon completion of the assembly phase, all LCLS-II cryomodules were subsequently tested at either Fermilab or JLab, with one module tested at both locations to ensure reproducibility and consistency of results. For high-Q0 performance in nitrogen-doped cavities, cooldown flow rates of at least 30 g/s of liquid helium were found to give the best results, helping to expel magnetic flux that could otherwise be trapped in the cavity. Overall, cryomodule performance on the test stands exceeded specifications, with a total accelerating voltage per cryomodule of 158 MV (versus a specification of 128 MV) and an average Q0 of 3 × 10¹⁰ (versus a specification of 2.7 × 10¹⁰). Looking ahead, attention is already shifting to real-world cryomodule performance in the SLAC tunnel – something that was measured for the first time in 2022.

Transferable lessons

For all members of the collaboration working on the LCLS-II cryomodules, this challenging project holds many lessons. Most important is to build a strong team and use that strength to address problems in real-time as they arise. The mantra “we are all in this together” should be front-and-centre for any multi-institutional scientific endeavour – as it was in this case. Solutions need to be thought of in a more global sense, as the best answer might mean another collaborator taking more onto their plate. Collaboration implies true partnership and a working model very different to a transactional customer–vendor relationship.

From a planning perspective, it’s vital to ensure that the initial project cost and schedule are consistent with the technical challenges and preparedness of the infrastructure. Prototypes and pre-series production runs reduce risk and cost in the long term and should be part of the plan, but there must be sufficient time for data analysis and changes to be made after a prototype run in order for it to be useful. Time spent on detailed technical reviews is also time well spent. New designs of complex components need comprehensive oversight and review, and should be controlled by a team rather than a single individual, so that sign-off on any detailed design change is made by an informed collective.

LCLS-II science: capturing atoms and molecules in motion like never before

LCLS-II science

The strobe-like pulses of the LCLS, which produced its first light in April 2009, are just a few millionths of a billionth of a second long, and a billion times brighter than previous X-ray sources. This enables users from a wide range of fields to take crisp pictures of atomic motions, watch chemical reactions unfold, probe the properties of materials and explore fundamental processes in living things. LCLS-II will provide a major jump in capability – moving from 120 pulses per second to 1 million, enabling experiments that were previously impossible. The scientific community has identified six areas where the unique capabilities of LCLS-II will be essential for further scientific progress:

Nanoscale materials dynamics, heterogeneity and fluctuations 

Programmable trains of soft X-ray pulses at high rep rate will characterise spontaneous fluctuations and heterogeneities at the nanoscale across many decades, while coherent hard X-ray scattering will provide unprecedented spatial resolution of material structure, its evolution and relationship to functionality under operating conditions.

Fundamental energy and charge dynamics

High-repetition-rate soft X-rays will enable new techniques that will directly map charge distributions and reaction dynamics at the scale of molecules, while new nonlinear X-ray spectroscopies offer the potential to map quantum coherences in an element-specific way for the first time.

Catalysis and photocatalysis

Time-resolved, high-sensitivity, element-specific spectroscopy will provide the first direct view of charge dynamics and chemical processes at interfaces, characterise subtle conformational changes associated with charge accumulation, and capture rare chemical events in operating catalytic systems across multiple time and length scales – all of which are essential for designing new, more efficient systems for chemical transformation and solar-energy conversion.

Emergent phenomena in quantum materials

Fully coherent X-rays will enable new high-resolution spectroscopy techniques to map the collective excitations that define these new materials in unprecedented detail. Ultrashort X-ray pulses and optical fields will facilitate new methods for manipulating charge, spin and phonon modes to both advance fundamental understanding and point the way to new approaches for materials control.

Revealing biological function in real time

The high repetition rate of LCLS-II will provide a unique capability to follow the dynamics of macromolecules and interacting complexes in real time and in native environments. Advanced solution-scattering and coherent imaging techniques will characterise the conformational dynamics of heterogeneous ensembles of macromolecules, while the ability to generate “two-colour” hard X-ray pulses will resolve atomic-scale structural dynamics of biochemical processes that are often the first step leading to larger-scale protein motions.

Matter in extreme environments

The capability of LCLS-II to generate soft and hard X-ray pulses simultaneously will enable the creation and observation of extreme conditions that are far beyond our present reach, with the latter allowing the characterisation of unknown structural phases. Unprecedented spatial and temporal resolution will enable direct comparison with theoretical models relevant for inertial-confinement fusion and planetary science.

Work planning and control is another essential element for success and safety. This idea needs to be built into the “manufacturing system”, including into the cost and schedule, and to be part of each individual’s daily checklist. No one disagrees with this concept, but good intentions on their own will not suffice. As such, required safety documentation should be clear and unambiguous, and be reviewed by people with relevant expertise. Production data and documentation need to be collected, made easily available to the entire project team, and analysed regularly for trends, both positive and negative. 

Supply chain, of course, is critical in any production environment – and LCLS-II is no exception. When possible, it is best to have parts procured, inspected, accepted and on-the-shelf before production begins, thereby eliminating possible workflow delays. Pre-stocking also allows adequate time to recycle and replace parts that do not meet project specifications. Also worth noting is that it’s often the smaller components – such as bellows, feedthroughs and copper-plated elements – that drive workflow slowdowns. A key insight from LCLS-II is to place purchase orders early, stay on top of vendor deliveries, and perform parts inspections as soon as possible post-delivery. Projects also benefit from having clearly articulated pass/fail criteria and established procedures for handling non-conformance – all of which alleviates the need to make critical go/no-go acceptance decisions in the face of schedule pressures.

As with many accelerator projects, LCLS-II is not an end-point in itself, more an evolutionary transition within a longer term roadmap

Finally, it’s worth highlighting the broader impact – both personal and professional – on individual team members participating in a big-science collaboration like LCLS-II. At the end of the build, what remained after designs were completed, problems solved, production rates met, and cryomodules delivered and installed, were the friendships that had been nurtured over several years. The collaboration amongst partners, both formal and informal, who truly cared about the project’s success and had each other’s backs when issues arose: this is what solidified the mutual respect and camaraderie and, in the end, made LCLS-II such a rewarding project.

First light

In April 2022 the new LCLS-II linac was successfully cooled to its 2 K operating temperature. The next step was to pump the SRF cavities with more than a megawatt of microwave power to accelerate the electron beam from the new source. Following further commissioning of the machine, first X-rays are expected to be produced in early 2023. 

As with many accelerator projects, LCLS-II is not an end-point in itself, more an evolutionary transition within a longer term roadmap. In fact, work is already under way on LCLS-II-HE – a project that will increase the energy of the CW SRF linac from 4 to 8 GeV, enabling the photon energy range to be extended to at least 13 keV, and potentially up to 20 keV, at 1 MHz repetition rates. To ensure continuity of production for LCLS-II-HE, 25 next-generation cryomodules are in the works, with even higher performance specifications than their LCLS-II counterparts, while upgrades to the source and beam transport are also being finalised.

While the fascinating science opportunities for LCLS-II-HE continue to be refined and expanded, of one thing we can be certain: strong collaboration and the collective efforts of the participating teams are crucial. 

SESAME revives the ancient Near East

The IR microscope at SESAME

The Synchrotron-light for Experimental Science and Applications in the Middle East (SESAME) is a 2.5 GeV third-generation synchrotron radiation (SR) source developed under the auspices of UNESCO and modelled after CERN. Located in Allan, Jordan, it aims to foster scientific and technological excellence as well as international cooperation amongst its members, which are currently Cyprus, Egypt, Iran, Israel, Jordan, Pakistan, Palestine and Turkey. As a user facility, SESAME hosts visiting scientists from a wide range of disciplines, allowing them to access advanced SR techniques that link the functions and properties of samples and materials to their micro, nano and atomic structure.

The location of SESAME is known for its richness in archaeological and cultural heritage. Many important museums, collections, research institutions and universities host departments dedicated to the study of materials and tools that are inextricably linked to prehistory and human history, demanding interdisciplinary research agendas and teams. As materials science and condensed-matter physics play an increasing role in understanding and reconstructing the properties of artefacts, SESAME offers a highly versatile tool for the researchers, conservators and cultural-heritage specialists in the region.

The high photon flux, small source size and low divergence available at SR sources allow for advanced spectroscopy and imaging techniques that are well suited to studying ancient and historical materials, which often present very complex and heterogeneous structures. SR techniques are non-destructive, and the existence of several beamlines at SR facilities means that samples can easily be transferred and reanalysed using complementary techniques.

SESAME offers a versatile tool for researchers, conservators and cultural-heritage specialists in the region

At SESAME, an infrared microspectroscopy beamline, an X-ray fluorescence and absorption spectroscopy beamline, and a powder diffraction beamline are available, while a soft X-ray beamline called HESEB, designed and constructed by five Helmholtz research centres, is now being commissioned. Next year the BEAmline for Tomography at SESAME (BEATS), a beamline for hard X-ray full-field tomography, will also be constructed and commissioned. BEATS involves INFN, The Cyprus Institute and the European SR facilities ALBA-CELLS (Spain), DESY (Germany), ESRF (France), Elettra (Italy), PSI (Switzerland) and SOLARIS (Poland).

To explore the potential of these beamlines, the First SESAME Cultural Heritage Day took place online on 16 February with more than 240 registrants from 39 countries. After a welcome by SESAME director Khaled Toukan and president of council Rolf Heuer, Mohamed ElMorsi (Conservation Centre, National Museum of Egyptian Civilization), Marine Cotte (ESRF) and Andrea Lausi (SESAME) presented overviews of ancient Egyptian cultural heritage, heritage studies at the ESRF, and the experimental capabilities of SESAME, respectively. This was followed by several research insights obtained from studies at SESAME and other SR facilities: Maram Na'es (TU Berlin) showed the reconstruction of colour in Petra paintings; Heinz-Eberhard Mahnke and Verena Lepper (Egyptian Museum and Papyrus Collection, FU/HU Berlin and HZB) explained how to analyse ancient Elephantine papyri using X-rays and tomography; Amir Rozatian (University of Isfahan) and Fatma Marii (University of Jordan) determined the material of pottery, glass, metal and textiles from Iran and of ancient glass from the Petra church; and Gonca Dardeniz Arıkan (Istanbul University) provided an overview of current research into the metallurgy of Iran and Anatolia, the origins of glassmaking, and the future of cultural-heritage studies in Turkey. Palaeontology with computed tomography and bioarchaeological samples were highlighted in talks by Kudakwashe Jakata (ESRF) and Kirsi Lorentz (The Cyprus Institute).

During the following discussions, it was clear that institutions devoted to the research, preservation and restoration of materials would benefit from developing research programmes in close cooperation with SESAME. Because of the multiple applications in archaeology, palaeontology, palaeo-environmental science and cultural heritage, it will be necessary to establish a multi-disciplinary working group, which should also share its expertise on practical issues such as handling, packaging, customs paperwork, shipping and insurance. 

Accelerating a better world

IAEA

Tens of thousands of accelerators around the world help create radiopharmaceuticals, treat cancer, preserve food, monitor the environment, strengthen materials, understand fundamental physics, study the past, and even solve crimes.

A first-of-its-kind international conference, Accelerators for Research and Sustainable Development: From Good Practices Towards Socioeconomic Impact, was organised by the International Atomic Energy Agency (IAEA) at its headquarters in Vienna from 23 to 27 May. It was held as a hybrid event attended by around 500 scientists from 72 IAEA member states. While focusing mainly on applications of accelerator science and technology, the conference was geared towards accelerator technologists, operators, users, entrepreneurs and other stakeholders involved in applications of accelerator technologies, as well as policy makers and regulators.

The far-reaching capabilities of accelerator technology help countries progress towards sustainable development

Rafael Mariano Grossi

“The far-reaching capabilities of accelerator technology help countries progress towards sustainable development,” said IAEA director general Rafael Mariano Grossi in his opening address. “IAEA’s work with accelerators helps to fulfil a core part of its ‘Atoms for Peace and Development’ mandate.” He also highlighted how accelerator technology plays a critical role in two IAEA initiatives launched over the past year: Rays of Hope, aimed at improving access to radiotherapy and cancer care in low- and middle-income countries, and NUTEC Plastics, supporting countries in addressing plastic-waste issues in the ocean and on land. Finally, he described IAEA plans to establish an accelerator of its own: a state-of-the-art ion-beam facility in Seibersdorf, Austria, that will support research and help educate and train scientists.

The conference included sessions dedicated to case studies demonstrating socioeconomic impact as well as best practices in effective management, safe operation, and the sustainability of present and future accelerator facilities. It showcased the rich diversity in types of accelerators – from large-scale synchrotrons and spallation neutron sources, or medical cyclotrons and e-beam irradiators used for industrial applications, to small-scale electrostatic accelerators and compact-accelerator-based neutron sources – and included updates on emerging accelerator technologies, such as laser-driven neutron and X-ray sources and their future applications. Six plenary sessions featuring 16 keynote talks captured the state of the art in various application domains, accompanied by 16 parallel sessions and two poster sessions by young researchers.

During the summary and highlights session, important developments and future trends were presented:

• Large-scale accelerator facilities under development across the world – notably FAIR in Germany, SPIRAL-2 in France, FRIB in the US, RIBF in Japan, HIAF in China, RAON in Korea, DERICA in Russia and MYRRHA in Belgium – boost the development of advanced accelerator technologies, which are expected to deliver high-impact socioeconomic applications. Substantial interdisciplinary research programmes are foreseen from the outset, and the IAEA could play an important role by strengthening the links and cooperation between all parties.

• Recent technology developments in compact-accelerator neutron sources (CANS) and high-power CANS (HiCANS) are very promising. Among many projects, ERANS at RIKEN in Japan aims to realise a low-cost CANS capable of providing 10¹² n/s for applications in materials research, and ERANS-III a transportable CANS for testing the structure of bridges. On the HiCANS front, the French SONATE project aims to reach neutron-flux levels comparable, at least for some applications, to those of the ageing fleet of low- and medium-power research reactors.

• CANS technology is also promising in the fight against cancer, for example via the Boron Neutron Capture Therapy (BNCT) method. Japan leads the way, operating or constructing 10 such in-hospital facilities, with only a few other countries, e.g. Finland, considering similar technologies. Recent developments suggest that accelerator-based BNCT treatments will soon become more widely accepted. The IAEA could play an important coordinating role and act as a technology bridge to developing countries to enable more widespread adoption.

• The role of accelerators in preserving cultural-heritage objects and in detecting forgeries is becoming more vital, especially in countries that do not have the required capabilities. Ion-beam analysis and accelerator mass-spectrometry techniques are of particular relevance, and, again, the IAEA can assist by coordinating actions to disseminate knowledge, educating the relevant communities and possibly centralising the demands for expertise.

• The IAEA could simplify the supply of accelerator technologies between different member states, enabling the installation and operation of facilities in low- and middle-income countries, for example by structuring the scientific and technical accelerator communities, and by educating young researchers and technicians via dedicated training schools.

• One of the IAEA’s projects is to establish a state-of-the-art ion-beam facility in Austria. This will enable applied research and the provision of analytical services, help educate and train scientists on the diverse applications of ion beams (including the production of secondary particles such as neutrons), and enhance collaborations with both developed and developing countries.

• Ion-beam analysis (IBA) and accelerator mass spectrometry (AMS) techniques are unique, reliable and cost-effective for environmental monitoring and climate-change-related studies, for example in characterising environmental samples and in isotope-ratio studies for chronology and environmental remediation. AMS facilities with smaller footprints have spread worldwide, making measurements accessible and affordable for interdisciplinary research, while other IBA techniques offer efficient analytical methods to characterise the chemical composition of particles from air pollution.

• Materials science and accelerators are now moving ahead hand in hand, from the characterisation to the modification of technologically important materials, including semiconductors, nano-materials, materials for emerging quantum technologies and materials relevant to energy production. Testing materials with accelerator-based light- and heavy-ion beams remains a unique capability in the case of fusion materials and offers much faster radiation-damage studies than irradiation facilities at research reactors. Equally important is the accelerator-assisted creation of gaseous products such as hydrogen and helium, which allows the radiation resilience of materials in unmoderated neutron systems, such as fast-fission and fusion reactors, to be tested.

• New developments in electron-beam accelerators for industrial applications were also discussed, in particular their application to pollution control. E-beam technologies are also widely employed in food safety: reducing spoilage by extending the shelf-life of foods and reducing the potential for pathogens in and on foods will become major drivers for the adoption of these technologies, for which a deeper understanding of the related radiation effects and of resistance to radiation is required.

Accelerator technologies evolve rapidly, presenting a challenge for the regulatory bodies that authorise and inspect accelerator facilities and activities. This conference demonstrated that, thanks to recent technological breakthroughs in accelerator technology and associated instrumentation, accelerators are becoming an equally attractive alternative to conventional sources of ionising radiation such as gamma irradiators and research reactors. Based on the success of this conference, the IAEA is expected to launch a new series of accelerator-community gatherings, held every two to three years.

 
