A versatile ventilator developed by members of the LHCb collaboration to help combat COVID-19 is to be re-engineered for manufacture and clinical use. The High Performance Low-cost Ventilator (HPLV) is designed to assist patients in low- and middle-income countries suffering from severe respiratory problems as a result of COVID-19. Following the award of £760,000 by UK Research and Innovation, announced in December, Ian Lazarus of the Science and Technology Facilities Council’s Daresbury Laboratory and co-workers aim to produce and test plans for an affordable, reliable and easy-to-operate ventilator that does not rely heavily on compressed gases or a mains electricity supply.
“I am proud to be leading the HPLV team in which we have brought together experts from medicine, science, engineering and knowledge transfer with a shared goal to make resilient high-quality ventilators available in areas of the world that currently don’t have enough of them,” said Lazarus in a press release.
While the majority of people who contract COVID-19 suffer mild symptoms, in some cases the disease can cause severe breathing difficulties and pneumonia. For such patients, the availability of ventilators that deliver oxygen to the lungs while removing carbon dioxide is critical. Commercially available ventilators are typically costly, require considerable experience to use, and often rely on the provision of high-flow oxygen and medically pure compressed air, which are not readily available in many countries.
The HPLV takes as its starting point the High Energy physics Ventilator (HEV), which was inspired by an initiative at the University of Liverpool and developed at CERN in March 2020 during the first COVID-19 lockdown. The idea emerged when physicists and engineers in LHCb’s vertex locator (VELO) group realised that the systems which are routinely used to supply and control gas at desired temperatures and pressures in particle-physics detectors are well matched to the techniques required to build and operate a ventilator (CERN Courier May/June 2020 p8). HPLV will see the hardware and software of HEV adapted to make it ready for regulatory approval and manufacture. Project partners at the Federal Institute of Rio de Janeiro in Brazil – in collaboration with CERN, the University of Birmingham, the University of Liverpool and the UK’s Medical Devices Testing and Evaluation Centre – will now identify difficulties encountered when ventilating patients and pass that information to the design team to ensure that the HPLV is fit for purpose.
“We warmly welcome the HPLV initiative, and look forward to working together with the outstanding HPLV team for our common humanitarian goal,” says Paula Collins, who co-leads the HEV project with CERN and LHCb colleague Jan Buytaert. The HPLV is one of several HEV offshoots involving 25 academic partners, she explains. “In December we also saw the first HEV prototypes to be constructed outside CERN, at the Swiss company Jean Gallay SA, which specialises in engineering for aerospace and energy. We have continued our outreach worldwide, and in particular wish to highlight an agreement being built up with a company in India that plans to modify the HEV design for local needs. None of this would have been possible without the incredible support and advice received from the medical community.”
The ability to accelerate charged particles using the “wakefields” of plasma density waves offers the promise of high-energy particle accelerators that are more compact than those based on radio-frequency cavities. Proposed in 1979, the idea is to create a wave inside a plasma upon which electrons can “surf” and gain energy over short distances. Although highly complex, wakefield acceleration (WFA) driven by laser pulses or electron beams has been successfully used to accelerate electron beams to tens of GeV within distances of less than a metre, and the AWAKE experiment at CERN is attempting to achieve higher energy gains by using protons as drive beams. Recent studies suggest that WFA may also occur naturally, potentially offering an explanation for some of the highest energy cosmic rays ever observed.
So-called Fermi acceleration, first proposed by Enrico Fermi in 1949, is considered to be the main mechanism responsible for high-energy cosmic rays. In this process, charged particles are accelerated by relativistic shockwaves occurring within jets emitted by black-hole binaries, active galactic nuclei or gamma-ray bursts, to name just a few sources. Each time a charged particle travelling within the jet passes through the shock wave it gains energy, until the magnetic field in the environment can no longer contain it. This process predicts the observed power-law spectrum of cosmic rays quite well, at least up to energies of around 10¹⁹ eV. Beyond this energy, however, Fermi acceleration becomes less efficient as the particles start to lose energy through collisions and/or synchrotron radiation. The existence of ultra-high-energy cosmic rays (UHECRs), which have been observed at energies up to 10²¹ eV, indicates that a different acceleration mechanism could be at play in that energy domain. Thanks to its very high efficiency, WFA could provide such a mechanism.
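The emergence of a power-law spectrum from Fermi acceleration can be illustrated with a toy Monte Carlo – a pedagogical sketch, not a model of relativistic shocks: each shock-crossing cycle multiplies a particle’s energy by a fixed factor, and the particle escapes the acceleration region with a fixed probability per cycle. The surviving population then follows a power law whose index is set by the ratio of the two. All parameter values below are illustrative.

```python
import random

def fermi_spectrum(n_particles=100_000, gain=0.1, p_escape=0.1, seed=1):
    """Toy first-order Fermi acceleration.

    Each cycle a confined particle crosses the shock and its energy
    grows by a fraction `gain`; with probability `p_escape` it instead
    leaves the acceleration region. Energies are returned in units of
    the injection energy. Parameter values are purely illustrative.
    """
    rng = random.Random(seed)
    energies = []
    for _ in range(n_particles):
        e = 1.0
        while rng.random() > p_escape:  # still confined: one more crossing
            e *= 1.0 + gain
        energies.append(e)
    return energies

# The integral spectrum N(>E) scales as E**alpha with
# alpha = ln(1 - p_escape) / ln(1 + gain) (about -1.1 for these values):
# a power law, as observed for cosmic rays.
```

Making the escape probability grow with energy, mimicking a magnetic field that can no longer contain the particle, steepens the spectrum at the high-energy end – the regime where the text notes Fermi acceleration becomes inefficient.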
Although there are clearly no laser beams in astrophysical objects, plasma fields that can support waves are found in many astrophysical settings. For example, in theories developed by Toshiki Tajima of the University of California at Irvine (UCI), one of the inventors of WFA, waves could be produced by instabilities in the accretion disks around compact objects such as black holes. These accretion disks can periodically transition from a highly magnetised to a weakly magnetised state, emitting electromagnetic waves that can propagate into the disk’s jets in the form of Alfvén waves. As these waves continue to propagate along the jets they transform back into electromagnetic waves that can accelerate electrons on the front of the plasma’s “bow wake” and protons on the back of it.
Clear predictions
The energies that are theoretically achievable in cosmic-ray WFA depend on the mass of the compact object, as do the periodicities with which such waves can be produced. This allows clear predictions to be made for a range of different objects, which can be tested against observational data.
Groups based at UCI and at RIKEN in Japan recently tested these predictions on a range of astrophysical objects, spanning 1 to 10⁹ solar masses. Although not conclusive, these first comparisons between theory and observations reveal several interesting features that require further investigation. For example, WFA models predict the periodic emission of UHECRs – the protons accelerated on the back of the bow wake – in coincidence with electromagnetic radiation produced by the electrons on its front. Due to interactions with the intergalactic medium, UHECRs are also expected to produce secondary particles, including neutrinos. WFA could thereby also explain periodic outbursts of neutrinos in coincidence with gamma-rays from, for example, blazars, for which evidence was recently found by the IceCube experiment in collaboration with a range of electromagnetic instruments. Additionally, WFA could explain the non-uniformity of the UHECR sky recently reported by the Pierre Auger Observatory (see CERN Courier December 2017 p15), as it allows cosmic rays with energies up to 10²⁴ eV to be produced by objects that lie within the location of the observed hot-spot.
In concert with future space-based UHECR detectors such as JEM-EUSO and POEMMA, further analysis of existing data should definitively answer the question of whether WFA does indeed occur in space. The clear predictions relating to periodicity, and the coincident emission of neutrinos, gamma-rays and other electromagnetic radiation, make it an ideal subject to study within the multi-messenger frameworks that are currently being set up.
A CERN-based effort to bring about the next generation of hadron-therapy facilities has obtained new funding from the European Commission (EC) to pursue technology R&D. CERN’s Next Ion Medical Machine Study (NIMMS) aims to drive a new European effort for ion-beam therapy based on smaller, cheaper accelerators that allow faster treatments, operation with multiple ions, and patient irradiation from different angles using a compact gantry system. Its predecessor, the Proton-Ion Medical Machine Study (PIMMS), which was undertaken at CERN during the late 1990s, underpinned the CNAO (Italy) and MedAustron (Austria) treatment centres that helped propel Europe to the forefront of hadron therapy.
Covering the period 2021–2024, two recently approved EC Horizon 2020 Research Infrastructure projects will support NIMMS while also connecting its activities to collaborating institutes throughout Europe. The multidisciplinary HITRIplus project (Heavy Ion Therapy Research Integration) includes work packages dedicated to accelerator, gantry and superconducting magnet design. The IFAST project (Innovation Fostering in Accelerator Science and Technology) will include activities on prototyping superconducting magnets for ion therapy with industry, together with many other actions related to advanced accelerator R&D.
“Over the past three years we have collected about €4 million of EC contributions, directed to a collaboration of more than 15 partners, representing about a factor of eight leverage on the original CERN funding,” says NIMMS project leader Maurizio Vretenar. “A key achievement was the simultaneous approval of HITRIplus and IFAST because they contain three strong work packages built around the NIMMS work-plan and associate our work with a wide collaboration of institutes.”
A major NIMMS partner is the new South East European International Institute for Sustainable Technologies (SEEIIST), an initiative started by former CERN Director-General Herwig Schopper and former minister of science for Montenegro Sanja Damjanovic, which aims to build a pan-European facility for cancer research and therapy with ions in South East Europe. CNAO and MedAustron are closely involved in the superconducting gantry design, CIEMAT in Spain will build a high-frequency linac section, and INFN is developing new superconducting magnets, with the TERA Foundation continuing to underpin medical-accelerator R&D.
MEDICIS success
Also successful in securing new Horizon 2020 funding is a project built around CERN’s MEDICIS facility, which is devoted to the production of novel radioisotopes for medical research together with institutes in life and medical sciences. The PRISMAP project (the European medical isotope programme) will bring together key facilities in the provision of high-purity-grade new radionuclides to advance early-phase research into radiopharmaceuticals, targeted drugs for cancer, “theranostics” and personalised medicine in Europe.
A successful programme towards this goal was developed by MEDICIS during the past two years, with partner institutes providing sources that were purified on a MEDICIS beamline using mass separation, explains Thierry Stora of CERN. “Our programme was particularly impressive this year, with record separation efficiencies of more than 50% achieved for ¹⁶⁷Tm – the first medical isotope produced at CERN, 40 years ago, with somewhat lower efficiencies,” he says. “It also allowed the translation of ¹⁵³Sm, already used in low-specific-activity grades for palliative treatments, to R&D for new therapeutic applications.” MEDICIS is now concluding its programme with the separation of ²²⁵Ac, a fast-emerging radionuclide for the rising field of targeted alpha therapy. “Isotope mass separation at MEDICIS acted as a catalyst for the creation of the European medical isotope programme,” says Stora, who leads the MEDICIS facility.
Together with other project consortia, the MEDICIS and HITRIplus teams are also working to identify the relevance of their research for the EC’s future cancer mission, which is part of its next framework programme, Horizon Europe, beginning this year.
Two further EC Horizon 2020 projects launched by CERN – AIDAinnova, which will enable collaboration on common detector projects, and RADNEXT, which will provide a network of irradiation facilities to test state-of-the-art microelectronics – were approved in November. “These results demonstrate CERN’s outstanding success rate in research-infrastructure projects,” says Svet Stavrev, head of CERN’s EU projects management and operational support section. “Since the beginning of the programme, Horizon 2020 has provided valuable support to major projects, studies and initiatives for accelerator and detector R&D in the particle-physics community.”
The recent update of the European strategy for particle physics (ESPP) offered a unique opportunity for early-career researchers (ECRs) to shape the future of our field. Mandated by the European Committee for Future Accelerators (ECFA) to provide input to the ESPP process, a diverse group of about 180 ECRs was nominated to debate topics including the physics prospects at future colliders and the associated implications for their careers. A steering board of around 25 ECRs organised working groups devoted to detector and accelerator physics and to key areas of high-energy-physics research. Further working groups were dedicated to the environment and sustainability, and to human and social factors – aspects that were overlooked in previous ESPP exercises. A debate took place in November 2019, and a survey was launched to obtain a quantitative understanding of the views raised.
The feedback from these activities was combined into a report reflecting the opinions of almost 120 signed authors. The survey suggests that about half of the respondents are postdocs, around two-fifths PhD students and approximately a tenth staff members; roughly one-third were female and two-thirds male. Several areas, such as which collider should follow the LHC, and environmental and sustainability considerations, were highlighted by the participating ECRs. Among the many topics discussed, we highlight here a handful of aspects that we feel are key to the future of our field.
Building a sustainable future
A widespread concern is that the attractiveness of our field is at risk, and that dedicated actions need to be taken to safeguard its future. Certain areas of work are vital to the field but undervalued, resulting in shortages of key skills. Owing to significant job insecurity, many ECRs struggle to maintain a healthy work–life balance. Moreover, the lack of attractive career paths in science, compared with the flexible working hours and family-friendly policies offered by many companies these days, potentially compromises the ability of our field to attract and retain the brightest minds in both the short and the long term. And with funding for the proposed Future Circular Collider – a key pillar of the ESPP recommendations – not yet secured, despite it receiving the largest support among future-collider scenarios in CERN’s latest medium-term financial plan, ECRs face the additional risk of backing the wrong horse.
It is imperative to holistically include social and human factors when planning for a sustainable future of our field. Therefore, we strongly recommend that long-term project evaluations and strategy updates assess and include the impact of their implementation on the situation of young academics. Specifically, equal recognition and career paths for domains such as computing and detector development have to be established to maintain expertise in the field.
Next-generation colliders beyond the LHC will need to overcome major technical challenges in detector physics, software and computing to meet their ambitious physics goals. Our survey and debate showed that young researchers are concerned about a shortage of experts in these domains, where very few staff positions – and even fewer professorships – are open to particle physicists specialised in detector development, software and computing. Particularly in light of ever-increasing project timescales, a sizeable fraction of funding for non-permanent positions must be converted into funding for permanent positions in order to establish a sustainable ratio between fixed-term postdocs and staff scientists.
The possibility of a healthy work–life balance, and of reconciling family life with a scientific career, is a must: currently, most of the ECRs consulted think that having children could damage their future, and moving between countries is generally a requirement for a career in particle physics. These may be two reasons why only 20% of the polled ECRs have children. Put in a broader perspective, the future of the field will depend on the success of reaching a diverse community, with viable career paths for a wide spectrum of ways of life. Reaching this diverse community will take more than simply offering more day-care places to parents. The #BlackInTheIvory movement in 2020, for example, shone a spotlight on the significant barriers faced by the Black community in academia – barriers also shared by many other minority groups. Discrimination in academia has to be counteracted systematically, including in the filling of positions and in grant-approval processes, where societal and diversity aspects must be taken into account with high priority.
The environmental sustainability of future projects is a clear concern for young researchers, and particle-physics institutes should use their prominent position in the public eye to set an example to other fields and society at large. The energy efficiency of equipment and the power consumption of future collider scenarios are considered only partially in the ESPP update, and we support the idea of preparing a more comprehensive analysis that includes the environmental impact of the construction as well as the disposal of large infrastructures. There should be further discussion of nuclear versus renewable energy usage and a concrete plan on how to achieve a higher renewable energy fraction. The ECRs were also of the view that much travel within our field is unnecessary, and that ways to reduce this should be brought to the fore. Since the survey was conducted, due to the ongoing COVID-19 pandemic, various conferences have already moved online, proving that progress can be made on this front.
Collider preference
In the context of the still-open questions in particle physics and the potential challenges of future research programmes, the ECRs find dark matter, electroweak symmetry breaking and neutrino physics to be the three most important topics of our field. They also underline the importance of a European collider project soon after the completion of the HL-LHC. Postponing the choice of CERN’s next collider project to the 2030s, for example, could harm the future of the field: there would be fewer permanent jobs in detector physics, computing and software if preparations for future experiments cannot begin after the current upgrades. It could also be difficult to attract bright young minds into the field if there is a gap in data-taking after the LHC. While physics topics were already discussed in great detail during the broader ESPP process, many ECRs expressed discomfort with the way the next-generation scenarios were compared, in particular that the different states of maturity of the projects were not sufficiently taken into account.
About 90% of ECRs believe that the next collider should be an electron–positron machine, concurring with the ESPP recommendations, although there is no strong preference as to whether this machine should be linear or circular. While there was equal preference for CLIC and FCC-ee as the next-generation collider, a clear preference was expressed for the full FCC programme over the full CLIC programme. Given the diverse interest in future collider scenarios, and keeping in mind the unclear situation of the ILC, we strongly believe that a robust and diverse R&D programme on both accelerators and detectors must be a high priority for the future of our field.
In conclusion, both the debate and the report were widely viewed as a success, with extremely positive feedback from ECFA and the ECRs. Young researchers were able to share their views and concerns for the future of the field, while familiarising themselves with and influencing the outcome of the ESPP. ECFA has now established a permanent panel of ECRs, which is a major milestone to make such discussions among early-career researchers more regular and effective in the future.
Charm and beauty quarks are excellent probes of the hot and dense state of deconfined quarks and gluons (quark–gluon plasma, QGP) which is created in high-energy heavy-ion collisions. These heavy quarks are produced in hard-scattering processes at the early stages of the collisions, and interact with the constituents of the newly created QGP through both elastic and inelastic processes. These quarks, which can be studied through their decays into leptons, lose energy while propagating through the QGP medium. Consequently, different production yields are observed at large momenta in nucleus–nucleus collisions compared to proton–proton collisions. This effect can be quantified using the nuclear modification factor, RAA, which is the ratio of nucleus–nucleus and proton–proton particle yields, scaled by the average number of binary nucleon–nucleon collisions. Comparing measurements in different collision systems sheds light on heavy-quark energy-loss mechanisms, and provides high-precision tomography of the QGP.
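The definition of the nuclear modification factor translates directly into code. The numbers below are invented for illustration and are not ALICE measurements:

```python
def nuclear_modification_factor(yield_aa, yield_pp, n_coll):
    """R_AA = (per-event yield in AA) / (<N_coll> x per-event yield in pp).

    R_AA close to 1 means the nucleus-nucleus collision behaves like an
    incoherent superposition of <N_coll> nucleon-nucleon collisions;
    R_AA < 1 signals suppression, e.g. from in-medium energy loss.
    """
    return yield_aa / (n_coll * yield_pp)

# Hypothetical yields: a suppression by a factor of ~2.5 at high pT
# corresponds to R_AA ~ 0.4.
print(nuclear_modification_factor(yield_aa=160.0, yield_pp=1.0, n_coll=400))  # 0.4
```

In practice the yields are momentum-differential spectra and ⟨N_coll⟩ is estimated per centrality class from a Glauber model, but the ratio has exactly this structure.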
A new analysis by the ALICE collaboration compares the production of leptons from heavy-flavour hadron decays in Pb–Pb and Xe–Xe collisions at √s_NN = 5.02 and 5.44 TeV, respectively. The measurements use the muon and electron decay channels at forward rapidity and mid-rapidity. The results show that collision geometry plays an important role in heavy-quark energy loss.
Remarkable agreement
A remarkable agreement is observed between the muon yields in head-on Xe–Xe collisions and slightly offset Pb–Pb collisions (figure 1, left). Given the larger size of the lead nucleus, these collision centrality classes – 0–10% and 10–20%, respectively – give rise to similar charged-particle multiplicities, and thus suggest the creation of similar QGP densities and sizes in the colliding systems.
In both cases, the production of muons from heavy-flavour hadron decays is suppressed by up to a factor of about 2.5 for 5 GeV < pT < 6 GeV. This suppression is successfully reproduced by the MC@sHQ+EPOS2 model, which considers both elastic and inelastic energy-loss processes of the heavy quarks in the QGP, but is underestimated by the PHSD model, which only includes elastic processes. The analysis also achieved ALICE’s first sensitivity down to pT = 0.2 GeV, thanks to a lower magnetic field (0.2 T) in the solenoid magnet (figure 1, right). The suppression pattern for muons and electrons from heavy-flavour hadron decays is similar at both forward and mid-rapidity, indicating that heavy quarks interact strongly with the medium over a wide rapidity interval. The suppression is smaller in these “glancing” semi-central collisions than in the previously discussed head-on collisions, which is compatible with the hypothesis that the in-medium energy loss depends on the energy density and the size of the system created in the collision.
The precision of the measurements brings new insights into the nature of parton energy loss and new constraints on the modelling of its dependence on the size of the QGP medium in transport-model calculations. Further constraints will be set by future higher-precision measurements during Run 3, when ALICE will measure leptons from charm and beauty decays separately, at both central and forward rapidity. A short run with the much smaller oxygen–oxygen system may also be scheduled, contributing to a deeper understanding of the dependence of in-medium energy loss on system size for heavy quarks.
The recent Future Circular Collider (FCC) workshop, held online from 9 to 13 November, brought together roughly 500 scientists, engineers and stakeholders to prepare a circular-collider-oriented roadmap towards the realisation of the vision of the European strategy for particle physics: to prepare a Higgs factory followed by a future hadron collider with sensitivity to energy scales an order of magnitude higher than at the LHC.
The meeting combined the fourth FCC physics week with the kick-off event for the EU-funded Horizon 2020 FCC Innovation Study (FCCIS). A successor to the EuroCirCol project, which was completed in 2019 and supported the preparation of the FCC conceptual design report (CDR), FCCIS will support a feasibility study of a 100 km-circumference collider that could host an intensity-frontier electron–positron Higgs and electroweak factory (FCC-ee), followed by a 100 TeV energy-frontier hadron collider (FCC-hh) – an integrated scheme that EuroCirCol showed to be doable “in principle”. Key advantages of the FCC design are its multiple interaction points, high beam luminosities and long-term science mission covering both the precision and energy frontiers over several decades (see FCC-ee: beyond a Higgs factory). The design must now be validated. “The feasibility study of FCC is particularly challenging and will require the hard work, dedication and enthusiasm of the full FCC community,” noted CERN Director-General Fabiola Gianotti.
Unprecedented capabilities
The main goal of the study, said FCC-study project-leader Michael Benedikt, is to demonstrate the practical feasibility of delivering the unprecedented luminosities and precise energy-calibration capabilities of the proposed electroweak factory in a modular fashion. The study will also incorporate a socio-economic impact analysis and an implementation plan for an infrastructure that could fit in the global research landscape, he said. The feasibility study – a “CDR++” – will be prepared by 2025/2026, in time for the next strategy update.
A key consideration for FCC-ee that was discussed at the meeting is the development of a complete collider design with full beam-dynamic simulations and a complete injector. Continuous top-up injection, from a full-energy booster ring installed next to the collider, will lead to stable operation and maximum integrated luminosity, offering availability for physics runs of more than 80%. A series of tests in research facilities around Europe, including at PETRA-III (DESY), KARA (KIT), DAΦNE (Frascati), and potentially other facilities such as VEPP-4M (BINP), will provide the opportunity to validate the concepts. Developing a staged superconducting radio-frequency system is another major challenge. Multi-cell 400 MHz Nb/Cu cavities required for the Higgs-factory operation mode will be available within five years, alongside a full cryomodule. A mock-up of a 25 m-long full-arc half-cell of the FCC-ee is expected for 2025. Such cells will cover about 80 km of FCC-ee’s 100 km circumference.
Physics-analysis questions were also at the forefront of participants’ minds. “We are confronted with three deep and pressing questions when we observe our universe,” noted ECFA chair Jorgen D’Hondt. “What is the mechanism responsible for the transition from massless to massive particles? What are the processes that lead to the breaking of symmetry between particles and antiparticles? And how is the observed universe connected to what remains invisible to us?” Theorist Christopher Grojean (DESY) showed that electroweak, Higgs and flavour data from FCC-ee, in conjunction with astrophysical and cosmological observations, have the potential to break through the armour of the Standard Model and begin to tackle these questions. Discussions explored the need to halve theoretical uncertainties and hone detector designs to match the high statistical precision offered by the FCC-ee, and the possibility of complementing FCC-ee with a linear collider such as the proposed International Linear Collider, which could access higher energies.
Strong message
The November FCC workshop paved the way for progress beyond the state-of-the-art in a variety of areas that could ensure the sustainable and efficient realisation of a post-LHC collider. A strong message from the workshop was that the FCC feasibility study must be a global endeavour that attracts industrial partners to co-develop key technologies, and inspires the next generation of particle physicists.
In June 2020, the CMS collaboration submitted a paper titled “Observation of the production of three massive gauge bosons at √s = 13 TeV” to the arXiv preprint server. A scientific highlight in its own right, the paper also marked the collaboration’s thousandth publication. ATLAS, currently at 964 publications, is not far from the same milestone. With the rest of the LHC experiments taking the total number of papers to 2852, the first ten years of LHC operations have generated a bumper crop of new knowledge about the fundamental particles and interactions.
The publication landscape in high-energy physics (HEP) is exceptional, owing to the field’s long-standing preprint culture. From the 1950s, paper copies were kept in the well-known red cabinets outside the CERN Library (pictured), but since 1991 preprints have been stored electronically at arXiv.org. Preprint posting and journal publication tend to happen in parallel, and citations between all types of publications are compiled and counted in the INSPIRE system.
Particle physics has been at the forefront of the open-science movement in publishing, software, hardware and, most recently, data. In 2004, former Director-General Robert Aymar encouraged the creation at CERN of SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). Devoted to converting closed-access HEP journals to open access, it has grown extensively and now counts over 3000 member libraries from 44 countries. All original LHC research results have been published open access. The first collaboration articles by the four main experiments, describing the detector designs and published in the Journal of Instrumentation, remain amongst the most cited articles from the LHC collaborations and – despite being more than a decade old – are still among the journal’s most-read articles.
Closer analysis
Since then, along with the 2852 publications by CERN’s LHC experiments, a further 380 papers have been written by individuals on behalf of the collaborations, and another 10,879 documents (preprints, conference proceedings, etc.) from the LHC experiments were not published in a journal. But this represents only part of the scientific impact of the LHC: tens of thousands of papers published over the past decade discuss the LHC experiments, use their data or build on their findings. The papers published by the four experiments received on average 112 citations each, compared with an average of 41 citations per paper across all experimental papers indexed in INSPIRE, and just 30 citations per paper across all HEP publications (4.8 million citations across 163,000 documents since 2008). Unsurprisingly, the citation counts peak with the CMS and ATLAS papers on the Higgs discovery, with 10,910 and 11,195 citations respectively, which at the end of 2019 were the two most cited high-energy-physics papers released in the past decade.
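The field-wide average quoted above follows directly from the INSPIRE totals; redoing the division:

```python
# INSPIRE totals quoted in the text: 4.8 million citations spread
# across 163,000 HEP documents since 2008.
total_citations = 4_800_000
total_documents = 163_000

average = total_citations / total_documents
print(f"about {average:.1f} citations per HEP paper")  # ~29.4, i.e. roughly 30
```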
Large author numbers are another exceptional aspect of LHC-experiment publishing, with papers consistently carrying hundreds or even thousands of names. This culminated in a world record of 5,154 authors on a joint paper between CMS and ATLAS in 2015, which reduced the uncertainty on the measurement of the Higgs-boson mass to ±0.25%.
Teasing fluctuations
Ten years of LHC publications have established the Standard Model at unprecedented levels of precision. But they also reveal the hunger for new physics, as illustrated by the story of the 750 GeV diphoton ‘bump’. On 15 December 2015, ATLAS and CMS each presented an excess of diphoton events at 750 GeV in proton-collision data, fuelling rumours that a new particle could be showing itself. Although the significance of the excess was only 2σ and 1.6σ, respectively, theorists were quick to respond with an influx of hundreds of papers (see “750 shades of model building”). The excitement was dampened, however, when data released in August 2016 showed no further sign of the anomaly, and the bump became commonly recognised as a statistical fluctuation – part and parcel of the scientific process, if ruining the fun for the theorists.
With the LHC to continue operations to the mid-2030s, and only around 6% of its expected total dataset collected so far, we can look forward to thousands more publications about nature’s basic constituents being placed in the public domain.
The significant increase in luminosity targeted by the high-luminosity LHC (HL-LHC) demands large-aperture quadrupole magnets that can focus the proton beams more tightly as they collide. A total of 24 such magnets are to be installed on either side of the ATLAS and CMS experiments in time for HL-LHC operations in 2027, marking the first time that niobium-tin (Nb3Sn) magnet technology will be used in an accelerator.
Nb3Sn is a superconducting material with a critical magnetic field that far exceeds that of the niobium-titanium presently used in the LHC magnets, but once formed it becomes brittle and strain-sensitive, which makes it much more challenging to process and use.
The milestone signals the end of the prototyping phase for the HL-LHC quadrupoles
Giorgio Apollinari
Following the first successful test of a US-built HL-LHC quadrupole magnet at Brookhaven National Laboratory (BNL) in January last year—attaining a conductor peak field of 11.4 T and exceeding the required integrated gradient of 556 T in a 150 mm-aperture bore—a second quadrupole magnet has now been tested at BNL at nominal performance. Since the US-built quadrupole magnets must be connected in pairs before they can constitute fully operational accelerator magnets, the milestone signals the end of the prototyping phase for the HL-LHC quadrupoles, explains Giorgio Apollinari of Fermilab, who is head of the US Accelerator Upgrade Projects (AUP). “The primary importance is that we have entered the ‘production’ period that will make installation viable in early 2025. It also means we have satisfied the requirements from our funding agency and now the US Department of Energy has authorised the full construction for the US contribution to HL-LHC.”
Joint venture
The design and production of the HL-LHC quadrupole magnets are the result of a joint venture between CERN, BNL, Fermilab and Lawrence Berkeley National Laboratory, preceded by the 15-year-long US LHC Accelerator Research Program (LARP). The US labs are to provide a total of ten 9 m-long helium-tight vessels (eight for installation and two as spares) for the HL-LHC, each containing two 4.2 m-long magnets. CERN is also producing ten 9 m-long vessels, each containing a 7.5 m-long magnet. The six magnets to be placed on each side of ATLAS and CMS – four from the US and two from CERN – will be powered in series on the same electrical circuit.
The synergy between CERN and the US laboratories allowed us to considerably reduce the risks
Ezio Todesco
“The synergy between CERN and the US laboratories allowed us to considerably reduce the risks, have a faster schedule and a better optimisation of resources,” says Ezio Todesco of CERN’s superconductors and cryostats group. The quadrupole magnet programme at CERN is also making significant progress, he adds, with a short-model quadrupole having recently reached a record 13.4 T peak field in the coil, which is 2 T more than the project requirements. “The full series of magnets, sharing the same design and built on three sites, will also give very relevant information about the viability of future hadron colliders, which are expected to rely on massive, industrial production of Nb3Sn magnets with fields up to 16 T.”
Since the second US quadrupole magnet was tested in October, the AUP teams have completed the assembly of a third magnet and are close to completing the assembly of a fourth. Next, the first two magnets will be assembled in a single cold mass before being tested in a horizontal configuration and then shipped to CERN in time for the “string test” planned in 2023.
“In all activities at the forefront of technology, like in the case for these focusing Nb3Sn quadrupoles, the major challenge is probably the transition from an ‘R&D mentality’, where minor improvements can be a daily business, to a ‘production mentality’, where there is a need to build to specific procedures and criteria, with all deviations being formally treated and corrected or addressed,” says Apollinari. “And let’s not forget that the success of this second magnet test came with a pandemic raging across the world.”
After seven years of construction at the Joint Institute for Nuclear Research (JINR) in Dubna, Russia, the Booster synchrotron at the brand-new NICA (Nuclotron-Based Ion Collider Facility) complex has accelerated its first beam. On 19 December helium ions were injected into the synchrotron and a stable circulating beam was obtained at an energy of 3.2 MeV. The milestone marks an important step in establishing the NICA facility, which is expected to be completed in 2022.
At this energy, ordinary matter and the quark-gluon plasma coexist in a mixed phase
The NICA accelerator complex will allow studies of the properties of nuclear matter in the region of maximum baryonic density. By colliding heavy gold ions at energies corresponding to the deconfinement phase transition (4.5 GeV), NICA will access the transition of the quark-gluon plasma (QGP) into hadrons. At this energy, ordinary matter and the QGP are able to exist in a so-called mixed phase – complementing studies at higher energy colliders such as the LHC.
The NICA booster is a 211 m-circumference superconducting synchrotron that will accelerate beams to 500 MeV. It uses 2.2 m-long dipole and quadrupole magnets made up of a window-frame iron yoke and a winding of hollow niobium-titanium superconducting cable cooled by a two-phase helium flow. Beams will then be transported to a separate ring surrounding the booster, the Nuclotron, and accelerated to the GeV range. The Nuclotron was originally built between 1987 and 1992 as part of the Dubna “synchrophasotron modernisation” programme, and was Europe’s first superconducting accelerator of heavy ions to high energies. Finally, beams will be injected into two identical 503 m storage rings, which will collide beams at two detectors: the Multi-Purpose Detector (MPD) and the Spin-Physics Detector (SPD). The MPD is designed to study dense baryonic matter, while the SPD will study collisions between polarised beams of protons and deuterons.
The complex is one of six Russian “megascience” facilities that are part of the CREMLIN project, which aims to use large-scale science facilities to improve and strengthen relations and networks between European and Russian research infrastructures. The CREMLIN consortium comprises 19 European and Russian research infrastructures, including CERN and DESY. Other “megascience” facilities included in the project are the Super-Charm-Tau Factory at the Budker Institute of Nuclear Physics and the Special-purpose Synchrotron-Radiation Source (SSRS-4) at the NRC Kurchatov Institute.
“This is a historic moment for our laboratory and a great milestone in the realisation of our flagship megascience project – we have to thank the CREMLIN grant programme for helping us meet these challenges,” says Vladimir Kekelidze, the NICA project leader. “The final step before the physical launch of the Booster will be the adjustment of the beam-acceleration mode, which will then allow focus to switch to the construction of the beam-transport systems from the Booster to the Nuclotron.”
Materials exposed to the high-energy beams in a particle accelerator must fulfil a demanding checklist of mechanical, electrical and vacuum requirements. While the structural function comes from the bulk materials, many other properties are ascribed to a thin surface layer, sometimes just a few tens of nanometres thick. This is typically the case for the desorption caused by electron, photon and ion collisions; Joule-effect heating induced by the electromagnetic field associated with the particle beams; and electron multipacting phenomena (see “Collaboration yields vacuum innovation”). To deliver the required performance, dedicated chemical and electrochemical treatments are needed – and more often than not mandatory – to re-engineer the physical and chemical properties of vacuum component/subsystem surfaces.
The biggest drivers here are the construction and operation of the Large Hadron Collider (LHC) and the High-Luminosity LHC upgrade – projects that, in turn, have driven impressive developments in CERN’s capabilities and infrastructure for surface chemistry and surface modification. The most visible example of this synergy is the new Building 107, a state-of-the-art facility that combines a diverse portfolio of chemical and electrochemical surface treatments with a bulletproof approach to risk management for personnel and the environment. Operationally, the ability to characterise, re-engineer and fine-tune surface properties has scaled dramatically over the last decade, spurred by the recruitment of a world-class team of scientists and engineers, the purchase of advanced chemical processing systems, and the consolidation of our R&D collaborations with specialist research institutes across Europe.
Chemistry in action
Within CERN’s Building 107, an imposing structure located on the corner of Rue Salam and Rue Bloch, the simplest treatment to implement – as well as the most common – is chemical surface cleaning. After machining and handling, any accelerator component will be contaminated by a layer of dirt – mainly organic products, dust and salts. Successful cleaning requires the right choice of materials and production strategy. A typical error in the design of vacuum components, for example, is the presence of surfaces that are hidden (and so difficult to clean) or holes that cannot be rinsed or dried fully. Standard cleaning methods to tackle such issues are based on detergents that, in aqueous solution, lower the surface tension and so aid the rinsing away of foreign materials like grease and dust.
Successful cleaning requires the right choice of materials and production strategy
The nature of the accelerator materials means there are also secondary effects of cleaning that must be considered at the design phase – e.g. removal of the oxide layer (pickling) for copper and etching for aluminium alloys. To improve the cleaning process, we apply mechanical agitation via circulation of cleaning fluids, oscillation of components and ultrasonic vibration. The last of these creates waves at frequencies above 25 kHz. In the expansion phase of the liquid waves, microbubbles of vapour are generated (cavitation), while in the compression phase the bubbles implode, generating pressures of around 1000 bar at the equipment surface – a pressure so high that the material can be eroded (though the higher the frequency, the smaller the gas bubbles and the less aggressive the surface interaction).
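The parenthetical point – higher drive frequency means smaller cavitation bubbles – follows from the standard Minnaert resonance formula for a gas bubble in a liquid (textbook acoustics, not a CERN-specific result). A rough estimate, assuming air bubbles in water at atmospheric pressure:

```python
import math

# Minnaert resonance: a gas bubble of radius R in a liquid resonates at
#   f = (1 / (2*pi*R)) * sqrt(3 * gamma * p / rho)
# so the resonant bubble radius shrinks as the drive frequency rises.
GAMMA = 1.4       # adiabatic index of air (standard value)
P = 101_325.0     # ambient pressure, Pa (1 atm, illustrative)
RHO = 1000.0      # density of water, kg/m^3

def resonant_radius_um(freq_hz: float) -> float:
    """Resonant bubble radius in micrometres for a given drive frequency."""
    radius_m = math.sqrt(3 * GAMMA * P / RHO) / (2 * math.pi * freq_hz)
    return radius_m * 1e6

for f_khz in (25, 40, 100):
    print(f"{f_khz:>4} kHz -> bubble radius ~ {resonant_radius_um(f_khz * 1e3):.0f} um")
```

At 25 kHz the resonant bubbles are of order 100 µm across; quadrupling the frequency shrinks them fourfold, consistent with the gentler surface interaction described above.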
Chemical fine-tuning
An alternative cleaning method is based on non-aqueous solvents that act on contamination by dilution. Right now, modified alcohols are the most commonly used solvents at CERN – a result of their low selectivity and minimal toxicity – with the cleaning operation performed in a sealed machine to minimise the environmental impacts of the volatile chemicals. While the range of organic products on which solvents are effective is usually wider than that of detergents, they cannot efficiently remove polar contaminants like salt stains. Another drawback is the risk of contaminants recollecting on the component surface when the liquid does not flow adequately.
Ultimately, the choice of detergent versus solvent relies on the experience of the operator and on guidelines linked to the type of vacuum component and the nature of the contamination. In general, the coating of components destined for ultrahigh-vacuum (UHV) applications will require a preliminary cleaning phase with detergents. Meanwhile, solvents are the optimum choice when there are no stringent cleanliness requirements – e.g. degreasing of filters for cryoplants or during the component assembly phase – and for surfaces that are prone to react with or retain water – e.g. steel laminations for magnets, ceramics and welded bellows. (It is worth noting that trapped water is released in vacuum, compromising the achievement of the required pressure, while wet surfaces are seeds for corrosion in air.)
After rinsing and drying, the components are ready either for installation in the accelerator or for further surface modification. In the latter case, the chemical treatments aim to generate a thinner, more compact oxide layer and/or a smoother surface – essential for subsequent plating processes. As such, the components can undergo etching, pickling and passivation (to reduce the chemical reactivity of the surface). Consider the copper components for the LHC’s current-lead support: before brazing (a joining process using a melted filler metal), these components are pickled in hydrochloric acid and passivated in chromic acid. Similarly, the aluminium contacts of busbars (for local high-current power distribution) must be pickled in caustic soda and/or a mixture of nitric and hydrofluoric acid before silver coating. Another instructive example is found in LHCb’s Vertex Locator (VELO), in which the aluminium RF-box window is thinned down to 150 microns by etching in caustic soda.
Safety-critical thinking is hard-wired into the operational DNA of CERN’s Building 107, underpinning the day-to-day storage, handling and large-scale use of chemical products for surface treatments. That safety-first mantra means the 5000 m2 facility is able to confine all hazards inside its walls, such that risks for the surrounding neighbourhood and environment are negligible. Among the key features of Building 107:
• There are retention basins that allow containment of the liquid from all surface-treatment tanks (plus, even in the unlikely case of a fire, there is enough retention capacity for the water pumped by the firefighting teams).
• The retention basins have leak detection sensors, pumping systems, buffer tanks and a special coating that’s able to withstand more than 100 types of chemical for several days in the event of a leak.
• Toxic and corrosive vapours are extracted continuously from the tanks and washed in dedicated scrubbers, while any escaped solvents are adsorbed on active carbon filters.
• A continuous spray of alkaline solution transfers toxic products (liquid phase) for decontamination at CERN’s wastewater treatment plant.
• In terms of fire prevention, all plastics used for the treatment tanks and extraction ducts are made of self-extinguishing polypropylene – removing the source of energy to sustain the flames.
• The safety of technicians is ensured by strict operating procedures (including regulated building access), enhanced air extraction and the storage of incompatible products in separate retention zones.
• State-of-the-art sensors provide permanent monitoring of critical airborne products and link to local and fire-brigade alarms.
Frequently, chemical or electrochemical polishing is required in addition to cleaning. Polishing removes the damaged subsurface layer generated by lamination and machining – essentially a tangle of voids, excess dislocations and impurities. In this context, it is worth highlighting the surface treatments for RF acceleration cavities. Best practice dictates that materials for such applications – essentially niobium and copper – must undergo chemical and/or electrochemical polishing to remove a surface layer 150 microns thick. As such, the final state of the material’s topmost layer is flawless and free of residual stress. (Note that while mechanical polishing can achieve lower roughness, it leaves behind underlayer defects and abrasive contamination that are incompatible with the high-voltage operation of RF cavities.) A related example is the niobium RFD crab cavity for the HL-LHC project. This complex-shaped object is treated by a dedicated machine that rotates it while chemically polishing with a mixture of nitric, hydrofluoric and phosphoric acids. In this chemical triple-whammy, the first acid oxidises the niobium; the second fluorinates and “solubilises” the oxide; and the third acts as a buffer controlling the reaction rate.
Another intriguing opportunity is the switch from wet to dry chemistry for certain niche applications
The final set of treatments involves plating the component with a functional material. In outline, this process works by immersing the accelerator component (negatively biased) into an electrolytic solution containing the functional metal ions. The electrolytic solution is strongly acidic or basic to ensure high electrical conductivity, with deposition occurring via reduction of the metallic ions on the component surface – all of which takes place in dedicated tanks where the solution is heated, agitated and monitored throughout.
At CERN, we have extensive experience in the electroplating of large components and can plate with copper, silver, nickel, gold and rhodium. Copper is by far the most common option and its thickness is frequently of the order of hundreds of microns (while gold and rhodium are rarely thicker than a few microns). Current capacity varies from 7 m-long pipes (around 10 cm diameter) to 3.5 m-long tanks (up to 0.8 m diameter). It is worth noting that these capabilities are also used to support other big-science facilities – including a recent implementation for the Drift Tube Linac tanks of the European Spallation Source (ESS) in Lund, Sweden.
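Deposit thicknesses like these are governed by Faraday’s law of electrolysis. The sketch below estimates how long a copper layer of a few hundred microns takes to grow; the copper constants are standard, but the current density and cathode efficiency are illustrative assumptions, not CERN process parameters:

```python
# Faraday's law: plated thickness d = M * j * t * eta / (n * F * rho).
# Copper constants are standard values; current density and efficiency
# below are illustrative assumptions, not CERN process parameters.
F = 96_485.0      # Faraday constant, C/mol
M_CU = 63.55e-3   # molar mass of copper, kg/mol
RHO_CU = 8.96e3   # density of copper, kg/m^3
N_CU = 2          # electrons per Cu(2+) ion reduced at the cathode

def plating_time_hours(thickness_um: float, j_a_per_m2: float,
                       efficiency: float = 0.95) -> float:
    """Hours of plating needed to deposit a given thickness at current density j."""
    rate_m_per_s = M_CU * j_a_per_m2 * efficiency / (N_CU * F * RHO_CU)
    return thickness_um * 1e-6 / rate_m_per_s / 3600

# e.g. a couple of hundred microns at an assumed ~200 A/m^2 (2 A/dm^2)
print(f"{plating_time_hours(200, 200):.1f} h for a 200 um copper deposit")
```

At these assumed conditions the deposition rate works out to roughly 25 µm per hour, so the hundreds-of-microns layers mentioned above imply plating runs lasting many hours – one reason the tanks are heated, agitated and monitored throughout.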
Chemical innovation
Notwithstanding the day-to-day provision of a range of surface treatments, the Building 107 chemistry team is also tasked with driving process innovation. As safety is our priority, the main focus is on the replacement of toxic products with eco- and personnel-friendly chemicals. A key challenge in this regard is to substitute chromic acid and cyanate baths, and ideally limit the current extensive use of hydrofluoric acid – a development track inextricably linked to the commercialisation of new products and close cooperation with our partners in industry.
Elsewhere, the chemistry team has registered impressive progress on several fronts. There’s the electroforming of tiny vacuum chambers for electron accelerators and RF cavities with seamless enclosure of flanges at the extremities. This R&D project is supported by CERN’s knowledge transfer funds and has already been proposed for the prototyping of the vacuum chamber of the Swiss Light Source II. A parallel line of enquiry includes production of self-supported graphite films for electron strippers that increase the positive charge of ions in beams – with the films fabricated either by etching the metallic support or by electrochemical delamination (a technique already proposed for the production of graphene foils).
Another intriguing opportunity is the switch from wet to dry chemistry for certain niche applications. A case in point is the use of oxygen plasmas for surface cleaning – a technique hitherto largely confined to industry but with one notable exception in accelerator science. The beryllium central beam pipes of the four main LHC experiments, for example, were cleaned by oxygen plasma before non-evaporable-getter coating, removing carbon contamination without dislodging atoms of the hazardous metal. Following on from this successful use case, we are presently studying oxygen plasmas for in situ decontamination and cleaning of radioactive components, a priority task for the chemistry team as the HL-LHC era approaches.
The future of surface chemistry at CERN looks bright – and noticeably greener. The Building 107 team, for its part, remains focused on developing chemical surface treatments that are, first and foremost, safer and, in some cases, drier.