CERN’s impact on medical technology

Hadron-therapy beam

Today, the tools of experimental particle physics are ubiquitous in hospitals and biomedical research. Particle beams damage cancer cells; high-performance computing infrastructures accelerate drug discoveries; computer simulations of how particles interact with matter are used to model the effects of radiation on biological tissues; and a diverse range of particle-physics-inspired detectors, from wire chambers to scintillating crystals to pixel detectors, all find new vocations imaging the human body.

CERN has actively pursued medical applications of its technologies as far back as the 1970s. At that time, knowledge transfer happened – mostly serendipitously – through the initiative of individual researchers. An eminent example is Georges Charpak, a detector physicist of outstanding creativity who invented the Nobel-prize-winning multiwire proportional chamber (MWPC) at CERN in 1968. The MWPC’s ability to record millions of particle tracks per second opened a new era for particle physics (CERN Courier December 1992 p1). But Charpak strove to ensure that the technology could also be used outside the field – for example in medical imaging, where its sensitivity promised to reduce radiation doses during imaging procedures – and in 1989 he founded a company that developed an imaging technology for radiography, now deployed in orthopaedic applications. Following his example, CERN has built a culture of entrepreneurship ever since.

Triangulating tumours

Since as far back as the 1950s, a stand-out application for particle-physics detector technology has been positron-emission tomography (PET) – a “functional” technique that images changes in the metabolic process rather than anatomy. The patient is injected with a compound carrying a positron-emitting isotope, which accumulates in areas of the body with high metabolic activity (the uptake of glucose, for example, could be used to identify a malignant tumour). Pairs of back-to-back 511 keV photons are detected when a positron annihilates with an electron in the surrounding matter, allowing the tumour to be triangulated.
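The 511 keV photon energy is no accident: it is the electron rest-mass energy, since an annihilation at rest converts the masses of the electron and positron into two photons of mₑc² each. A quick back-of-the-envelope check, using CODATA constants:

```python
# Electron rest-mass energy: each annihilation photon carries E = m_e * c^2.
M_E = 9.1093837e-31   # electron mass, kg (CODATA)
C = 2.99792458e8      # speed of light, m/s (exact)
EV = 1.602176634e-19  # joules per electronvolt (exact)

E_keV = M_E * C**2 / EV / 1e3
print(f"annihilation photon energy: {E_keV:.1f} keV")  # ~511.0 keV
```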

Colour X-ray of a mouse

Pioneering developments in PET instrumentation took place in the 1970s. While most scanners were based on scintillating crystals, the work done with wire chambers at the University of California at Berkeley inspired CERN physicists David Townsend and Alan Jeavons to use high-density avalanche chambers (HIDACs) – Charpak’s detector plus a photon-conversion layer. In 1977, with the participation of CERN radiobiologist Marilena Streit-Bianchi, this technology was used to create some of the first PET images, most famously of a mouse. The HIDAC detector later contributed significantly to 3D PET image reconstruction, while a prototype partial-ring tomograph developed at CERN was a forerunner for combined PET and computed tomography (CT) scanners. Townsend went on to work at the Cantonal Hospital in Geneva and then in the US, where his group helped develop the first PET/CT scanner, which combines functional and anatomic imaging.

Crystal clear

In the onion-like configuration of a collider detector, an electromagnetic calorimeter often surrounds a descendant of Charpak’s wire chambers, causing photons and electrons to cascade and measuring their energy. In 1991, to tackle the challenges posed by future detectors at the LHC, the Crystal Clear collaboration was formed to study innovative scintillating crystals suitable for electromagnetic calorimetry. Since its early years, Crystal Clear also sought to apply the technology to other fields, including healthcare. Several breast, pancreas, prostate and animal-dedicated PET scanner prototypes have since been developed, and the collaboration continues to push the limits of coincidence-time resolution for time-of-flight (TOF) PET. 

In TOF–PET, the difference between the arrival times of the two back-to-back photons is recorded, allowing the location of the annihilation along the axis connecting the detection points to be pinned down. Better time resolution therefore improves image quality and reduces the acquisition time and radiation dose to the patient. Crystal Clear continues this work to this day through the development of innovative scintillating-detector concepts, including at a state-of-the-art laboratory at CERN.
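The geometry behind this can be made concrete. If the two photons arrive with time difference Δt, the annihilation point sits c·Δt/2 from the midpoint of the line joining the two detection points, so the coincidence-time resolution translates directly into a localisation uncertainty along that line. A minimal sketch (the 200 ps figure below is an illustrative resolution, not a number from the text):

```python
# TOF-PET localisation: an arrival-time difference dt between the two
# back-to-back photons places the annihilation c*dt/2 from the midpoint
# of the line of response.
C = 2.99792458e8  # speed of light, m/s

def tof_offset_mm(dt_ps):
    """Offset along the line of response, in mm, for a time difference in ps."""
    return C * dt_ps * 1e-12 / 2 * 1e3

# An assumed 200 ps coincidence-time resolution localises the annihilation
# to about 3 cm along the line of response.
print(f"{tof_offset_mm(200):.0f} mm")
```

Halving the time resolution halves this uncertainty, which is why the collaboration keeps pushing coincidence-time resolution.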

The dual aims of the collaboration have led to cross-fertilisation, whereby work done for high-energy physics spills over to medical imaging, and vice versa. For example, the avalanche photodiodes developed for the CMS electromagnetic calorimeter were adapted for the ClearPEM breast-imaging prototype, and technology developed for detecting pancreatic and prostate cancer (EndoTOFPET-US) inspired the “barrel timing layer” of crystals that will instrument the central portion of the CMS detector at the High-Luminosity LHC.

Pixel perfect

In the same 30-year period, the family of Medipix and Timepix read-out chips has arguably made an even bigger impact on med-tech and other application fields, becoming one of CERN’s most successful technology-transfer cases. Developed with the support of four successive Medipix collaborations, involving a total of 37 research institutes, the technology is inspired by the high-resolution hybrid pixel detectors initially developed to address the challenges of particle tracking in the innermost layers of the LHC experiments. In hybrid detectors, the sensor array and the read-out chip are manufactured independently and later coupled by a bump-bonding process. This means that a variety of sensors can be connected to the Medipix and Timepix chips, according to the needs of the end user.

Visualisation of energy deposition

The first Medipix chip, produced in the 1990s by the Medipix1 collaboration, was based on the front-end architecture of the Omega3 chip used by the half-million-pixel tracker of the WA97 experiment, which studied strangeness production in lead–ion collisions. An upgraded version added a counter to each pixel, demonstrating that the chips could work like a digital camera, providing high-resolution, high-contrast, noise-hit-free images and making them uniquely suitable for medical applications. The Medipix2 collaboration improved the spatial resolution and produced a modified version, called Timepix, that offers time or amplitude measurements in addition to hit counting. Medipix3 and Timepix3 then allowed the energy of each individual photon to be measured – Medipix3 allocates incoming hits to energy bins in each pixel, providing colour X-ray images, while Timepix3 times hits with a precision of 1.6 ns and sends the full hit data – coordinate, amplitude and time – off chip. Most recently, the Medipix4 collaboration, launched in 2016, is designing chips that can seamlessly cover large areas and developing new read-out architectures, thanks to the possibility of tiling the chips on all four sides.
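The per-pixel energy binning of Medipix3 can be illustrated with a toy model – not the actual chip architecture, where thresholds and counters are implemented in on-pixel hardware – in which each pixel keeps one counter per energy bin and every hit increments the counter for the bin it falls into. Bin edges and hit energies below are invented for illustration:

```python
# Toy model of per-pixel energy binning ("colour" X-ray counting).
# Bin edges and hit energies are invented; in the real chip this logic
# lives in hardware counters inside each pixel.
import bisect

BIN_EDGES_KEV = [10.0, 25.0, 40.0]  # 4 bins: <10, 10-25, 25-40, >=40 keV

def make_pixel():
    return [0] * (len(BIN_EDGES_KEV) + 1)

def record_hit(counters, energy_kev):
    counters[bisect.bisect_right(BIN_EDGES_KEV, energy_kev)] += 1

pixel = make_pixel()
for e in [5.2, 18.0, 33.3, 60.1, 12.4]:  # photon energies, keV
    record_hit(pixel, e)
print(pixel)  # counts per energy bin: [1, 2, 1, 1]
```

Reading out one spectrum per pixel, rather than one total count, is what turns a black-and-white X-ray image into a “colour” one.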

Medipix and Timepix chips find applications in widely varied fields, from medical imaging to cultural heritage, space dosimetry, materials analysis and education. The industrial partners and licence holders commercialising the technology range from established enterprises to start-up companies. In the medical field, the technology has been applied to prototype X-ray CT systems, digital mammography, and beta- and gamma-autoradiography of biological samples. In 2018 the first 3D colour X-ray images of human extremities were taken by a scanner developed by MARS Bioimaging Ltd using Medipix3 technology. By analysing the spectrum recorded in each pixel, the scanner can distinguish multiple materials in a single scan, opening up a new dimension in medical X-ray imaging: with this chip, images are no longer black and white, but in colour (see “Colour X-ray” image).

Although the primary aim of the Timepix3 chip was applications outside of particle physics, its development also led directly to new solutions in high-energy physics, such as the VELOpix chip for the ongoing LHCb upgrade, which permits data-driven trigger-free operation for the first time in a pixel vertex detector in a high-rate experiment. 

Dosimetry

CERN teams are also exploring the potential uses of Medipix technology in dosimetry. In 2019, for example, Timepix3 was employed to determine the exposure of medical personnel to ionising radiation in an interventional radiology theatre at Christchurch Hospital in New Zealand. The chip mapped the radiation fluence and energy spectrum of the scattered photon field that reaches the practitioners, and can also provide information about which parts of the body are most exposed to radiation.

Meanwhile, “GEMPix” detectors are being evaluated for use in quality assurance in hadron therapy. GEMPix couples gas electron multipliers (GEMs) – a type of gaseous ionisation detector developed at CERN – with the Medipix integrated circuit as readout to provide a hybrid device capable of detecting all types of radiation with a high spatial resolution. Following initial results from tests on a carbon-ion beam performed at the National Centre for Oncological Hadrontherapy (CNAO) in Pavia, Italy, a large-area GEMPix detector with an innovative optical read-out is now being developed at CERN in collaboration with the Holst Centre in the Netherlands. A version of the GEMPix called GEMTEQ is also currently under development at CERN for use in “microdosimetry”, which studies the temporal and spatial distributions of absorbed energy in biological matter to improve the safety and effectiveness of cancer treatments.

Knowledge transfer at CERN

GEMPix detectors

As a publicly funded laboratory, CERN has a remit, in addition to its core mission to perform fundamental research in particle physics, to expand the opportunities for its technology and expertise to deliver tangible benefits to society. The CERN Knowledge Transfer group strives to maximise the impact of CERN technologies and know-how on society in many ways, including through the establishment of partnerships with clinical, industrial and academic actors, support to budding entrepreneurs and seed funding to CERN personnel.

Supporting the knowledge-transfer process from particle physics to medical research and the med-tech industry is a promising avenue to boost healthcare innovation and provide solutions to present and future health challenges. CERN has provided a framework for the application of its technologies to the medical domain through a dedicated strategy document approved by its Council in June 2017. CERN will continue its efforts to maximise the impact of the laboratory’s know-how and technologies on the medical sector.

Two further dosimetry applications illustrate how technologies developed for CERN’s needs have expanded into commercial medical applications. The B-RAD, a hand-held radiation survey meter designed to operate in strong magnetic fields, was developed by CERN in collaboration with the Polytechnic of Milan and is now available off-the-shelf from an Italian company. Originally conceived for radiation surveys around the LHC experiments and inside ATLAS with the magnetic field on, it has found applications in several other tasks, such as radiation measurements on permanent magnets and radiation surveys at PET-MRI scanners and MRI-guided radiation-therapy linacs. Meanwhile, the radon dose monitor (RaDoM) tackles exposure to radon, a natural radioactive gas that is the second leading cause of lung cancer after smoking. The RaDoM device directly estimates the dose by reproducing the energy deposition inside the lung instead of deriving the dose from a measurement of radon concentration in air; CERN also developed a cloud-based service to collect and analyse the data, to control the measurements and to drive mitigation measures based on real-time data. The technology is licensed to the CERN spin-off BAQ.

Cancer treatments

Having surveyed the medical applications of particle detectors, we turn to the technology driving the beams themselves. Radiotherapy is a mainstay of cancer treatment, using ionising radiation to damage the DNA of cancer cells. In most cases, a particle accelerator is used to generate a therapeutic beam. Conventional radiation therapy uses X-rays generated by a linac, and is widely available at relatively low cost.


Radiotherapy with protons was first proposed by Robert Wilson, later Fermilab’s founding director, in 1946 while he was at Berkeley, and interest in the use of heavier ions such as carbon arose soon after. While X-rays lose energy roughly exponentially as they penetrate tissue, protons and other ions deposit almost all of their energy in a sharp “Bragg” peak at the very end of their path, enabling the dose to be concentrated on the tumour target while sparing the surrounding healthy tissue. Carbon ions have the additional advantage of a higher radiobiological effectiveness, and can control tumours that are radio-resistant to X-rays and protons. Widespread adoption of hadron therapy is, however, limited by the cost and complexity of the required infrastructure, and by the need for more pre-clinical and clinical studies.
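To get a feel for where the Bragg peak lands, the proton range in water is often approximated by the empirical Bragg–Kleeman rule R ≈ αEᵖ. The coefficients below (α ≈ 0.0022 cm, p ≈ 1.77) are rough literature values quoted here as assumptions, so treat the output as order-of-magnitude estimates only:

```python
# Bragg-Kleeman rule R = a * E^p: approximate range of protons in water.
# a and p are rough literature values; outputs are estimates, not
# clinical data.
def proton_range_cm(energy_mev, a=0.0022, p=1.77):
    return a * energy_mev ** p

for e_mev in (70, 150, 230):
    print(f"{e_mev} MeV -> ~{proton_range_cm(e_mev):.1f} cm in water")
```

The steep energy dependence is why clinical proton beams span roughly 70–230 MeV: that range of energies covers tumour depths from a few centimetres to beyond 30 cm.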

PIMMS and NIMMS

Between 1996 and 2000, at the instigation of Ugo Amaldi, Meinhard Regler and Phil Bryant, CERN hosted the Proton-Ion Medical Machine Study (PIMMS). PIMMS produced and made publicly available an optimised design for a cancer-therapy synchrotron capable of using both protons and carbon ions. After further enhancement by Amaldi’s TERA foundation, and with seminal contributions from Italian research organisation INFN, the PIMMS concept evolved into the accelerator at the heart of the CNAO hadron therapy centre in Pavia. The MedAustron centre in Wiener Neustadt, Austria, was then based on the CNAO design. CERN continues to collaborate with CNAO and MedAustron by sharing its expertise in accelerator and magnet technologies.

In the 2010s, CERN teams put to use the experience gained in the construction of Linac 4, which became the source of proton beams for the LHC in 2020, and developed an extremely compact high-frequency radio-frequency quadrupole (RFQ) to be used as an injector for a new generation of compact, high-frequency linear accelerators for proton therapy. The RFQ accelerates the proton beam to 5 MeV after only 2 m, and operates at 750 MHz – almost double the frequency of conventional RFQs. A major advantage of using linacs for proton therapy is the possibility of changing the energy of the beam, and hence the depth of treatment in the body, from pulse to pulse by switching off some of the accelerating units. The RFQ technology was licensed to the CERN spin-off ADAM, now part of AVO (Advanced Oncotherapy), and is being used as an injector for a breakthrough linear proton therapy machine at the company’s UK assembly and testing centre at STFC’s Daresbury Laboratory.
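Simple relativistic kinematics shows why the combination of 5 MeV and 750 MHz makes for such a compact injector: accelerating-cell lengths in a proton linac scale with βλ, so a modest beam velocity together with a short RF wavelength means very short cells. A rough estimate using textbook constants:

```python
# Proton speed at 5 MeV kinetic energy and the RF wavelength at 750 MHz.
# Cell lengths in a linac scale like beta * lambda, so a higher RF
# frequency directly shrinks the structure.
import math

M_P_MEV = 938.272   # proton rest energy, MeV
C = 2.99792458e8    # speed of light, m/s

def beta(kinetic_mev):
    gamma = 1.0 + kinetic_mev / M_P_MEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

b = beta(5.0)
wavelength_m = C / 750e6
print(f"beta = {b:.3f}, lambda = {wavelength_m*100:.0f} cm, "
      f"beta*lambda = {b * wavelength_m * 100:.1f} cm")
```

At 5 MeV a proton moves at only about a tenth of the speed of light, so doubling the frequency halves the ~40 cm RF wavelength and, with it, the scale of each accelerating cell.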

Simulation of a dendritic arbour

In 2019 CERN launched the Next Ion Medical Machine Study (NIMMS) to develop cutting-edge accelerator technologies for a new generation of compact and cost-effective ion-therapy facilities. The goal is to promote the uptake of ion therapy: proton installations are already commercially available, whereas only four ion centres exist in Europe, all based on bespoke solutions.

NIMMS is organised along four different lines of activities. The first aims to reduce the footprint of facilities by developing new superconducting magnet designs with large apertures and curvatures, and for pulsed operation. The second is the design of a compact linear accelerator optimised for installation in hospitals, which includes an RFQ based on the design of the proton therapy RFQ, and a novel source for fully-stripped carbon ions. The third concerns two innovative gantry designs, with the aim of reducing the size, weight and complexity of the massive magnetic structures that allow the beam to reach the patient from different angles: the SIGRUM lightweight rotational gantry originally proposed by TERA, and the GaToroid gantry invented at CERN, which eliminates the need to mechanically rotate the structure by using a toroidal magnet (see figure “GaToroid”). Finally, new high-current synchrotron designs will be developed to reduce the cost and footprint of facilities while cutting the treatment time compared to present European ion-therapy centres: these will include a superconducting and a room-temperature option, and advanced features such as multi-turn injection for 10¹⁰ particles per pulse, fast and slow extraction, and multiple ion operation. Through NIMMS, CERN is contributing to the efforts of a flourishing European community, and a number of collaborations have already been established.

Another recent example of frontier radiotherapy techniques is the collaboration with Switzerland’s Lausanne University Hospital (CHUV) to build a new cancer therapy facility that would deliver high doses of radiation from very-high-energy electrons (VHEE) in milliseconds instead of minutes. The goal here is to exploit the so-called FLASH effect, wherein radiation doses administered over short time periods appear to damage tumours more than healthy tissue, potentially minimising harmful side-effects. This pioneering installation will be based on the high-gradient accelerator technology developed for the proposed CLIC electron–positron collider. Various research teams have been performing biomedical research related to VHEE and FLASH at the CERN Linear Electron Accelerator for Research (CLEAR), one of the few facilities available for characterising VHEE beams.

Radioisotopes

CERN’s accelerator technology is also deployed in a completely different way to produce innovative radioisotopes for medical research. In nuclear medicine, radioisotopes are used both for internal radiotherapy and for diagnosis of cancer and other diseases, and progress has always been connected to the availability of novel radioisotopes. Here, CERN has capitalised on the experience of its ISOLDE facility, which during the past 30 years has used the proton beam from the CERN PS Booster to produce 1300 different isotopes from 73 chemical elements for research ranging from nuclear physics to the life sciences. A new facility, called ISOLDE-MEDICIS, is entirely dedicated to the production of unconventional radioisotopes with the right properties to enhance the precision of both patient imaging and treatment. In operation since late 2017, MEDICIS will expand the range of radioisotopes available for medical research – some of which can be produced only at CERN – and send them to partner hospitals and research centres for further studies. During its 2019 and 2020 harvesting campaigns, for example, MEDICIS demonstrated the capability of purifying isotopes such as ¹⁶⁹Er or ¹⁵³Sm to new purity grades, making them suitable for innovative treatments such as targeted radioimmunotherapy.

Data handling and simulations

The expertise of particle physicists in data handling and simulation tools is also increasingly finding applications in the biomedical field. The FLUKA and Geant4 simulation toolkits, for example, are being used in several applications, from detector modelling to treatment planning. Recently, CERN contributed its know-how in large-scale computing to the BioDynaMo collaboration, initiated by CERN openlab together with Newcastle University, which initially aimed to provide a standardised, high-performance and open-source platform to support complex biological simulations (see figure “Computational neuroscience”). By hiding its computational complexity, BioDynaMo allows researchers to easily create, run and visualise 3D agent-based simulations. It is already used by academia and industry to simulate cancer growth, accelerate drug discoveries and simulate how the SARS-CoV-2 virus spreads through the population, among other applications, and is now being extended beyond biological simulations to visualise the collective behaviour of groups in society.


Many more projects related to medical applications are in their initial phases. The breadth of knowledge and skills available at CERN was also evident during the COVID-19 pandemic when the laboratory contributed to the efforts of the particle-physics community in fields ranging from innovative ventilators to masks and shields, from data management tools to open-data repositories, and from a platform to model the concentration of viruses in enclosed spaces to epidemiologic studies and proximity-sensing devices, such as those developed by Terabee.

Fundamental research has a priceless goal: knowledge for the sake of knowledge. The theories of relativity and quantum mechanics were considered abstract and esoteric when they were developed; a century later, we owe to them the remarkable precision of GPS systems and the transistors that are the foundation of the electronics-based world we live in. Particle-physics research acts as a trailblazer for disruptive technologies in the fields of accelerators, detectors and computing. Even though their impact is often difficult to track as it is indirect and diffused over time, these technologies have already greatly contributed to the advances of modern medicine and will continue to do so.

Charmed matter–antimatter flips clocked by LHCb

Bin Flip Method plot

The ability of certain neutral mesons to oscillate between their matter and antimatter states at distinctly unworldly rates is a spectacular feature of quantum mechanics. The phenomenon arises when the states are orthogonal combinations of narrowly split mass eigenstates that gain a relative phase as the wavefunction evolves, allowing quarks and antiquarks to be interchanged at a rate that depends on the mass difference. Forbidden at tree level, proceeding instead via loops, such flavour-changing neutral-current processes offer a powerful test of the Standard Model and a sensitive probe of physics beyond it.


The phenomenon was predicted by Gell-Mann and Pais in the 1950s, and only four known meson systems (those containing quarks from different generations) can oscillate. K⁰–K̄⁰ oscillations were observed in 1955, B⁰–B̄⁰ oscillations in 1986 at the ARGUS experiment at DESY, and B⁰ₛ–B̄⁰ₛ oscillations in 2006 by the CDF experiment at Fermilab. Following the first evidence of charmed-meson (D⁰–D̄⁰) oscillations at Belle and BaBar in 2007, LHCb made the first single-experiment observation of the process in 2012. Because the oscillation is relatively slow – its period is more than 100 times the average lifetime of a D⁰ meson – the full oscillation period cannot be observed. Instead, the collaboration looked for small changes in the flavour mixture of the D⁰ mesons as a function of the time at which they decay via the Kπ final state.

On 4 June, during the 10th International Workshop on CHARM Physics, the LHCb collaboration reported the first observation of the mass difference between the D⁰–D̄⁰ mass eigenstates, precisely determining the frequency of the oscillations. The value represents one of the smallest mass differences ever measured between two particles: 6.4 × 10⁻⁶ eV, corresponding to an oscillation rate of around 1.5 × 10⁹ per second. Until now, the measured value of the mass difference between the underlying eigenstates was marginally compatible with zero. By establishing a non-zero value with high significance, the LHCb team was able to show that the data are consistent with the Standard Model, while significantly improving limits on mixing-induced CP violation in the charm sector.
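The mass difference and the oscillation rate quoted above are the same number in different units: a splitting Δm corresponds to a frequency ν = Δm/h. The conversion can be checked directly (Planck constant from CODATA):

```python
# A mass splitting dm corresponds to an oscillation frequency nu = dm / h.
H_EV_S = 4.135667696e-15  # Planck constant, eV*s (CODATA)

dm_ev = 6.4e-6            # quoted D0 mass difference, eV
nu = dm_ev / H_EV_S
print(f"oscillation rate: {nu:.2e} per second")  # ~1.5e9, as quoted
```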

“In the future we hope to discover time-dependent CP violation in the charm system, and the precision and luminosity expected from LHCb upgrades I and II may make this possible,” explains Nathan Jurik, a CERN fellow who worked on the analysis.

The latest measurements of neutral charm–meson oscillations follow hot on the heels of an updated LHCb measurement of the B⁰ₛ–B̄⁰ₛ oscillation frequency announced in April, based on the mass difference between the heavy and light strange-beauty-meson eigenstates. The very high precision of this measurement provides one of the strongest constraints on physics beyond the Standard Model. Using a large sample of B⁰ₛ → D⁻ₛπ⁺ decays, the new measurement improves upon the previous precision of the oscillation frequency by a factor of two: Δms = 17.7683 ± 0.0051 (stat) ± 0.0032 (sys) ps⁻¹, which, when combined with previous LHCb measurements, gives a value of 17.7656 ± 0.0057 ps⁻¹. This corresponds to an oscillation rate of around 3 × 10¹² per second, the highest of all four meson systems.
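Since Δms is quoted in ps⁻¹ it is an angular frequency in natural units, so the oscillation rate is Δms/2π and the corresponding mass splitting is ħΔms. Both numbers can be recovered from the measured value (ħ in eV·s from CODATA):

```python
# dms in ps^-1 is an angular frequency: rate = dms / (2*pi), and the
# heavy-light mass splitting in energy units is hbar * dms.
import math

HBAR_EV_S = 6.582119569e-16  # reduced Planck constant, eV*s (CODATA)

dms = 17.7656e12             # combined LHCb value, converted from ps^-1 to s^-1
rate = dms / (2 * math.pi)   # ~3e12 oscillations per second, as quoted
dm_ev = HBAR_EV_S * dms      # ~1.2e-2 eV mass splitting
print(f"rate = {rate:.2e} /s, mass splitting = {dm_ev:.2e} eV")
```

Comparing the two systems makes the contrast vivid: the strange-beauty splitting of about 10⁻² eV is some nine orders of magnitude larger than the charm splitting of 6.4 × 10⁻⁶ eV times a thousand, which is why Bs mesons oscillate trillions of times per second while D⁰ mesons barely complete a fraction of a cycle before decaying.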

Future Circular Collider: what, why and how?



FCC Study

Three expert panellists will introduce the motivation for and status of the proposed Future Circular Collider at CERN, followed by a discussion and live questions from the audience, moderated by CERN Courier editor Matthew Chalmers.

» Accelerator physicist and FCC study leader Michael Benedikt (CERN/Vienna University of Technology) will report on the status and scope of the FCC Innovation Study, a European Union-funded project to assess the technical and financial feasibility of a 100 km electron–positron and proton–proton collider in the Geneva region.

» Experimental particle physicist Beate Heinemann (DESY/Albert-Ludwigs-Universität Freiburg) will explain how the Higgs boson opens a new window on fundamental physics, and why a post-LHC collider is essential to explore this and other hot topics such as flavour physics.

» Theoretical physicist Matthew McCullough (CERN) will explore the potential of a future circular collider to address the dark sector of the universe, and explain the importance of striving for the highest energies possible.


Michael Benedikt (left) completed his PhD on medical accelerators as a member of the CERN Proton-Ion Medical Machine Study group. He joined CERN’s accelerator operation group in 1997, where he headed different sections before becoming deputy group leader from 2006 to 2013. From 2008 to 2013, he was project leader for the accelerator complex for the MedAustron hadron-therapy centre in Austria, and since 2013 he has led the Future Circular Collider Study at CERN.

Beate Heinemann (middle) completed her PhD at the University of Hamburg in 1999 in experimental particle physics at the HERA collider in Hamburg. She became a lecturer at the University of Liverpool in 2003, a professor at UC Berkeley in 2006 and a scientist at Lawrence Berkeley National Laboratory. She was deputy spokesperson of the ATLAS collaboration from 2013 to 2017, and since 2016 she has been a leading scientist at DESY and W3 professor at Albert-Ludwigs-Universität Freiburg.

Matthew McCullough (right) is a senior staff member in the CERN Theory Department. He completed his undergraduate and PhD degrees at the University of Oxford, followed by postdocs at MIT and CERN. His research interests cover physics beyond the Standard Model, from the origins of the Higgs boson to the nature of dark matter.


ALICE tracks new territory

ALICE ITS Inner Barrel installation

In the coming decade, the study of nucleus–nucleus, proton–nucleus and proton–proton collisions at the LHC will offer rich opportunities for a deeper exploration of the quark–gluon plasma (QGP). An expected 10-fold increase in the number of lead–lead (Pb–Pb) collisions should both increase the precision of measurements of known probes of the QGP medium and give access to new ones. Focusing on rare probes down to very low transverse momentum – such as heavy-flavour particles, quarkonium states, and real and virtual photons – as well as on the study of jet quenching and exotic heavy nuclear states will require very large data samples.

To seize these opportunities, the ALICE collaboration has undertaken a major upgrade of its detectors to increase the event readout, online data processing and recording capabilities by nearly two orders of magnitude (CERN Courier January/February 2019 p25). This will allow Pb–Pb minimum-bias events to be recorded at rates in excess of 50 kHz, which is the expected Pb–Pb interaction rate at the LHC in Run 3, as well as proton–lead (p–Pb) and proton–proton (pp) collisions at rates of about 500 kHz and 1 MHz, respectively. In addition, the upgrade will improve the ability of the ALICE detector to distinguish secondary vertices of particle decays from the interaction vertex and to track very low transverse-momentum particles, allowing measurements of heavy-flavour hadrons and low-mass dileptons with unprecedented precision and down to zero transverse momentum.

High impact

These ambitious physics goals have motivated the development of an entirely new inner tracking system, ITS2. Starting from LHC Run 3 next year, the ITS2 will allow pp and Pb–Pb collisions to be read out 100 and 1000 times more quickly than was possible in previous runs, offering superior ability to measure particles at low transverse momenta (see “High impact” figure). Moreover, the inner three layers of the ITS2 feature a material budget three times lower than the original detector, which is also important for improving the tracking performance at low transverse momentum.

With its 10 m² of active silicon area and nearly 13 billion pixels, the ITS2 is the largest pixel detector ever built. It is also the first detector at the LHC to use monolithic active pixel sensors (MAPS), instead of the more conventional and well-established hybrid pixels and silicon microstrips.

Change of scale

The particle sensors and the associated read-out electronics used for vertexing and tracking detection systems in particle-physics experiments have very demanding requirements in terms of granularity, material thickness, readout speed and radiation hardness. The development of sensors based on silicon-semiconductor technology and read-out integrated circuits based on CMOS technology revolutionised the implementation of such detection systems. The development of silicon microstrips, already successfully used at the Large Electron-Positron (LEP) collider, and, later, the development of hybrid pixel detectors, enabled the construction of tracking and vertexing detectors that meet the extreme requirements – in terms of particle rates and radiation hardness – set by the LHC. As a result, silicon microstrip and pixel sensors are at the heart of the particle-tracking systems in most particle-physics experiments today.

Nevertheless, compromises exist in the implementation of this technology. Perhaps the most significant is the interface between the sensor and the readout electronics, which are typically separate components. To go beyond these limitations and construct detection systems with higher granularity and less material thickness requires the development of new technology. The optimal way to achieve this is to integrate both sensor and readout electronics to create a single detection device. This is the approach taken with CMOS active pixel sensors (APSs). Over the past 20 years, extensive R&D has been carried out on CMOS APSs, making this a viable option for vertexing and tracking detection systems in particle and nuclear physics, although their performance in terms of radiation hardness is not yet at the level of hybrid pixel detectors.


The first large-scale application of CMOS APS technology in a collider experiment was the STAR PXL detector at Brookhaven’s Relativistic Heavy-Ion Collider in 2014 (CERN Courier October 2015 p6). The ALICE ITS2 has benefitted from significant R&D since then, in particular concerning the development of a more advanced CMOS imaging sensor, named ALPIDE, with a minimum feature size of 180 nm. This has led to a significant improvement in the field of MAPS for single-particle detection, reaching unprecedented performance in terms of signal/noise ratio, spatial resolution, material budget and readout speed.

ALPIDE sensors

ALPIDE, which is the result of an intensive R&D effort carried out by ALICE over the past eight years, is the building block of the ALICE ITS2. The chip is 15 × 30 mm² in area and contains more than half a million pixels organised in 1024 columns and 512 rows. Its very low power consumption (< 40 mW/cm²) and excellent spatial resolution (~5 μm) are perfect for the inner tracker of ALICE.
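As a quick back-of-the-envelope check (an illustrative sketch, not ALICE software; the variable names are hypothetical), the quoted chip area and matrix size imply a pixel pitch of roughly 29 μm in both directions:

```python
# Illustrative ALPIDE geometry arithmetic based on the figures quoted above.
chip_w_mm, chip_h_mm = 30.0, 15.0   # chip area: 15 x 30 mm^2
cols, rows = 1024, 512              # pixel matrix

pitch_x_um = chip_w_mm * 1000 / cols   # pitch along the 1024-column direction
pitch_y_um = chip_h_mm * 1000 / rows   # pitch along the 512-row direction
n_pixels = cols * rows                 # "more than half a million pixels"

print(f"pitch ~{pitch_x_um:.1f} x {pitch_y_um:.1f} um, {n_pixels} pixels")
# -> pitch ~29.3 x 29.3 um, 524288 pixels
```

The 1024 × 512 matrix gives 524,288 pixels, matching the “more than half a million” quoted in the text.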

ALPIDE journeys

In ALPIDE the sensitive volume is a 25 μm-thick layer of high-resistivity p-type silicon (> 1 kΩ cm) grown epitaxially on top of a standard (low-resistivity) CMOS wafer (see “ALPIDE journeys” figure). The electric charge generated by particles traversing the sensitive volume is collected by an array of n–p diodes reverse-biased with a positive potential (~1 V) applied on the n-well electrode and a negative potential (down to a minimum of –6 V) applied to the substrate (backside). The possibility of varying the reverse-bias voltage in the range 1 to 7 V allows control over the size of the depleted volume (the fraction of the sensitive volume where the charge is collected by drift due to the presence of an electric field) and, correspondingly, the charge-collection time. Measurements carried out on sensors with characteristics identical to ALPIDE have shown an average charge-collection time consistently below 15 ns for a typical reverse-bias voltage of 4 V. Applying reverse substrate bias to the ALPIDE sensor also increases the tolerance to non-ionising energy loss to well beyond 10¹³ 1 MeV neq/cm², which is largely sufficient to meet ALICE’s requirements.

Another important feature of ALPIDE is the use of a p-well to shield the full CMOS circuitry from the epitaxial layer. Only the n-well collection electrode is not shielded. The deep p-well prevents all other n-wells – which contain circuitry – from collecting signal charge from the epitaxial layer, and therefore allows the use of full CMOS and consequently more complex readout circuitry in the pixel. ALICE is the first experiment where this has been used to implement a MAPS with a pixel front-end (amplifier and discriminator) and a sparsified readout within the pixel matrix similar to hybrid sensors. The low capacitance of the small collection electrode (about 2 × 2 μm²), combined with a circuit that performs sparsified readout within the matrix without a free-running clock, keeps the power consumption as low as 40 nW per pixel.

Cylindrical structure

ITS2 structure

The ITS2 consists of seven layers covering a radial extension from 22 to 430 mm with respect to the beamline (see “Cylindrical structure” figure). The innermost three layers form the inner barrel (IB), while the middle two and the outermost two layers form the outer barrel (OB). The radial position of each layer was optimised to achieve the best combined performance in terms of pointing resolution, momentum resolution and tracking efficiency in the expected high track-density environment of a Pb–Pb collision. It covers a pseudo-rapidity range |η| < 1.22 for 90% of the most luminous beam interaction region, extending over a total surface of 10 m² and containing about 12.5 Gpixels with binary readout, and is operated at room temperature using water cooling.
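These headline numbers are mutually consistent, as a rough calculation shows (an illustrative sketch with hypothetical variable names, not ALICE figures): dividing the total silicon surface by the chip area gives the approximate chip count, and multiplying by the pixels per chip recovers the quoted pixel count to within the rounding of the 10 m² figure.

```python
# Rough consistency check of the ITS2 figures quoted above (illustrative only).
total_area_cm2 = 10 * 1e4        # total surface: 10 m^2
chip_area_cm2 = 4.5              # one ALPIDE chip: 15 x 30 mm^2
pixels_per_chip = 1024 * 512     # ALPIDE pixel matrix

n_chips = total_area_cm2 / chip_area_cm2       # ~22,000 chips
gpixels = n_chips * pixels_per_chip / 1e9      # ~11.7 Gpixels vs ~12.5 quoted
print(f"~{n_chips:.0f} chips, ~{gpixels:.1f} Gpixels")
```

The small gap between ~11.7 and the quoted ~12.5 Gpixels is consistent with the surface figure being rounded down to 10 m².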

ALICE ITS

Given the small size of the ALPIDE (4.5 cm²), sensors are tiled up to form the basic detector unit, which is called a stave. It consists of a “space-frame” (a carbon-fibre mechanical support), a “cold plate” (a carbon ply embedding two cooling pipes) and a hybrid integrated circuit (HIC) assembly in which the ALPIDE chips are glued and electrically connected to a flexible printed circuit. An IB HIC and an OB HIC include one row of nine chips and two rows of seven chips, respectively. The HICs are glued to the mechanical support: one HIC for the IB and 8 or 14 HICs for the two innermost and two outermost layers of the OB, respectively (see “State of the art” figure).

Zero-suppressed hit data are transmitted from the staves to a system of about 200 readout boards located 7 m away from the detector. Data are transmitted serially with a bit-rate up to 1.2 Gb/s over more than 3800 twin-axial cables reaching an aggregate bandwidth of about 2 Tb/s. The readout boards aggregate data and re-transmit it over 768 optical-fibre links to the first-level processors of the combined online/offline (O2) computing farm. The data are then sequenced in frames, each containing the hit information of the collisions occurring in contiguous time intervals of constant duration, typically 22 μs.
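For orientation, the quoted link count comfortably brackets the stated ~2 Tb/s aggregate (an illustrative calculation; the assumption that every link could run at its maximum 1.2 Gb/s simultaneously is ours, not the collaboration's):

```python
# Illustrative readout-bandwidth arithmetic from the numbers quoted above.
n_links = 3800            # twin-axial cables from the staves
link_gbps = 1.2           # maximum serial bit-rate per link

peak_tbps = n_links * link_gbps / 1000   # ~4.6 Tb/s if all links ran flat out,
                                         # comfortably above the ~2 Tb/s aggregate
frame_us = 22                            # typical time-frame duration
frames_per_s = 1e6 / frame_us            # ~45,000 time frames per second
print(f"peak capacity ~{peak_tbps:.2f} Tb/s, ~{frames_per_s:.0f} frames/s")
```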

The process and procedures to build the HICs and staves are rather complex and time-intensive. More than 10 construction sites distributed worldwide worked together to develop the assembly procedure and to build the components. More than 120 IB and 2500 OB HICs were built using a custom-made automatic module-assembly machine, implementing electrical testing, dimension measurement, integrity inspection and alignment for assembly. A total of 96 IB staves, enough to build two copies of the three IB layers, and a total of 160 OB staves, including 20% spares, have been assembled.

A large cleanroom was built at CERN for the full detector assembly and commissioning activities. Here the same backend system that will be used in the experiment was installed, including the powering system, cooling system, full readout and trigger chains. Staves were installed on the mechanical support structures to form layers, and the layers are assembled in half-barrels, IB (layers 0, 1 and 2) top and bottom and OB (layers 3, 4, 5 and 6) top and bottom. Each stave is then connected to power-supply and readout systems. The commissioning campaign started in May 2019 to fully characterise and calibrate all the detector components, and installations of both the OB and IB were completed in May this year.

Physics ahead

After nearly 10 years of R&D, the upgrade of the ALICE experimental apparatus – which includes an upgraded time projection chamber, a new muon forward tracker, a new fast-interaction trigger detector, a new forward diffraction detector, new readout electronics and an integrated online–offline computing system – is close to completion. Most of the new or upgraded detectors, including the ITS2, have already been installed in the experimental area and the global commissioning of the whole apparatus will be completed this year, well before the start of Run 3, which is scheduled for the spring of 2022.

The significant enhancements to the performance of the ALICE detector will enable the exploration of new phenomena

The significant enhancements to the performance of the ALICE detector will enable detailed, quantitative characterisation of the high-density, high-temperature phase of strongly interacting matter, together with the exploration of new phenomena. The ITS2 is at the core of this programme. With improved pointing resolution and tracking efficiency at low transverse momentum, it will enable the determination of the total production cross-section of the charm quark. This is fundamental for understanding the interplay between the production of charm quarks in the initial hard scattering, their energy loss in the QGP and possible in-medium thermal production. Moreover, the ITS2 will also make it possible to measure a larger number of different charmed and beauty hadrons, including baryons, opening the possibility for determining the heavy-flavour transport coefficients. A third area where the new ITS will have a major impact is the measurement of electron–positron pairs emitted as thermal radiation during all stages of the heavy-ion collision, which offer an insight into the bulk properties and space–time evolution of the QGP.

More in store

The full potential of the ALPIDE chip underpinning the ITS2 is yet to be fully exploited. For example, a variant of ALPIDE explored by ALICE, based on an additional low-dose deep n-type implant that creates a planar junction in the epitaxial layer below the wells containing the CMOS circuitry, achieves much faster charge collection and significantly improved radiation hardness, paving the way for considerably more radiation-tolerant sensors.

Into the future

Further improvements to MAPS for high-energy physics detectors could come by exploiting the rapid progress in imaging for consumer applications. One of the features offered recently by CMOS imaging sensor technologies, called stitching, will enable a new generation of MAPS with an area up to the full wafer size. Moreover, the reduction in the sensor thickness to about 30–40 μm opens the door to large-area curved sensors, making it possible to build a cylindrical layer of silicon-only sensors with a further significant reduction in the material thickness. The ALICE collaboration is already preparing a new detector based on these concepts, which consists of three cylindrical layers based on curved wafer-scale stitched sensors (see “Into the future” figure). This new vertex detector will be installed during Long Shutdown 3 towards the middle of the decade, replacing the three innermost layers of the ITS2. With the first detection layer closer to the interaction point (from 23 to 18 mm) and a reduction in the material budget close to the interaction point by a factor of six, the new vertex detector will further improve the tracking precision and efficiency at low transverse momentum.

The technologies developed by ALICE for the ITS2 detector are now being used or considered for several other applications in high-energy physics, including the vertex detector of the sPHENIX experiment at RHIC, and the inner tracking system for the NICA MPD experiment at JINR. The technology is also being applied to areas outside of the field, including in medical and space applications. The Bergen pCT collaboration and INFN Padova’s iMPACT project, for example, are developing novel ALPIDE-based devices for clinical particle therapy to reconstruct 3D human body images. The HEPD02 detector for the Chinese–Italian CSES-02 mission, meanwhile, includes a charged-particle tracker made of three layers of ALPIDE sensors that represents a pioneering test for next-generation space missions. Though driven by the desire to learn more about the fundamental laws of nature, advanced silicon-tracker technology clearly continues to make an impact on wider society, too.

From CERN to the environment

Daphne Technology

CERN technologies and personnel make it a hub for so much more than exploring the fundamental laws of the universe. In an event organised by the CERN Alumni Relations team on 30 April, five CERN alumni who now work in the environmental industry discussed how their high-energy physics training helped them to get to where they are today.

One panellist, Zofia Rudjor, used to work on the ATLAS trigger system and the measurement of Higgs-boson decays to tau leptons. Having spent 10 years at CERN, and with the discovery of the Higgs still fresh in the memory, she now works as a data scientist for the Norwegian Institute for Water Research (NIVA). “For my current role, a lot of the skills that I acquired at CERN, from solving complex problems to working with real-time data streams, turned out to be very key and useful,” she said at the virtual April event. Similar sentiments were shared by fellow panellist Manel Sanmarti, a former cryogenic engineer who is now the co-founder of Bamboo Energy Platform: “CERN is kind of the backbone of my career – it’s really excellent. I would say it’s the ‘Champions League’ of technology!”

However, much learning and preparation is also required to transition from particle physics to the environment. Charlie Cook began his career as an engineer at CERN and is now the founder of Rightcharge, a company which helps electric car drivers reduce the cost of charging and to use cleaner energy sources. Before taking the plunge into the environmental industry, he first completed a course at Imperial College Business School on climate-change management and finance, which helped him “learn the lingo” in the finance world. A stint at Octopus Electric Vehicles was followed by driving a domestic vehicle-to-grid demonstration project called Powerloop which launched at the beginning of 2018. “Sometimes it’s too easy to start talking in abstract terms about sustainability, but, to really understand things I like to see the numbers behind everything,” he said.

Everything that is happening in the environmental field today is all because of policymakers

Mario Michan, CEO of Daphne Technology (a company focused on enabling industries to decarbonise), and a former investigator of antihydrogen at CERN’s Antiproton Decelerator, also stressed the importance of being familiar with how the sector works, pointing out the large role that policymakers take in the field: “Everything that is happening in the environmental field today is all because of policymakers,” he remarked.

Another particle physicist who made the change is Giorgio Cortiana, who now works in E.ON’s global advanced analytics and artificial-intelligence group, leading several data-science projects. His scientific background in complex physics data analysis, statistics, machine learning and object-oriented programming is ideal for extracting meaningful insights from large datasets, and for coping with everyday problems that need quick and effective solutions, he explained, noting the different mentality from academia. “At CERN you have the luxury to really focus on your research, down to the tiny details — now, I have to be a bit more pragmatic,” he said. “Here [at E.ON] we are instead looking to try and make an impact as soon as we can.”

Leaving the field
The decision to leave the familiar surroundings of high-energy physics requires perseverance, stressed Rudjor, stating that it is important to pick up the phone to find out what type of position is really being offered. Other panellists also noted that it is vital to spend some time looking at what skills you can bring to a specific posting. “I think there are many workplaces which don’t really know how to recruit people with our skills – they would like the people, but they typically don’t open positions because they don’t know exactly how to specify the job.”

The CERN Alumni Network’s “Moving Out of Academia” events provide a rich source of candid advice for those seeking to make the change, while also demonstrating the impact of high-energy physics in broader society. The latest environment-industry events follow others dedicated to careers in finance, industrial engineering, big data, entrepreneurship and medical technologies. More are in store, explains head of CERN Alumni Relations, Rachel Bray. “One of our goals is to support those in their early careers – if and when they decide to leave academia for another sector. In addition to the Moving out of Academia events, we have recently launched a new series which brings together early-career scientists and the companies seeking the talents and skills developed at CERN.”

Collider neutrinos on the horizon

FASERν pilot-detector event displays

Think “neutrino detector” and images of giant installations come to mind, necessary to compensate for the vanishingly small interaction probability of neutrinos with matter. The extreme luminosity of proton–proton collisions at the LHC, however, produces a large neutrino flux in the forward direction, with energies leading to cross-sections high enough for neutrinos to be detected using a much more compact apparatus.

In March, the CERN research board approved the Scattering and Neutrino Detector (SND@LHC) for installation in an unused tunnel that links the LHC to the SPS, 480 m downstream from the ATLAS experiment. Designed to detect neutrinos produced in a hitherto unexplored pseudo-rapidity range (7.2 < η < 8.6), the experiment will complement and extend the physics reach of the other LHC experiments — in particular FASERν, which was approved last year. Construction of FASERν, which is located in an unused service tunnel on the opposite side of ATLAS along the LHC beamline (covering |η| > 9.1), was completed in March, while installation of SND@LHC is about to begin.

Both experiments will be able to detect neutrinos of all types, with SND@LHC positioned off the beamline to detect neutrinos produced at slightly larger angles. Expected to commence data-taking during LHC Run 3 in spring 2022, these latest additions to the LHC-experiment family are poised to make the first observations of collider neutrinos while opening new searches for feebly interacting particles and other new physics.

Neutrinos galore
SND@LHC will comprise 800 kg of tungsten plates interleaved with emulsion films and electronic tracker planes based on scintillating fibres. The emulsion acts as a vertex detector with micron resolution while the tracker provides a time stamp, the two subdetectors together acting as a sampling electromagnetic calorimeter. The target volume will be immediately followed by planes of scintillating bars interleaved with iron blocks serving as a hadron calorimeter, followed downstream by a muon-identification system.

SND layout

During its first phase of operation, SND@LHC is expected to collect an integrated luminosity of 150 fb⁻¹, corresponding to more than 1000 high-energy neutrino interactions. Since electron neutrinos and antineutrinos are predominantly produced by charmed-hadron decays in the pseudorapidity range explored, the experiment will enable the gluon parton-density function to be constrained in an unexplored region of very small x. With projected statistical and systematic uncertainties of 30% and 22% in the ratio between νe and ντ, and about 10% for both uncertainties in the ratio between νe and νμ at high energies, the Run-3 data will also provide unique tests of lepton flavour universality with neutrinos, and have sensitivity in the search for feebly interacting particles via scattering signatures in the detector target.
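Assuming the event yield scales linearly with luminosity (our assumption for illustration, not an SND@LHC statement), the quoted figures correspond to only a handful of interactions per inverse femtobarn:

```python
# Illustrative rate arithmetic from the quoted SND@LHC expectations.
lumi_fb = 150        # integrated luminosity expected in the first phase (fb^-1)
n_events = 1000      # expected high-energy neutrino interactions (lower bound)

per_fb = n_events / lumi_fb   # roughly 7 interactions per fb^-1
print(f"~{per_fb:.1f} neutrino interactions per fb^-1")
```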

“The angular range that SND@LHC will cover is currently unexplored,” says SND@LHC spokesperson Giovanni De Lellis. “And because a large fraction of the neutrinos produced in this range come from the decays of particles made of heavy quarks, these neutrinos can be used to study heavy-quark particle production in an angular range that the other LHC experiments can’t access. These measurements are also relevant for the prediction of very high-energy neutrinos produced in cosmic-ray interactions, so the experiment is also acting as a bridge between accelerator and astroparticle physics.”

A FASER first
FASERν is an addition to the Forward Search Experiment (FASER), which was approved in March 2019 to search for light and weakly interacting long-lived particles at solid angles beyond the reach of conventional collider detectors. Comprising a small and inexpensive stack of emulsion films and tungsten plates measuring 0.25 × 0.25 × 1.35 m and weighing 1.2 tonnes, FASERν is already undergoing tests. Smaller than SND@LHC, the detector is positioned on the beam-collision axis to maximise the neutrino flux, and should detect a total of around 20,000 muon neutrinos, 1300 electron neutrinos and 20 tau neutrinos in an unexplored energy regime at the TeV scale. This will allow measurements of the interaction cross-sections of all neutrino flavours, provide constraints on non-standard neutrino interactions, and improve measurements of proton parton-density functions in certain phase-space regions.

The final detector should do much better — it will be a hundred times bigger

Jamie Boyd

In May, based on an analysis of pilot emulsion data taken in 2018 using a target mass of just 10 kg, the FASERν team reported the detection of the first neutrino-interaction candidates, based on a measured 2.7σ excess of a neutrino-like signal above muon-induced backgrounds. The result paves the way for high-energy neutrino measurements at the LHC and future colliders, explains FASER co-spokesperson Jamie Boyd: “The final detector should do much better — it will be a hundred times bigger, be exposed to much more luminosity, have muon identification capability, and be able to link observed neutrino interactions in the emulsion to the FASER spectrometer. It is quite impressive that such a small and simple detector can detect neutrinos given that usual neutrino detectors have masses measured in kilotons.”

Felix H Boehm 1924–2021

Felix Boehm

Felix H Boehm, who was William L Valentine Professor of Physics at Caltech, passed away on 25 May in his Altadena home. He was a pioneer in the study of fundamental laws through nuclear-physics experiments.

Born in Basel, Switzerland, in 1924, Felix studied physics at ETH Zürich, earning a diploma in 1948 and a PhD in 1951 for a measurement of the (p,n) reaction at the ETH cyclotron. In 1951 he moved to the US and joined the group of Chien-Shiung Wu at Columbia University, which was investigating beta decay. He joined Caltech in 1953 and spent the rest of his academic career there. 

Felix worked first with Jesse DuMond, who had developed the bent-crystal spectrometer, an instrument with unrivalled energy resolution in gamma-ray spectrometry. He used it to determine nuclear radii by measuring X-ray isotope shifts in atoms. Later, he installed such devices at LAMPF, SREL and PSI to investigate pionic atoms, which led to a precise determination of the strong-interaction shift in pionic hydrogen. At Caltech, he also became interested in parity violation and time-reversal invariance. In 1957, in an experiment performed with Aaldert Wapstra, he demonstrated that electrons in beta decay have a predominantly negative helicity. 

In the mid 1970s, discussions with Harald Fritzsch and Peter Minkowski convinced Felix that the study of neutrino masses and mixings might provide answers to fundamental questions. From then on, long before it was fashionable, it became his main field of activity. He first looked at neutrino oscillations and initiated an electron–neutrino disappearance experiment with Rudolf Mössbauer. 

Theirs was the first dedicated search for neutrino oscillations, beginning with a short-baseline phase at the ILL reactor in Grenoble. The concept of the experiment was presented at the Neutrino 79 conference in Bergen, at which the Gargamelle collaboration also reported limits on νμ ↔ νe oscillations. Both talks were relegated to a session on exotic phenomena. The ILL experiment was continued at the Gösgen reactor in Switzerland with a longer baseline. No evidence of oscillations was found and stringent limits in a given parameter space were derived, contradicting positive claims made by others. A larger detector was later built at the Palo Verde nuclear power station in Arizona, where again no evidence for oscillations was found. A logical continuation of the effort initiated by Felix was the KamLAND experiment in Japan, which was exposed to several reactors and eventually, in 2002, observed neutrino oscillations in the disappearance mode at a still-longer baseline. 

In parallel, Felix decided to probe neutrino masses by searching for neutrinoless double-beta decay. He led a small collaboration that installed a germanium detector in the Gotthard underground laboratory in Switzerland to probe ⁷⁶Ge, and then searched for the process using a time-projection chamber (TPC) filled with xenon enriched with ¹³⁶Xe. The TPC, a novel device at the time, improved the event signature and thus reduced the background, allowing stringent constraints to be placed on the effective neutrino mass. The ongoing EXO experiment can be seen as a continuation of this programme, vastly improving the sensitivity in its first phase (EXO-200 at WIPP, New Mexico) and expected to do even better in the second phase, nEXO.

Felix Boehm had a talent to identify important issues on the theoretical side, and to select the appropriate technical methods on the experimental side. He was always ready to innovate. In particular, he realised very early on the importance of selecting radio-pure materials in low-count-rate, low-background experiments. All those who worked with him appreciated his open mind, his determination, his great culture and his kindness.

European Physical Society announces 2021 awards

Borexino

The high-energy and particle physics division of the European Physical Society (EPS-HEPP) has announced the recipients of its 2021 prizes. The five awards will be presented during the EPS-HEP Conference on 26 July, which will take place online.

Bryan and Torbjörn

2021 EPS High Energy and Particle Physics Prize
Torbjörn Sjöstrand (Lund University) and Bryan Webber (Cambridge University) have been announced as the winners of the 2021 EPS-HEPP Prize “for the conception, development and realisation of parton shower Monte Carlo simulations, yielding an accurate description of particle collisions in terms of quantum chromodynamics and electroweak interactions, and thereby enabling the experimental validation of the Standard Model, particle discoveries and searches for new physics.” Both Sjöstrand and Webber were also awarded the 2012 Sakurai Prize for Theoretical Particle Physics by the American Physical Society, along with the late Guido Altarelli.

2021 Giuseppe and Vanna Cocconi Prize
The 2021 Giuseppe and Vanna Cocconi Prize has been awarded to the Borexino Collaboration “for their ground-breaking observation of solar neutrinos from the pp and CNO chains that provided unique and comprehensive tests of the Sun as a nuclear fusion engine.” Gianpaolo Bellini, a former spokesperson of the experiment, commented: “The Cocconi prize awarded to us by EPS is the recognition of a more than 30-year history that began in the late 1980s, when the experiment was conceived in the context of the scientific debate triggered by the then unsolved problem of the solar neutrino, and by the need for studying solar neutrinos from very low energies.”

Bernhard Mistlberger

2021 Gribov Medal
Bernhard Mistlberger (SLAC) has received the 2021 Gribov Medal “for his ground-breaking contributions to multi-loop computations in QCD and to high-precision predictions of Higgs and vector boson production at hadron colliders.” Mistlberger also recently won the $5000 Wu-Ki Tung Award for Early-Career Research on QCD for his work.

Ben Nachman and Nathan Jurik

2021 Young Experimental Physicist Prize
The 2021 Young Experimental Physicist Prize of the EPS High Energy and Particle Physics Division has been awarded to Nathan Jurik (CERN) “for his outstanding contributions to the LHCb experiment, including the discovery of pentaquarks, and the measurements of CP violation and mixing in the B and D meson systems”; and to Ben Nachman (LBNL Berkeley) “for exceptional contributions to the study of QCD jets as a probe of QCD dynamics and as a tool for new physics searches, his innovative application of machine learning for characterising jets, and the development of novel strategies on jet reconstruction and calibration at the ATLAS experiment.”

Outreach winners

2021 Outreach Prize
The three winners of the 2021 EPS-HEPP Outreach Prize are: Uta Bilow (TU Dresden) and Kenneth Cecire (University of Notre Dame), “for the long-term coordination and major expansion of the International Particle Physics Master Classes to include a range of modern methods and exercises, and connecting scientists from all the major LHC and Fermilab experiments to school pupils across the world”, and Sascha Mehlhase (LMU München) “for the design and creation of the ATLAS-detector and other interlocking-brick models, creating an international outreach programme that reaches an unusually young audience.” After building the ATLAS detector out of 9500 Lego pieces in 2011, Mehlhase set up the popular “Build Your Own Particle Detector” programme.

Greening gaseous detectors

Thanks to their large volumes and cost effectiveness, particle-physics experiments rely heavily on gaseous detectors. Unfortunately, environmentally harmful fluorinated greenhouse gases known as freons play an important role in traditional gas mixtures. To address this issue, more than 200 gas-detector experts participated in a workshop hosted online by CERN on 22 April to study the operational behaviour of novel gases and alternative gas mixtures.

Large gas molecules absorb energy in vibrational and rotational modes of excitation

Freon-based gases are essential to many detectors currently used at CERN, especially for tracking and triggering. Examples run from muon systems, ring-imaging Cherenkov (RICH) detectors and time-projection chambers (TPCs) to wire chambers, resistive-plate chambers (RPCs) and micro-pattern gas detectors (MPGDs). While the primary gas in the mixture is typically a noble gas, adding a “quencher” gas helps achieve a stable gas gain, well separated from the noise of the electronics. Large gas molecules such as freons absorb energy in relevant vibrational and rotational modes of excitation, thereby preventing secondary effects such as photon feedback and field emission. Extensive R&D is needed to reach the stringent performance required of each gas mixture.

The CMS muon system

CERN has developed several strategies to reduce greenhouse gas (GHG) emissions from particle detectors. As demonstrated by the ALICE experiment’s TPC, upgrading gas-recirculation systems can reduce GHGs by almost 100%. When it is not possible to recirculate all of the gas mixture, gas recuperation is an option – for example, the recuperation of CF4 by the CMS experiment’s cathode-strip-chamber (CSC) muon detector and the LHCb experiment’s RICH-2 detector. A complex gas-recuperation system for the C2H2F4 (R134a) in RPC detectors is also under study, and physicists are exploring the use of commonplace gases. In the future, new silicon photomultipliers could reduce chromatic error and increase photon yield, potentially allowing CF4 to be replaced with CO2. Meanwhile, in LHCb’s RICH-1 detector, C4F10 could possibly be replaced with hydrocarbons like C4H10 if the flammability risk is addressed.

Eco-gases

Finally, alternative “eco-gases” are the subject of intense R&D. Eco-gases have a low global-warming potential because of their very limited stability in the atmosphere as they react with water or decompose in ultraviolet light. Unfortunately, these conditions are also present in gaseous detectors, potentially leading to detector aging. In addition to their stability, there is also the challenge of adapting current LHC detectors, given that access is difficult and many components cannot be replaced.

Roberto Guida (CERN), Davide Piccolo (Frascati), Rob Veenhof (Uludağ University) and Piet Verwilligen (Bari) convened workshop sessions at the April event. Groups from Turin, Frascati, Rome, CERN and GSI presented results based on the new hydro-fluoro-olefin (HFO) mixture with the addition of neutral gases such as helium and CO2 as a way of lowering the high working-point voltage. Despite challenges related to the larger signal charge and streamer probability, encouraging results have been obtained in test beams in the presence of LHC-like background gamma rays. CMS’s CSC detector is an interesting example where HFO could replace CF4. In this case, its decomposition could even be a positive factor, although further studies are needed.

We now need to create a compendium of simulations and measurements for “green” gases in a similar way to the concerted effort in the 1990s and 2000s that proved indispensable to the design of the LHC detectors. To this end, the INRS-hosted LXCat database enables the sharing and evaluation of data to model non-equilibrium low-temperature plasmas. Users can upload data on electron- and ion-scattering cross sections and compare “swarm” parameters. The ETH (Zürich), Aachen and HZDR (Dresden) groups illustrated measurements of transport parameters, opening possibilities of collaboration, while the Bari group sought feedback and collaboration on a proposal to precisely measure transport parameters for green gases in MPGDs using electron and laser beams.

Obtaining funding for this work can be difficult due to a lack of expected technological breakthroughs in low-energy plasma physics

Future challenges will be significant. The gas volumes of detector systems for the High-Luminosity LHC and the proposed Future Circular Collider, for example, range from 10 to 100 m³, posing a significant environmental threat in the case of leaks. Furthermore, an EU “F-gas” regulation in force since 2014 aims to reduce sales of fluorinated gases to one-fifth of 2014 levels by 2030. Given the environmental impact and the uncertain availability and price of freon-based gases, preparing a mitigation plan for future experiments is of fundamental importance to the high-energy-physics community, and the next generation of detectors must be designed around eco-mixtures from the outset. Although obtaining funding for this work can be difficult, for example due to a lack of expected technological breakthroughs in low-energy plasma physics, the workshop showed that a vibrant cadre of physicists is committed to taking the field forward. The next workshop will take place in 2022.

Accelerators meet gravitational waves

Gravitational waves (GWs) crease and stretch the fabric of spacetime as they ripple out across the universe. As they pass through regions where beams circulate in storage rings, they should therefore cause charged-particle orbits to seem to contract and expand as they climb new peaks and plumb new troughs, with potentially observable effects.

SRGW2021

Proposals in this direction have appeared intermittently over the past 50 years, including during and after the construction of LEP and the LHC. Now that the existence of GWs has been established by the LIGO and VIRGO detectors, and as new, even larger storage rings are being proposed in Europe and China, this question has renewed relevance. We are on the cusp of the era of GW astronomy — a young and dynamic domain of research with much to discover, in which particle accelerators could conceivably play a major role.

From 2 February to 31 March this year, a topical virtual workshop titled “Storage Rings and Gravitational Waves” (SRGW2021) shone light on this tantalising possibility. Organised within the European Union’s Horizon 2020 ARIES project, the meeting brought together more than 100 accelerator experts, particle physicists and members of the gravitational-physics community to explore several intriguing proposals.

Theoretically subtle

GWs are extremely feebly interacting. The cooling and expanding universe should have become “transparent” to them early in its history, long before the timescales probed through other known phenomena. Detecting cosmological backgrounds of GWs would therefore provide us with a picture of the universe at earlier times than we can currently access, prior to photon decoupling and Big-Bang nucleosynthesis. It could also shed light on high-energy phenomena, such as high-temperature phase transitions, inflation and new heavy particles that cannot be directly produced in the laboratory.

Gravitational wave sources and sensitivities

In the opening session of the workshop, Jorge Cervantes (ININ Mexico) presented a vivid account of the history of GWs, revealing how subtle they are theoretically. It took about 40 years and a number of conflicting papers to definitively establish their existence. Bangalore S. Sathyaprakash (Penn State and Cardiff) reviewed the main expected sources of GWs: the gravitational collapse of binaries of compact objects such as black holes, neutron stars and white dwarfs; supernovae and other transient phenomena; spinning neutron stars; and stochastic backgrounds with either astrophysical or cosmological origins. The GW frequency range of interest extends from 0.1 nHz to 1 MHz (see figure “Sources and sensitivities”).

The frequency range of interest extends from 0.1 nHz to 1 MHz

Raffaele Flaminio (LAPP Annecy) reviewed the mind-boggling precision of VIRGO and LIGO, which can measure motion 10,000 times smaller than the width of an atomic nucleus. Jörg Wenninger (CERN) reported the similarly impressive sensitivity of LEP and the LHC to small effects, such as tides and earthquakes on the other side of the planet. Famously, LEP’s beam-energy resolution was so fine that it revealed a diurnal distortion of the 27 km ring with an amplitude of a single millimetre, and the LHC beam-position-monitor system can achieve measurement resolutions on the average circumference approaching the micrometre scale over time intervals of one hour. While impressive for machines designed with completely different goals in mind, this is still far from the precision achieved by LIGO and VIRGO. However, the sensitivity to GWs can be strongly enhanced by exploiting resonant effects and the long distances travelled by the particles over their storage times. In one hour, protons at the LHC travel around the ring about 40 million times. In principle, the precision of modern accelerator optics could allow storage rings and accelerator technologies to cover a portion of the enormous GW frequency range of interest.
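The “40 million turns per hour” figure follows from simple kinematics. A minimal back-of-envelope sketch, assuming the nominal LHC circumference of 26,659 m and protons travelling at essentially the speed of light:

```python
# Back-of-envelope check of the "40 million turns per hour" figure,
# assuming 7 TeV LHC protons travel at essentially the speed of light.
C_LHC = 26_659.0          # LHC circumference, m (nominal value, assumed here)
c = 299_792_458.0         # speed of light, m/s

f_rev = c / C_LHC                  # revolution frequency, Hz (~11.2 kHz)
turns_per_hour = f_rev * 3600.0    # ~4.0e7 turns in one hour
print(f"revolution frequency ≈ {f_rev:.0f} Hz")
print(f"turns per hour ≈ {turns_per_hour:.2e}")
```

The revolution frequency of roughly 11 kHz is also the scale quoted later for direct gravitational synchrotron radiation.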

Resonant responses

Since the invention of the synchrotron, storage rings have been afflicted by difficult-to-control resonance effects which degrade beam quality. When a new ring is commissioned, accelerator physicists work diligently to “tune” the machine’s parameters to avoid such effects. But could accelerator physicists turn the tables and seek to enhance these effects and observe resonances caused by the passage of GWs?

In accelerators and storage rings, charged particles are steered and focused in the two directions transverse to their motion by dipole, quadrupole and higher-order magnets — the “betatron motion” of the beam. The beam is also kept bunched in the longitudinal plane as a result of an energy-dependent path length and oscillating electric fields in radio-frequency (RF) cavities — the “synchrotron motion” of the beam. A gravitational wave can resonantly interact with either the transverse betatron motion of a stored beam, at a frequency of several kHz, or with the longitudinal synchrotron motion at a frequency of tens of hertz.

Antenna optics

Katsunobu Oide (KEK and CERN) discussed the transverse betatron resonances that a gravitational wave can excite for a beam circulating in a storage ring. Typical betatron frequencies for the LHC are a few kHz, potentially offering sensitivity to GWs with frequencies of a similar order of magnitude. Starting from a standard 30 km ring, Oide proposed special beam-optical insertions with a large beta function, which would serve as “GW antennas” to enhance the resonance strength, resulting in 37.5 km-long optics (see figure “Antenna optics”). Among several parameters, the sensitivity to GWs should depend on the size of the ring. Oide derived a special resonance condition, kGW R ± 2 = Qx, with R the ring radius, kGW the GW wave number and Qx the horizontal betatron tune.

Suvrat Rao (Hamburg University) presented an analysis of the longitudinal beam response of the LHC. An impinging GW affects the revolution period, in a similar way to the static gravitational-gradient effect of the Mont Blanc massif (which alters the revolution time at the level of 10⁻¹⁶ s) and the diurnal effect of the changing positions of the Sun and Moon (10⁻¹⁸ s), the latter being about six orders of magnitude smaller than the tidal effect on the ring circumference.

The longitudinal beam response to a GW should be enhanced for perturbations close to the synchrotron frequency, which for the LHC lies in the range 10 to 60 Hz. Raffaele D’Agnolo (IPhT) estimated the sensitivity to the gravitational strain, h, at the synchrotron frequency, in the absence of backgrounds, as h ~ 10⁻¹³, and listed three possible paths to improving the sensitivity by several further orders of magnitude. Rao also highlighted that storage-ring GW detection could allow an Earth-based observatory sensitive to millihertz GWs, complementing space-based laser interferometers such as LISA, which is planned for launch in 2034. This would improve the sky localisation of GW sources, which is useful for electromagnetic follow-up studies with astronomical telescopes.

Out of the ordinary

More exotic accelerators were also mooted. A “coasting-beam” experiment might have zero restoring voltage and no synchrotron oscillations. Cold “crystalline” beams of stable ordered 1D, 2D or 3D structures of ions could open up a whole new frequency spectrum, as the phonon spectrum that could be excited by a GW might easily extend up to the MHz range. Witek Krasny (LPNHE) suggested storing beams of unstable particles in the LHC: decay times and transition rates could be modified by an incident GW. The stored particles could, for example, include the excited partially stripped heavy ions that are the basis of a “gamma factory”.

Finally on the storage-ring front, Andrey Ivanov (TU Vienna) and co-workers discussed the possibly shrinking circumference of a storage ring, such as the 1.4 km light source SPring-8 in Japan, under the influence of the relic GW background.

The Gertsenshtein effect

Delegates at SRGW2021 also proposed completely different ways of using accelerator technology to detect GWs. Sebastian Ellis (IPhT) explained how a superconducting RF (SRF) cavity might act as a resonant bar or serve as a Gertsenshtein converter, in both cases converting a graviton into a photon in the presence of a strong background magnetic field and yielding a direct electromagnetic signal, similar to axion searches. Related attempts at GW detection using cavities were pioneered in the 1970s by teams in the Soviet Union and Italy, but RF technology has made big strides in quality factors, cooling and insulation since then, and a new series of experiments appears to be well justified.

Another promising approach to GW detection is atomic-beam interferometry. Instead of light interference, as in LIGO and VIRGO, an incident GW would cause interference between carefully prepared beams of cold atoms. This approach is being pursued by the recently approved AION experiment using ultra-cold strontium atomic clocks over increasingly large path lengths, including the possible use of an LHC access shaft to house a 100-metre device targeting the 0.01 to 1 Hz range. Meanwhile, a space-based version, AEDGE, could be realised with a pair of satellites in medium Earth orbit separated by 4.4 × 10⁷ m.

Storage rings as sources

Extraordinarily, storage rings could act not only as GW detectors but also as observable sources of GWs. Pisin Chen (NTU Taiwan) discussed how relativistic charged particles in circular orbital motion can emit gravitational waves through two channels: “gravitational synchrotron radiation” (GSR), emitted directly by the massive particle, and “resonant conversion”, in which electromagnetic synchrotron radiation (EMSR) is converted into GWs via the Gertsenshtein effect.

Gravitons could be emitted via “gravitational beamstrahlung”

John Jowett (GSI, retired from CERN) and Fritz Caspers (also retired from CERN) recalled that GSR from beams at the SPS and other colliders had been discussed at CERN as early as the 1980s. It was realised that these beams would be among the most powerful terrestrial sources of gravitational radiation, although the total radiated power would still be many orders of magnitude lower than that of regular synchrotron radiation. The dominant frequency of direct GSR is the revolution frequency, about 10 kHz, while the dominant frequency of resonant EMSR-GSR conversion is a factor γ³ higher, around 10 THz at the LHC, conceivably allowing the observation of gravitons. If all the particles and bunches of a beam excited the GW coherently, the spacetime-metric perturbation has been estimated to be as large as hGSR ~ 10⁻¹⁸. Gravitons could also be emitted via “gravitational beamstrahlung” during the collision with an opposing beam, perhaps producing the most prominent GW signal at proposed future lepton colliders. At the LHC, argued Caspers, such signals could be detected by a torsion-balance experiment with a very sensitive, resonant mechanical pickup installed close to the beam in one of the arcs. In a phase-lock mode of operation, an effective resolution bandwidth of millihertz or below could be possible, opening the exciting prospect of detecting synthetic sources of GWs.

Towards an accelerator roadmap

The concluding workshop discussion, moderated by John Ellis (King’s College London), focused on the GW-detection proposals considered closest to implementation: resonant betatron oscillations near 10 kHz; changes in the revolution period of “low-energy” coasting ion beams without a longitudinally focusing RF system; “heterodyne” detection using SRF cavities up to 10 MHz; beam-generated GWs at the LHC; and atomic interferometry. These potential components of a future R&D plan cover significant regions of the enormous GW frequency space.

Apart from an informal meeting at CERN in the 1990s, SRGW2021 was the first workshop to link accelerators and GWs and bring together the implicated scientific communities. Lively discussions in this emerging field attest to the promise of employing accelerators in a completely different way to either detect or generate GWs. The subtleties of the particle dynamics when embedded in an oscillating fabric of space and time, and the inherent sensitivity problems in detecting GWs, pose exceptional challenges. The great interest prompted by SRGW2021, and the tantalising preliminary findings from this workshop, call for more thorough investigations into harnessing future storage rings and accelerator technologies for GW physics.
