Jean Sacton, who put Belgium at the forefront of major discoveries in fundamental physics and the development of associated technologies, died peacefully at his home in Brussels on 12 February, aged 86. He combined his scientific qualities with great human ones: a firm boss, but one who was always present, attentive, warm and well-intentioned.
Jean Sacton defended his bachelor’s thesis on mesic atoms in nuclear emulsion at Université Libre de Bruxelles (ULB) in 1956, continuing there for his PhD. From 1960 to 1965, he surrounded himself with young researchers focusing on the properties of hyper-fragments produced by the interactions of K mesons in nuclear emulsions – work that required significant human resources to scan the emulsion foils with microscopes. He defended his thesis in 1961 and, three years later as an associate lecturer, became head of the newly created department of elementary particle physics.
At the end of the 1960s, Sacton became professor and a member of various committees, including those managing the Belgian Interuniversity Laboratory for High Energies.
The foundation in 1972 of the Interuniversity Institute for High Energies (IIHE) was largely due to his efforts during the preceding decade. Co-directed for many years by its two founders (Sacton for ULB and Jacques Lemonne for Vrije Universiteit Brussel), IIHE has become the main centre for experimental research in particle physics in Belgium, and promotes close collaboration with other Belgian institutes.
In the 1970s the IIHE strongly contributed to the scanning and analysis of data from the giant bubble chambers Gargamelle and BEBC. In 1973 IIHE staff scanned one of the three events that spectacularly confirmed the existence of the weak neutral current, for which Sacton, together with the other members of the Gargamelle collaboration, received the European Physical Society’s High Energy and Particle Physics Prize in 2009. Other firsts that Sacton was involved in during the bubble-chamber era included the first direct observation of charged charmed particles in nuclear emulsions, and the measurement of the violation of scale invariance in deep-inelastic scattering.
Later, the IIHE, in collaboration with the University of Antwerpen and the University of Mons-Hainaut, contributed to the DELPHI experiment at LEP, for which they built the electronics for the muon chambers. The laboratory also engaged in the H1 collaboration at HERA, DESY. The Belgian contribution to H1 included the construction of two cylindrical multi-wire proportional chambers and the associated data acquisition for all of the detector’s multi-wire proportional chambers, during which Sacton continuously ensured that technical staff were retrained to keep up with the rapid pace of change.
At the same time, he became a member of the European Committee for Future Accelerators (as chair from 1984 to 1987), the CERN Super Proton Synchrotron Committee, the CERN Scientific Policy Committee, and the extended Scientific Council of DESY. While dean of the ULB sciences faculty from 1991 to 1995, he remained active as director of the laboratory, leaving to his teams the task of analysing DELPHI, H1 and CHORUS data, and preparing the IIHE contribution to the CMS experiment. In 1994 he became president of the particles and fields commission of the International Union for Pure and Applied Physics and a member of the International Committee for Future Accelerators, and from 1991 to 1994 he chaired the High-Energy Physics Computer Coordinating Committee. He formally retired in 1999.
Jean Sacton lived through a major scientific adventure, from the discovery of the first mesons to the completion of the Standard Model. Through his quiet strength, professionalism, foresight and entrepreneurial spirit, he founded, developed and sponsored this field of research at ULB and made it shine far beyond the university.
On 22 December we lost our colleague and friend, a brilliant theoretical nuclear physicist, Vladimir Kukulin.
Vladimir Kukulin was born in Moscow in 1939. He graduated with honours from the Moscow Engineering Physics Institute in 1965, where he started his physics studies under the supervision of Arkady Migdal. Vladimir obtained his PhD in 1971 and his DSc in 1991. For more than 55 years, he worked in the Institute of Nuclear Physics at Moscow State University (MSU), becoming professor of theoretical physics in 1997 and head of the laboratory for atomic nucleus theory in 2012.
Vladimir had many close scientific relations, including the supervision of students’ work, at JINR (Dubna), KazNU (Almaty) and other leading physics institutes in Russia, Kazakhstan, Uzbekistan and Ukraine. He worked as a visiting professor and gave lectures at universities in the Czech Republic, Germany, the UK, Italy, Belgium, France, the US, Canada, Mexico, Japan and Australia, and since 1996 had maintained a scientific cooperation between MSU and the University of Tübingen.
Vladimir’s research interests embraced theoretical hadronic, nuclear and atomic physics, few-body physics, nuclear astrophysics, quantum scattering theory, and mathematical and computational physics, among other areas. Many of the approaches he developed, such as the multi-cluster model of light nuclei, the method of orthogonalising pseudopotentials, and the stochastic variational method, opened new directions in nuclear physics and the quantum theory of few- and many-body scattering. During the past two decades, Vladimir and his co-workers developed the effective wave-packet continuum discretisation approach for quantum scattering, and proposed a scheme for ultra-fast quantum scattering calculations on graphics processing units.
A deep understanding of nuclear and mathematical physics allowed Vladimir to suggest, in 1998, a new mechanism for the short-range nucleon–nucleon (NN) interaction based on the formation of the intermediate six-quark bag dressed by meson clouds (the dressed dibaryon). He developed, with his colleagues from MSU and the University of Tübingen, the original dibaryon concept for the nuclear force, which received new experimental confirmation with the discovery of hexaquark states at COSY (Jülich) in 2011. More recently, Vladimir and his coauthors demonstrated the decisive role of dibaryon resonances in NN elastic scattering and NN-induced meson production at intermediate energies.
A combination of strong intuition, comprehensive knowledge and experience in various fields of science and technology enabled Vladimir to generate new ideas and carry out pioneering interdisciplinary research at the intersection of physics, mathematics, chemistry and engineering. He made an indispensable contribution to solving important applied problems, such as controlled thermonuclear fusion, the cleaning of natural gas, fire-fighting and neutron-capture cancer therapy.
Vladimir was distinguished by non-standard thinking, humanity, a sparkling sense of humour and an inexhaustible love of life. His enthusiasm and intellectual freedom inspired several generations of his colleagues and students. We will always remember Vladimir as an outstanding scientist, a wise teacher and a good friend.
The original silicon pixel detector for CMS – comprising three barrel layers and two endcap disks – was designed for a maximum instantaneous luminosity of 10³⁴ cm⁻² s⁻¹ and a maximum average pile-up of 25. Following LHC upgrades in 2013–2014, it was replaced with an upgraded system (the CMS Phase-1 pixel detector) in 2017 to cope with higher instantaneous luminosities. With a lower mass and an additional barrel layer and endcap disk, it was an evolutionary upgrade maintaining the well-tested key features of the original detector while enabling higher-rate capability, improved radiation tolerance and more robust tracking. During Long Shutdown 2, maintenance work on the Phase-1 device included the installation of a new innermost layer (see “Present and future” image) to enable the delivery of high-quality data until the end of LHC Run 3.
During the next long shutdown, scheduled for 2025, the entire tracker detector will be replaced in preparation for the High-Luminosity LHC (HL-LHC). This Phase-2 pixel detector will need to cope with a pile-up and hit rate eight times higher than before, and with a trigger rate and radiation dose 7.5 and 10 times higher, respectively. To meet these extreme requirements, the CMS collaboration, in partnership with ATLAS via the RD53 collaboration, is developing a next-generation hybrid-pixel chip utilising 65 nm CMOS technology. The overall system is much bigger than the Phase-1 device (~5 m² compared to 1.75 m²) with vastly more read-out channels (~2 billion compared to 120 million). With six-times smaller pixels, increased detection coverage, reduced material budget, a new readout chip to enable a lower detection threshold, and a design that continues to allow easy installation and removal, the state-of-the-art Phase-2 pixel detector will serve CMS well into the HL-LHC era.
LHCb’s Vertex Locator (VELO) has played a pivotal role in the experiment’s flavour-physics programme. Contributing to triggering, tracking and vertexing, and with a geometry optimised for particles travelling close to the beam direction, its 46 orthogonal silicon-strip half-disks have enabled the collaboration to pursue major results. These include the 2019 discovery of CP violation in charm using the world’s largest reconstructed samples of charm decays, a host of matter–antimatter asymmetry measurements and rare-decay searches, and the recent hints of lepton non-universality in B decays.
Placing the sensors as close as possible to the primary proton–proton interactions requires the whole VELO system to sit inside the LHC vacuum pipe (separated from the primary vacuum by a 1.1 m-long thin-walled “RF foil”), and a mechanical system to move the disks out of harm’s way during the injection and stabilisation of the beams. After more than a decade of service witnessing the passage of some 10²⁶ protons, the original VELO is now being replaced with a new one to prepare for a factor-five increase in luminosity for LHCb in LHC Run 3.
The entirety of the new VELO will be read out at a rate of 40 MHz, requiring a huge data bandwidth: up to 20 Gbit/s for the hottest ASICs, and 3 Tbit/s in total. Cooling using the minimum of material is another major challenge. The upgraded VELO will be kept at –20 °C via the novel technique of evaporative CO2 circulating in 120 × 200 µm channels within a silicon substrate (see “Fine structure” image, left). The harsh radiation environment also demands a special ASIC, the VeloPix, which has been developed with the CERN Medipix group and will allow the detector to operate a much more efficient trigger. To cope with increased occupancies at higher luminosity, the original silicon strips have been replaced with pixels. The new sensors (in the form of rectangles rather than disks) will be located even closer to the interaction point (5.1 mm versus the previous 8.2 mm for the first measured point), which requires the RF foil to sit just 3.5 mm from the beam and 0.9 mm from the sensors. The production of the foil was a huge technical achievement. It was machined from a solid-forged aluminium block with 98% of the material removed and the final shape machined to a thickness of 250 µm, with further chemical etching taking it to just 100 µm (see “Fine structure” image, right).
Around half of the VELO-module production is complete, with the work shared between labs in the UK and the Netherlands (see “In production” image). Assembly of the 52 modules into the “hood”, which provides cooling, services and vacuum, is now under way, with installation in LHCb scheduled to start in August. The VELO Upgrade I is expected to serve LHCb throughout Run 3 and Run 4. Looking further to the future, the next upgrade will require the detector to operate with a huge jump in luminosity, where vertexing will pose a significant challenge. Proposals under consideration include a new “4D” pixel detector with time-stamp information per hit, which could conceivably be achieved by moving to a smaller CMOS node. At this stage, however, the collaboration is actively investigating all options, with detailed technical design reports expected towards the middle of the decade.
The ATLAS collaboration upgraded its original pixel detector in 2014, adding an innermost layer to create a four-layer device. The new layer featured a much smaller pixel pitch, 3D sensors at large angles and CO2 cooling, and the pixel tracker will continue to serve ATLAS throughout LHC Run 3. Like CMS, the collaboration has long been working towards the replacement of the full inner tracker during the next long shutdown expected in 2025, in preparation for HL-LHC operations. The innermost layers of this state-of-the-art all-silicon tracker, called the ITk, will be built from pixel detectors with an area almost 10 times larger than that of the current device. With 13 m² of active silicon across five barrel layers and two end caps, the pixel detector will contribute to precision tracking up to a pseudorapidity of |η| = 4, with the innermost two layers expected to be replaced a few years into the HL-LHC era, and the outermost layers designed to last the lifetime of the project. Most of the detector will use planar silicon sensors, with 3D sensors (which are more radiation-hard and less power-hungry) in the innermost layer. Like the CMS Phase-2 pixel upgrade, the sensors will be read out by new chips being developed by the RD53 collaboration, with support structures made of low-mass carbon materials and cooling provided by evaporative CO2 flowing in thin-walled pipes. The device will have a total of 5.1 Gpixels (55 times more than the current one), and the very high expected HL-LHC data rates, especially in the innermost layers, will require the development of new technologies for high-bandwidth transmission and handling. The ITk pixel detector is now in the final stages of R&D and moving into production. After that, the final stages of integrating the subdetectors assembled in ATLAS institutes worldwide will take place on the surface at CERN before final installation underground.
Recent measurements bolstering the longstanding tension between the experimental and theoretical values of the muon’s anomalous magnetic moment generated a buzz in the community. Though with a much lower significance, a similar puzzle may also be emerging for the anomalous magnetic moment of the electron, ae.
Depending on which of two recent independent measurements of the fine-structure constant is used in the theoretical calculation of ae – one obtained at Berkeley in 2018 or the other at Kastler–Brossel Laboratory in Paris in 2020 – the Standard Model prediction stands 2.4σ higher or 1.6σ lower than the best experimental value, respectively. Motivated by this inconsistency, the NA64 collaboration at CERN set out to investigate whether new physics – in the form of a lightweight “X boson” – might be influencing the electron’s behaviour.
The generic X boson could be a sub-GeV scalar, pseudoscalar, vector or axial-vector particle. Given experimental constraints on its decay modes involving Standard Model particles, it is presumed to decay predominantly invisibly, for example into dark-sector particles. NA64 searches for X bosons by directing 100 GeV electrons generated by the SPS onto a target, and looking for missing energy in the detector via electron–nucleus scattering e–Z → e–ZX.
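In essence, the search is an energy-balance measurement in an active beam dump: if an invisibly decaying X boson is produced, a large fraction of the 100 GeV beam energy simply vanishes from the calorimeters. The sketch below illustrates the idea only – the event structure and the 50 GeV threshold are hypothetical stand-ins, not the collaboration’s actual selection:

```python
# Illustrative missing-energy selection for an invisible-X search.
# The event fields and threshold are invented for clarity; they are
# not NA64's real data format or analysis cuts.

E_BEAM = 100.0  # incoming electron energy (GeV)

def missing_energy(event):
    """Beam energy unaccounted for by the downstream calorimeters."""
    visible = event["ecal"] + event["hcal"]  # energy seen in calorimeters (GeV)
    return E_BEAM - visible

def is_candidate(event, threshold=50.0):
    """Flag events where most of the beam energy disappears, as expected
    for e-Z -> e-ZX with the X decaying invisibly."""
    return missing_energy(event) > threshold

events = [
    {"ecal": 98.7, "hcal": 1.1},  # ordinary electromagnetic shower
    {"ecal": 22.4, "hcal": 0.6},  # candidate: ~77 GeV missing
]
print([is_candidate(e) for e in events])  # -> [False, True]
```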
The result sets new bounds on the e–X interaction strength
Analysing data collected in 2016, 2017 and 2018, corresponding to about 3 × 10¹¹ electrons-on-target, the NA64 team found no evidence for such events. The result sets new bounds on the e–X interaction strength and, as a result, on the contributions of X bosons to ae: X bosons with a mass below 1 GeV could contribute at most between one part in 10¹⁵ and one part in 10¹³, depending on the X-boson type and mass. These contributions are too small to explain the current anomaly in the electron’s anomalous magnetic moment, says NA64 spokesperson Sergei Gninenko. “But the fact that NA64 reached an experimental sensitivity that is better than the current accuracy of the direct measurements of ae, and of recent high-precision measurements of the fine-structure constant, is amazing.”
In a separate analysis, the NA64 team carried out a model-independent search for a particular pseudoscalar X boson with a mass of around 17 MeV. Coupling to electrons and decaying into e+e– pairs, the so-called “X17” has been proposed to explain an excess of e+e– pairs created during nuclear transitions of excited ⁸Be and ⁴He nuclei reported by the ATOMKI experiment in Hungary since 2015.
The e–X17 coupling strength is constrained by data: too large and the X17 would contribute too much to ae; too small and the X17 would decay too rarely and too far away from the ATOMKI target. In 2019 the NA64 team excluded a large range of couplings for a vector-like X17, though not the largest values. More recently, they searched for a pseudoscalar X17, which has a lifetime about half that of the vector version for the same coupling strength. Re-analysing a sample of approximately 8.4 × 10¹⁰ electrons-on-target collected in 2017 and 2018 with 100 and 150 GeV electrons, respectively, the collaboration has now excluded couplings in the range 2.1–3.2 × 10⁻⁴ for a 17 MeV X boson.
“We plan to further improve the sensitivity to vector and pseudoscalar X17s after Long Shutdown 2, and also try to reconstruct the mass of X17, to be sure that if we see the signal it is the ATOMKI boson,” says Gninenko.
Naples, 1938. Ettore Majorana, one of the physics geniuses of the 20th century, disappears mysteriously and never comes back. A tragedy, and a mystery that has captivated many writers.
The latest oeuvre, Nils Barrellon’s Le Neutrino de Majorana, is a French-language detective novel situated somewhere at the intersection of physics history and science outreach. Beginning with Majorana’s birth in 1906, Barrellon highlights the events that shaped and established quantum mechanics. Weaving in factual episodes and original letters, he focuses on Majorana’s personal and scholarly life, while putting a spotlight on the ragazzi di via Panisperna and other European physicists who had to face the Second World War. In parallel, a present-day neutrino physicist is found killed right at the border of France and Switzerland. Majorana’s volumetti (his unpublished research notes) become the leitmotif unifying the two stories. Barrellon compares the two eras of research by entangling the storylines to reach a dramatic climax.
Using the crime hook as the predominant storyline, the author keeps the lay reader on the edge of their seat, while playfully working in subtleties most Cernois would recognise, from cultural differences between the two bordering countries to clichés about particle physicists, via passably detailed procedures for accessing the experimental facilities – clear proof that the author (who is also a physics school teacher) has spent time on-site. The novel feels like a tailor-made detective story for the entertainment of physicists and physics enthusiasts alike.
And, at the end of the day, what explanation for Majorana’s disappearance could be more soothing than a love story?
Today, the tools of experimental particle physics are ubiquitous in hospitals and biomedical research. Particle beams damage cancer cells; high-performance computing infrastructures accelerate drug discoveries; computer simulations of how particles interact with matter are used to model the effects of radiation on biological tissues; and a diverse range of particle-physics-inspired detectors, from wire chambers to scintillating crystals to pixel detectors, all find new vocations imaging the human body.
CERN has actively pursued medical applications of its technologies since as far back as the 1970s. At that time, knowledge transfer happened – mostly serendipitously – through the initiative of individual researchers. An eminent example is Georges Charpak, a detector physicist of outstanding creativity who invented the Nobel-prize-winning multiwire proportional chamber (MWPC) at CERN in 1968. The MWPC’s ability to record millions of particle tracks per second opened a new era for particle physics (CERN Courier December 1992 p1). But Charpak strove to ensure that the technology could also be used outside the field – for example in medical imaging, where its sensitivity promised to reduce radiation doses during imaging procedures – and in 1989 he founded a company that developed an imaging technology for radiography, now deployed in orthopaedic applications. Following his example, CERN has continued to build a culture of entrepreneurship ever since.
Triangulating tumours
Since as far back as the 1950s, a stand-out application for particle-physics detector technology has been positron-emission tomography (PET) – a “functional” technique that images metabolic processes rather than anatomy. The patient is injected with a compound carrying a positron-emitting isotope, which accumulates in areas of the body with high metabolic activity (the uptake of glucose, for example, could be used to identify a malignant tumour). Pairs of back-to-back 511 keV photons are detected when a positron annihilates with an electron in the surrounding matter, allowing the tumour to be triangulated.
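The photon energy itself is fixed by basic kinematics: a positron annihilating with an electron essentially at rest converts the two rest masses into two photons, each carrying

\[ E_\gamma = m_e c^2 \approx 0.511\ \mathrm{MeV}, \]

while momentum conservation forces the photons to be emitted back to back, up to a small acollinearity from the residual momentum of the pair.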
Pioneering developments in PET instrumentation took place in the 1970s. While most scanners were based on scintillating crystals, the work done with wire chambers at the University of California at Berkeley inspired CERN physicists David Townsend and Alan Jeavons to use high-density avalanche chambers (HIDACs) – Charpak’s detector plus a photon-conversion layer. In 1977, with the participation of CERN radiobiologist Marilena Streit-Bianchi, this technology was used to create some of the first PET images, most famously of a mouse. The HIDAC detector later contributed significantly to 3D PET image reconstruction, while a prototype partial-ring tomograph developed at CERN was a forerunner for combined PET and computed tomography (CT) scanners. Townsend went on to work at the Cantonal Hospital in Geneva and then in the US, where his group helped develop the first PET/CT scanner, which combines functional and anatomic imaging.
Crystal clear
In the onion-like configuration of a collider detector, an electromagnetic calorimeter often surrounds a descendant of Charpak’s wire chambers, causing photons and electrons to cascade and measuring their energy. In 1991, to tackle the challenges posed by future detectors at the LHC, the Crystal Clear collaboration was formed to study innovative scintillating crystals suitable for electromagnetic calorimetry. Since its early years, Crystal Clear also sought to apply the technology to other fields, including healthcare. Several breast, pancreas, prostate and animal-dedicated PET scanner prototypes have since been developed, and the collaboration continues to push the limits of coincidence-time resolution for time-of-flight (TOF) PET.
In TOF–PET, the difference between the arrival times of the two back-to-back photons is recorded, allowing the location of the annihilation along the axis connecting the detection points to be pinned down. Better time resolution therefore improves image quality and reduces the acquisition time and radiation dose to the patient. Crystal Clear continues this work to this day through the development of innovative scintillating-detector concepts, including at a state-of-the-art laboratory at CERN.
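The geometry behind this gain is simple: if the two photons arrive with a time difference Δt, the annihilation point lies a distance

\[ \Delta x = \frac{c\,\Delta t}{2} \]

from the midpoint of the line of response. A coincidence-time resolution of 200 ps, for example, localises the event to within about 3 cm along the line of flight, and every improvement in timing shrinks this segment proportionally.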
The dual aims of the collaboration have led to cross-fertilisation, whereby the work done for high-energy physics spills over to medical imaging, and vice versa. For example, the avalanche photodiodes developed for the CMS electromagnetic calorimeter were adapted for the ClearPEM breast-imaging prototype, and technology developed for detecting pancreatic and prostate cancer (EndoTOFPET-US) inspired the “barrel timing layer” of crystals that will instrument the central portion of the CMS detector in the HL-LHC era.
Pixel perfect
In the same 30-year period, the family of Medipix and Timepix read-out chips has arguably made an even bigger impact on med-tech and other application fields, becoming one of CERN’s most successful technology-transfer cases. Developed with the support of four successive Medipix collaborations, involving a total of 37 research institutes, the technology is inspired by the high-resolution hybrid pixel detectors initially developed to address the challenges of particle tracking in the innermost layers of the LHC experiments. In hybrid detectors, the sensor array and the read-out chip are manufactured independently and later coupled by a bump-bonding process. This means that a variety of sensors can be connected to the Medipix and Timepix chips, according to the needs of the end user.
The first Medipix chip, produced in the 1990s by the Medipix1 collaboration, was based on the front-end architecture of the Omega3 chip used by the half-million-pixel tracker of the WA97 experiment, which studied strangeness production in lead–ion collisions, but added a counter to each pixel. This demonstrated that the chips could work like a digital camera, providing high-resolution, high-contrast and noise-hit-free images, making them uniquely suitable for medical applications. The Medipix2 collaboration improved the spatial resolution and produced a modified version, called Timepix, that offers time or amplitude measurements in addition to hit counting. Medipix3 and Timepix3 then allowed the energy of each individual photon to be measured – Medipix3 allocates incoming hits to energy bins in each pixel, providing colour X-ray images, while Timepix3 time-stamps hits with a precision of 1.6 ns and sends the full hit data – coordinate, amplitude and time – off chip. Most recently, the Medipix4 collaboration, launched in 2016, is designing chips that can seamlessly cover large areas, thanks to the possibility of tiling the chips on all four sides, and is developing new read-out architectures.
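To picture what “full hit data” means for Timepix3, it helps to think of each pixel pushing out a small, self-describing record rather than contributing to a frame. The sketch below is purely illustrative – the field names are invented, and the real chip packs this information into a compact binary word:

```python
from dataclasses import dataclass

@dataclass
class PixelHit:
    """Illustrative Timepix3-style hit record (field names invented)."""
    col: int       # pixel column in the 256 x 256 matrix
    row: int       # pixel row
    tot: int       # time over threshold: a proxy for deposited charge
    toa_ns: float  # time of arrival, quantised in ~1.6 ns steps

# Data-driven readout: each hit leaves the chip on its own,
# rather than waiting for a full-frame readout cycle.
hit = PixelHit(col=17, row=203, tot=42, toa_ns=1.5625 * 8)
print(hit)
```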
Medipix and Timepix chips find applications in widely varied fields, from medical imaging to cultural heritage, space dosimetry, materials analysis and education. The industrial partners and licence holders commercialising the technology range from established enterprises to start-up companies. In the medical field, the technology has been applied to prototype systems for digital mammography and for mammography CT imaging, and to beta- and gamma-autoradiography of biological samples. In 2018 the first 3D colour X-ray images of human extremities were taken by a scanner developed by MARS Bioimaging Ltd, using the Medipix3 technology. By analysing the spectrum recorded in each pixel, the scanner can distinguish multiple materials in a single scan, opening up a new dimension in medical X-ray imaging: with this chip, images are no longer black and white, but in colour (see “Colour X-ray” image).
Although the primary aim of the Timepix3 chip was applications outside of particle physics, its development also led directly to new solutions in high-energy physics, such as the VeloPix chip for the ongoing LHCb upgrade, which permits data-driven trigger-free operation for the first time in a pixel vertex detector in a high-rate experiment.
Dosimetry
CERN teams are also exploring the potential uses of Medipix technology in dosimetry. In 2019, for example, Timepix3 was employed to determine the exposure of medical personnel to ionising radiation in an interventional radiology theatre at Christchurch Hospital in New Zealand. The chip was able to map the radiation fluence and energy spectrum of the scattered photon field that reaches the practitioners, and can also provide information about which parts of the body are most exposed to radiation.
Meanwhile, “GEMPix” detectors are being evaluated for use in quality assurance in hadron therapy. GEMPix couples gas electron multipliers (GEMs) – a type of gaseous ionisation detector developed at CERN – with the Medipix integrated circuit as readout to provide a hybrid device capable of detecting all types of radiation with a high spatial resolution. Following initial results from tests on a carbon-ion beam performed at the National Centre for Oncological Hadrontherapy (CNAO) in Pavia, Italy, a large-area GEMPix detector with an innovative optical read-out is now being developed at CERN in collaboration with the Holst Centre in the Netherlands. A version of the GEMPix called GEMTEQ is also currently under development at CERN for use in “microdosimetry”, which studies the temporal and spatial distributions of absorbed energy in biological matter to improve the safety and effectiveness of cancer treatments.
As a publicly funded laboratory, CERN has a remit, in addition to its core mission to perform fundamental research in particle physics, to expand the opportunities for its technology and expertise to deliver tangible benefits to society. The CERN Knowledge Transfer group strives to maximise the impact of CERN technologies and know-how on society in many ways, including through the establishment of partnerships with clinical, industrial and academic actors, support to budding entrepreneurs and seed funding to CERN personnel.
Supporting the knowledge-transfer process from particle physics to medical research and the med-tech industry is a promising avenue to boost healthcare innovation and provide solutions to present and future health challenges. CERN has provided a framework for the application of its technologies to the medical domain through a dedicated strategy document approved by its Council in June 2017. CERN will continue its efforts to maximise the impact of the laboratory’s know-how and technologies on the medical sector.
Two further dosimetry applications illustrate how technologies developed for CERN’s needs have expanded into commercial medical applications. The B-RAD, a hand-held radiation survey meter designed to operate in strong magnetic fields, was developed by CERN in collaboration with the Polytechnic of Milan and is now available off-the-shelf from an Italian company. Originally conceived for radiation surveys around the LHC experiments and inside ATLAS with the magnetic field on, it has found applications in several other tasks, such as radiation measurements on permanent magnets, radiation surveys at PET-MRI scanners and at MRI-guided radiation therapy linacs. Meanwhile, the radon dose monitor (RaDoM) tackles exposure to radon, a natural radioactive gas that is the second leading cause of lung cancer after smoking. The RaDoM device directly estimates the dose by reproducing the energy deposition inside the lung instead of deriving the dose from a measurement of radon concentration in air; CERN also developed a cloud-based service to collect and analyse the data, to control the measurements and to drive mitigation measures based on real-time data. The technology is licensed to the CERN spin-off BAQ.
Cancer treatments
Having surveyed the medical applications of particle detectors, we turn to the technology driving the beams themselves. Radiotherapy is a mainstay of cancer treatment, using ionising radiation to damage the DNA of cancer cells. In most cases, a particle accelerator is used to generate a therapeutic beam. Conventional radiation therapy uses X-rays generated by a linac, and is widely available at relatively low cost.
Medipix and Timepix read-out chips have become one of CERN’s most successful technology-transfer cases
Radiotherapy with protons was first proposed by Fermilab’s founding director Robert Wilson in 1946 while he was at Berkeley, and interest in the use of heavier ions such as carbon arose soon after. While X-rays lose energy roughly exponentially as they penetrate tissue, protons and other ions deposit almost all of their energy in a sharp “Bragg” peak at the very end of their path, enabling the dose to be delivered to the tumour target while sparing the surrounding healthy tissues. Carbon ions have the additional advantage of a higher radiobiological effectiveness, and can control tumours that are radio-resistant to X-rays and protons. Widespread adoption of hadron therapy is, however, limited by the cost and complexity of the required infrastructures, and by the need for more pre-clinical and clinical studies.
PIMMS and NIMMS
Between 1996 and 2000, at the instigation of Ugo Amaldi, Meinhard Regler and Phil Bryant, CERN hosted the Proton-Ion Medical Machine Study (PIMMS). PIMMS produced and made publicly available an optimised design for a cancer-therapy synchrotron capable of using both protons and carbon ions. After further enhancement by Amaldi’s TERA foundation, and with seminal contributions from the Italian research organisation INFN, the PIMMS concept evolved into the accelerator at the heart of the CNAO hadron therapy centre in Pavia. The MedAustron centre in Wiener Neustadt, Austria, was then based on the CNAO design. CERN continues to collaborate with CNAO and MedAustron by sharing its expertise in accelerator and magnet technologies.
In the 2010s, CERN teams put to use the experience gained in the construction of Linac 4, which became the source of proton beams for the LHC in 2020, and developed an extremely compact high-frequency radio-frequency quadrupole (RFQ) to be used as injector for a new generation of high-frequency, compact linear accelerators for proton therapy. The RFQ accelerates the proton beam to 5 MeV after only 2 m, and operates at 750 MHz – almost double the frequency of conventional RFQs. A major advantage of using linacs for proton therapy is the possibility of changing the energy of the beam, and hence the depth of treatment in the body, from pulse to pulse by switching off some of the accelerating units. The RFQ technology was licensed to the CERN spin-off ADAM, now part of AVO (Advanced Oncotherapy), and is being used as an injector for a breakthrough linear proton therapy machine at the company’s UK assembly and testing centre at STFC’s Daresbury Laboratory.
In 2019 CERN launched the Next Ion Medical Machine Study (NIMMS) to develop cutting-edge accelerator technologies for a new generation of compact and cost-effective ion-therapy facilities. The goal is to propel the use of ion therapy, given that proton installations are already commercially available and that only four ion centres exist in Europe, all based on bespoke solutions.
NIMMS is organised along four different lines of activities. The first aims to reduce the footprint of facilities by developing new superconducting magnet designs with large apertures and curvatures, and for pulsed operation. The second is the design of a compact linear accelerator optimised for installation in hospitals, which includes an RFQ based on the design of the proton therapy RFQ, and a novel source for fully stripped carbon ions. The third concerns two innovative gantry designs, with the aim of reducing the size, weight and complexity of the massive magnetic structures that allow the beam to reach the patient from different angles: the SIGRUM lightweight rotational gantry originally proposed by TERA, and the GaToroid gantry invented at CERN, which eliminates the need to mechanically rotate the structure by using a toroidal magnet (see figure “GaToroid”). Finally, new high-current synchrotron designs will be developed to reduce the cost and footprint of facilities while reducing the treatment time compared to present European ion-therapy centres: these will include a superconducting and a room-temperature option, and advanced features such as multi-turn injection for 10¹⁰ particles per pulse, fast and slow extraction, and multiple-ion operation. Through NIMMS, CERN is contributing to the efforts of a flourishing European community, and a number of collaborations have already been established.
Another recent example of frontier radiotherapy techniques is the collaboration with Switzerland’s Lausanne University Hospital (CHUV) to build a new cancer therapy facility that would deliver high doses of radiation from very-high-energy electrons (VHEE) in milliseconds instead of minutes. The goal here is to exploit the so-called FLASH effect, wherein radiation doses administered over short time periods appear to damage tumours more than healthy tissue, potentially minimising harmful side-effects. This pioneering installation will be based on the high-gradient accelerator technology developed for the proposed CLIC electron–positron collider. Various research teams have been performing their biomedical research related to VHEE and FLASH at the CERN Linear Electron Accelerator for Research (CLEAR), one of the few facilities available for characterising VHEE beams.
Radioisotopes
CERN’s accelerator technology is also deployed in a completely different way to produce innovative radioisotopes for medical research. In nuclear medicine, radioisotopes are used both for internal radiotherapy and for the diagnosis of cancer and other diseases, and progress has always been connected to the availability of novel radioisotopes. Here, CERN has capitalised on the experience of its ISOLDE facility, which during the past 30 years has used the proton beam from the CERN PS Booster to produce 1300 different isotopes from 73 chemical elements for research ranging from nuclear physics to the life sciences. A new facility, called ISOLDE-MEDICIS, is entirely dedicated to the production of unconventional radioisotopes with the right properties to enhance the precision of both patient imaging and treatment. In operation since late 2017, MEDICIS will expand the range of radioisotopes available for medical research – some of which can be produced only at CERN – and send them to partner hospitals and research centres for further studies. During its 2019 and 2020 harvesting campaigns, for example, MEDICIS demonstrated the capability of purifying isotopes such as ¹⁶⁹Er or ¹⁵³Sm to new purity grades, making them suitable for innovative treatments such as targeted radioimmunotherapy.
Data handling and simulations
The expertise of particle physicists in data handling and simulation tools is also increasingly finding applications in the biomedical field. The FLUKA and Geant4 simulation toolkits, for example, are being used in several applications, from detector modelling to treatment planning. Recently, CERN contributed its know-how in large-scale computing to the BioDynaMo collaboration, initiated by CERN openlab together with Newcastle University, which initially aimed to provide a standardised, high-performance and open-source platform to support complex biological simulations (see figure “Computational neuroscience”). By hiding its computational complexity, BioDynaMo allows researchers to easily create, run and visualise 3D agent-based simulations. It is already used by academia and industry to simulate cancer growth, accelerate drug discoveries and simulate how the SARS-CoV-2 virus spreads through the population, among other applications, and is now being extended beyond biological simulations to visualise the collective behaviour of groups in society.
The expertise of particle physicists in data handling and simulation tools is increasingly finding applications in the biomedical field
Many more projects related to medical applications are in their initial phases. The breadth of knowledge and skills available at CERN was also evident during the COVID-19 pandemic, when the laboratory contributed to the efforts of the particle-physics community in fields ranging from innovative ventilators to masks and shields, from data-management tools to open-data repositories, and from a platform to model the concentration of viruses in enclosed spaces to epidemiological studies and proximity-sensing devices, such as those developed by Terabee.
Fundamental research has a priceless goal: knowledge for the sake of knowledge. The theories of relativity and quantum mechanics were considered abstract and esoteric when they were developed; a century later, we owe to them the remarkable precision of GPS systems and the transistors that are the foundation of the electronics-based world we live in. Particle-physics research acts as a trailblazer for disruptive technologies in the fields of accelerators, detectors and computing. Even though their impact is often difficult to track as it is indirect and diffused over time, these technologies have already greatly contributed to the advances of modern medicine and will continue to do so.
The ability of certain neutral mesons to oscillate between their matter and antimatter states at distinctly unworldly rates is a spectacular feature of quantum mechanics. The phenomenon arises when the states are orthogonal combinations of narrowly split mass eigenstates that gain a relative phase as the wavefunction evolves, allowing quarks and antiquarks to be interchanged at a rate that depends on the mass difference. Forbidden at tree level, proceeding instead via loops, such flavour-changing neutral-current processes offer a powerful test of the Standard Model and a sensitive probe of physics beyond it.
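In the simplest two-state picture – neglecting CP violation and the width difference between the eigenstates – a meson produced as M⁰ is found as its antiparticle M̄⁰ after a proper time t with probability

\[ P_{M^0\to\bar{M}^0}(t) = \tfrac{1}{2}\,e^{-\Gamma t}\left[1-\cos\!\left(\frac{\Delta m\,t}{\hbar}\right)\right], \]

so the oscillation frequency is set directly by the mass difference Δm between the two eigenstates, while the total width Γ damps the pattern as the mesons decay away.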
Only four known meson systems can oscillate
Predicted by Gell-Mann and Pais in the 1950s, such oscillations occur in only four known meson systems (those in which the quark and antiquark belong to different generations). K0–K̄0 oscillations were observed in 1955, B0–B̄0 oscillations in 1986 at the ARGUS experiment at DESY, and Bs0–B̄s0 oscillations in 2006 by the CDF experiment at Fermilab. Following the first evidence of charmed-meson (D0–D̄0) oscillations at Belle and BaBar in 2007, LHCb made the first single-experiment observation confirming the process in 2012. Because the oscillation is relatively slow (its period is more than 100 times the average lifetime of a D0 meson), the full oscillation period cannot be observed. Instead, the collaboration looked for small changes in the flavour mixture of the D0 mesons as a function of the time at which they decay via the Kπ final state.
On 4 June, during the 10th International Workshop on CHARM Physics, the LHCb collaboration reported the first observation of a mass difference between the mass eigenstates of the D0–D̄0 system, precisely determining the frequency of the oscillations. The value represents one of the smallest mass differences ever measured between two particles: 6.4 × 10⁻⁶ eV, corresponding to an oscillation rate of around 1.5 × 10⁹ per second. Until now, the measured value of this mass difference was marginally compatible with zero. By establishing a non-zero value with high significance, the LHCb team was able to show that the data are consistent with the Standard Model, while significantly improving limits on mixing-induced CP violation in the charm sector.
“In the future we hope to discover time-dependent CP violation in the charm system, and the precision and luminosity expected from LHCb upgrades I and II may make this possible,” explains Nathan Jurik, a CERN fellow who worked on the analysis.
The latest measurements of neutral charm–meson oscillations follow hot on the heels of an updated LHCb measurement of the Bs0–B̄s0 oscillation frequency announced in April, based on the mass difference between the heavy and light strange-beauty mesons. The very high precision of the Bs0–B̄s0 measurement provides one of the strongest constraints on physics beyond the Standard Model. Using a large sample of Bs0 → Ds–π+ decays, the new measurement improves upon the previous precision of the oscillation frequency by a factor of two: Δms = 17.7683 ± 0.0051 (stat) ± 0.0032 (sys) ps⁻¹ which, when combined with previous LHCb measurements, gives a value of 17.7656 ± 0.0057 ps⁻¹. This corresponds to an oscillation rate of around 3 × 10¹² per second, the highest of all four meson systems.
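As a cross-check, both quoted rates follow from the mass differences via f = Δm/2πħ; note that Δms is conventionally expressed as an angular frequency in inverse picoseconds, so it only needs dividing by 2π. A quick back-of-envelope calculation:

```python
import math

HBAR_EV_S = 6.582119569e-16  # reduced Planck constant (eV s)

# Charm system: mass difference quoted in eV
dm_D0 = 6.4e-6                            # eV
f_D0 = dm_D0 / (2 * math.pi * HBAR_EV_S)  # oscillations per second
print(f"D0 rate: {f_D0:.2e} /s")          # ~1.5e9 /s, as quoted

# Bs system: Delta m_s quoted directly as an angular frequency
dm_Bs = 17.7656                           # ps^-1
f_Bs = dm_Bs / (2 * math.pi) * 1e12       # convert ps^-1 to s^-1
print(f"Bs rate: {f_Bs:.2e} /s")          # ~2.8e12 /s, i.e. around 3e12
```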
Three expert panellists will introduce the motivation for and status of the proposed Future Circular Collider at CERN, followed by a discussion and live questions from the audience, moderated by CERN Courier editor Matthew Chalmers.
» Accelerator physicist and FCC study leader Michael Benedikt (CERN/Vienna University of Technology) will report on the status and scope of the FCC Innovation Study, a European Union-funded project to assess the technical and financial feasibility of a 100 km electron–positron and proton–proton collider in the Geneva region.
» Experimental particle physicist Beate Heinemann (DESY/Albert-Ludwigs-Universität Freiburg) will explain how the Higgs boson opens a new window on fundamental physics, and why a post-LHC collider is essential to explore this and other hot topics such as flavour physics.
» Theoretical physicist Matthew McCullough (CERN) will explore the potential of a future circular collider to address the dark sector of the universe, and explain the importance of striving for the highest energies possible.
Michael Benedikt (left) completed his PhD on medical accelerators as a member of the CERN Proton-Ion Medical Machine Study group. He joined CERN’s accelerator operation group in 1997, where he headed different sections before serving as deputy group leader from 2006 to 2013. From 2008 to 2013, he was project leader for the accelerator complex of the MedAustron hadron-therapy centre in Austria, and since 2013 he has led the Future Circular Collider Study at CERN.
Beate Heinemann (middle) completed her PhD at the University of Hamburg in 1999 in experimental particle physics at the HERA collider in Hamburg. She became a lecturer at the University of Liverpool in 2003, a professor at UC Berkeley in 2006 and a scientist at Lawrence Berkeley National Laboratory. She was deputy spokesperson of the ATLAS collaboration from 2013 to 2017, and since 2016 she has been a leading scientist at DESY and a W3 professor at Albert-Ludwigs-Universität Freiburg.
Matthew McCullough (right) is a senior staff member in the CERN Theory Department. He completed his undergraduate and PhD degrees at the University of Oxford, followed by postdocs at MIT and CERN. His research interests cover physics beyond the Standard Model, from the origins of the Higgs boson to the nature of dark matter.
In the coming decade, the study of nucleus–nucleus, proton–nucleus and proton–proton collisions at the LHC will offer rich opportunities for a deeper exploration of the quark–gluon plasma (QGP). An expected 10-fold increase in the number of lead–lead (Pb–Pb) collisions should both increase the precision of measurements of known probes of the QGP medium and give access to new ones. Rare probes measured down to very low transverse momentum – heavy-flavour particles, quarkonium states, real and virtual photons – together with studies of jet quenching and exotic heavy nuclear states, will require very large data samples.
To seize these opportunities, the ALICE collaboration has undertaken a major upgrade of its detectors to increase the event readout, online data processing and recording capabilities by nearly two orders of magnitude (CERN Courier January/February 2019 p25). This will allow Pb–Pb minimum-bias events to be recorded at rates in excess of 50 kHz, which is the expected Pb–Pb interaction rate at the LHC in Run 3, as well as proton–lead (p–Pb) and proton–proton (pp) collisions at rates of about 500 kHz and 1 MHz, respectively. In addition, the upgrade will improve the ability of the ALICE detector to distinguish secondary vertices of particle decays from the interaction vertex and to track very low transverse-momentum particles, allowing measurements of heavy-flavour hadrons and low-mass dileptons with unprecedented precision and down to zero transverse momentum.
These ambitious physics goals have motivated the development of an entirely new inner tracking system, ITS2. Starting from LHC Run 3 next year, the ITS2 will allow pp and Pb–Pb collisions to be read out 100 and 1000 times more quickly than was possible in previous runs, offering superior ability to measure particles at low transverse momenta (see “High impact” figure). Moreover, the inner three layers of the ITS2 feature a material budget three times lower than the original detector, which is also important for improving the tracking performance at low transverse momentum.
With its 10 m² of active silicon area and nearly 13 billion pixels, the ITS2 is the largest pixel detector ever built. It is also the first detector at the LHC to use monolithic active pixel sensors (MAPS), instead of the more conventional and well-established hybrid pixels and silicon microstrips.
Change of scale
The particle sensors and the associated read-out electronics used for vertexing and tracking detection systems in particle-physics experiments have very demanding requirements in terms of granularity, material thickness, readout speed and radiation hardness. The development of sensors based on silicon-semiconductor technology and read-out integrated circuits based on CMOS technology revolutionised the implementation of such detection systems. The development of silicon microstrips, already successfully used at the Large Electron-Positron (LEP) collider, and, later, the development of hybrid pixel detectors, enabled the construction of tracking and vertexing detectors that meet the extreme requirements – in terms of particle rates and radiation hardness – set by the LHC. As a result, silicon microstrip and pixel sensors are at the heart of the particle-tracking systems in most particle-physics experiments today.
Nevertheless, compromises exist in the implementation of this technology. Perhaps the most significant is the interface between the sensor and the readout electronics, which are typically separate components. To go beyond these limitations and construct detection systems with higher granularity and less material thickness requires the development of new technology. The optimal way to achieve this is to integrate both sensor and readout electronics to create a single detection device. This is the approach taken with CMOS active pixel sensors (APSs). Over the past 20 years, extensive R&D has been carried out on CMOS APSs, making this a viable option for vertexing and tracking detection systems in particle and nuclear physics, although their performance in terms of radiation hardness is not yet at the level of hybrid pixel detectors.
ALPIDE, which is the result of an intensive R&D effort, is the building block of the ALICE ITS2
The first large-scale application of CMOS APS technology in a collider experiment was the STAR PXL detector at Brookhaven’s Relativistic Heavy-Ion Collider in 2014 (CERN Courier October 2015 p6). The ALICE ITS2 has benefitted from significant R&D since then, in particular concerning the development of a more advanced CMOS imaging sensor, named ALPIDE, with a minimum feature size of 180 nm. This has led to a significant improvement in the field of MAPS for single-particle detection, reaching unprecedented performance in terms of signal/noise ratio, spatial resolution, material budget and readout speed.
ALPIDE sensors
ALPIDE, which is the result of an intensive R&D effort carried out by ALICE over the past eight years, is the building block of the ALICE ITS2. The chip is 15 × 30 mm² in area and contains more than half a million pixels organised in 1024 columns and 512 rows. Its very low power consumption (< 40 mW/cm²) and excellent spatial resolution (~5 μm) are perfect for the inner tracker of ALICE.
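The quoted resolution can be traced back to simple geometry. Dividing the die size by the matrix dimensions gives the approximate pixel pitch (a slight overestimate, since part of the die is taken up by periphery circuitry), and charge sharing between neighbouring pixels then improves on the naive binary resolution of pitch/√12:

```python
import math

# ALPIDE matrix: 1024 columns x 512 rows on a 30 mm x 15 mm die
pitch_x_um = 30_000 / 1024  # ~29.3 um
pitch_y_um = 15_000 / 512   # ~29.3 um
print(f"pixel pitch: ~{pitch_x_um:.1f} x {pitch_y_um:.1f} um")

# Naive binary (hit/no-hit) resolution; charge sharing between
# neighbouring pixels improves this to the measured ~5 um.
print(f"pitch/sqrt(12) = {pitch_x_um / math.sqrt(12):.1f} um")
```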
In ALPIDE the sensitive volume is a 25 μm-thick layer of high-resistivity p-type silicon (> 1 kΩ cm) grown epitaxially on top of a standard (low-resistivity) CMOS wafer (see “ALPIDE journeys” figure). The electric charge generated by particles traversing the sensitive volume is collected by an array of n–p diodes reverse-biased with a positive potential (~1 V) applied on the n-well electrode and a negative potential (down to a minimum of –6 V) applied to the substrate (backside). The possibility of varying the reverse-bias voltage in the range 1 to 7 V allows control over the size of the depleted volume (the fraction of the sensitive volume where the charge is collected by drift due to the presence of an electric field) and, correspondingly, the charge-collection time. Measurements carried out on sensors with characteristics identical to ALPIDE have shown an average charge-collection time consistently below 15 ns for a typical reverse-bias voltage of 4 V. Applying reverse substrate bias to the ALPIDE sensor also increases the tolerance to non-ionising energy loss to well beyond 10¹³ 1 MeV neq/cm², which is largely sufficient to meet ALICE’s requirements.
Another important feature of ALPIDE is the use of a p-well to shield the full CMOS circuitry from the epitaxial layer. Only the n-well collection electrode is not shielded. The deep p-well prevents all other n-wells – which contain circuitry – from collecting signal charge from the epitaxial layer, and therefore allows the use of full CMOS and consequently more complex readout circuitry in the pixel. ALICE is the first experiment where this has been used to implement a MAPS with a pixel front-end (amplifier and discriminator) and a sparsified readout within the pixel matrix similar to hybrid sensors. The low capacitance of the small collection electrode (about 2 × 2 μm²), combined with a circuit that performs sparsified readout within the matrix without a free-running clock, keeps the power consumption as low as 40 nW per pixel.
The ITS2 consists of seven layers covering a radial extension from 22 to 430 mm with respect to the beamline (see “Cylindrical structure” figure). The innermost three layers form the inner barrel (IB), while the middle two and the outermost two layers form the outer barrel (OB). The radial position of each layer was optimised to achieve the best combined performance in terms of pointing resolution, momentum resolution and tracking efficiency in the expected high track-density environment of a Pb–Pb collision. It covers a pseudo-rapidity range |η| < 1.22 for 90% of the most luminous beam interaction region, extending over a total surface of 10 m² and containing about 12.5 Gpixels with binary readout, and is operated at room temperature using water cooling.
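For readers less used to collider coordinates, pseudorapidity is a purely geometric measure of the polar angle θ with respect to the beamline,

\[ \eta = -\ln\tan\frac{\theta}{2}, \]

so the quoted acceptance |η| < 1.22 corresponds to tracks emitted more than about 33° away from the beam axis.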
Given the small size of the ALPIDE (4.5 cm²), sensors are tiled up to form the basic detector unit, called a stave. This consists of a “space-frame” (a carbon-fibre mechanical support), a “cold plate” (a carbon ply embedding two cooling pipes) and a hybrid integrated circuit (HIC) assembly in which the ALPIDE chips are glued and electrically connected to a flexible printed circuit. An IB HIC comprises one row of nine chips; an OB HIC comprises two rows of seven chips. The HICs are glued to the mechanical support: one HIC per IB stave, and 8 or 14 HICs per stave for the two innermost and two outermost layers of the OB, respectively (see “State of the art” figure).
Zero-suppressed hit data are transmitted from the staves to a system of about 200 readout boards located 7 m away from the detector. Data are transmitted serially at bit rates of up to 1.2 Gb/s over more than 3800 twin-axial cables, reaching an aggregate bandwidth of about 2 Tb/s. The readout boards aggregate the data and re-transmit them over 768 optical-fibre links to the first-level processors of the combined online/offline (O2) computing farm. The data are then sequenced in frames, each containing the hit information of the collisions occurring in contiguous time intervals of constant duration, typically 22 μs.
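Some quick arithmetic shows how these figures hang together, treating 1.2 Gb/s as a per-link ceiling rather than a sustained rate (the implied average occupancy is an inference, not a quoted number):

```python
# Back-of-envelope check of the ITS2 readout figures quoted above.
n_links = 3800       # twin-axial cables from the staves
peak_per_link = 1.2  # Gb/s, maximum serial bit rate per link

ceiling = n_links * peak_per_link / 1000  # Tb/s if every link ran flat out
print(f"ceiling: {ceiling:.1f} Tb/s")     # ~4.6 Tb/s

aggregate = 2.0  # Tb/s, the quoted aggregate bandwidth
print(f"implied average link occupancy: {aggregate / ceiling:.0%}")  # ~44%

# Frames spanning 22 us of contiguous collision data imply
print(f"frame rate: ~{1 / 22e-6:,.0f} frames per second")  # ~45,000
```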
The process and procedures to build the HICs and staves are rather complex and time-intensive. More than 10 construction sites distributed worldwide worked together to develop the assembly procedure and to build the components. More than 120 IB and 2500 OB HICs were built using a custom-made automatic module-assembly machine, implementing electrical testing, dimension measurement, integrity inspection and alignment for assembly. A total of 96 IB staves, enough to build two copies of the three IB layers, and a total of 160 OB staves, including 20% spares, have been assembled.
A large cleanroom was built at CERN for the full detector assembly and commissioning activities. Here the same backend system that will be used in the experiment was installed, including the powering system, the cooling system, and the full readout and trigger chains. Staves were installed on the mechanical support structures to form layers, the layers were assembled into top and bottom half-barrels for the IB (layers 0, 1 and 2) and the OB (layers 3, 4, 5 and 6), and each stave was connected to the power-supply and readout systems. The commissioning campaign started in May 2019 to fully characterise and calibrate all the detector components, and installation of both the OB and IB was completed in May this year.
Physics ahead
After nearly 10 years of R&D, the upgrade of the ALICE experimental apparatus – which includes an upgraded time projection chamber, a new muon forward tracker, a new fast-interaction trigger detector, a forward diffraction detector, new readout electronics and an integrated online–offline computing system – is close to completion. Most of the new or upgraded detectors, including the ITS2, have already been installed in the experimental area, and the global commissioning of the whole apparatus will be completed this year, well before the start of Run 3, scheduled for spring 2022.
The significant enhancements to the performance of the ALICE detector will enable detailed, quantitative characterisation of the high-density, high-temperature phase of strongly interacting matter, together with the exploration of new phenomena. The ITS2 is at the core of this programme. With improved pointing resolution and tracking efficiency at low transverse momentum, it will enable the determination of the total production cross-section of the charm quark. This is fundamental for understanding the interplay between the production of charm quarks in the initial hard scattering, their energy loss in the QGP and possible in-medium thermal production. Moreover, the ITS2 will make it possible to measure a larger number of different charmed and beauty hadrons, including baryons, opening the possibility of determining the heavy-flavour transport coefficients. A third area where the new ITS will have a major impact is the measurement of electron–positron pairs emitted as thermal radiation during all stages of the heavy-ion collision, which offer insight into the bulk properties and space–time evolution of the QGP.
More in store
The full potential of the ALPIDE chip underpinning the ITS2 is yet to be exploited. For example, a variant of ALPIDE explored by ALICE adds a low-dose deep n-type implant that forms a planar junction in the epitaxial layer below the wells containing the CMOS circuitry; this yields much faster charge collection and significantly improved radiation hardness, paving the way for far more radiation-tolerant sensors.
Further improvements to MAPS for high-energy physics detectors could come from exploiting the rapid progress in imaging for consumer applications. One feature recently offered by CMOS imaging-sensor technologies, called stitching, will enable a new generation of MAPS with areas up to the full wafer size. Moreover, reducing the sensor thickness to about 30–40 μm opens the door to large-area curved sensors, making it possible to build a cylindrical layer out of silicon sensors alone, with a further significant reduction in material thickness. The ALICE collaboration is already preparing a new detector based on these concepts, consisting of three cylindrical layers of curved wafer-scale stitched sensors (see “Into the future” figure). This new vertex detector will be installed during Long Shutdown 3, towards the middle of the decade, replacing the three innermost layers of the ITS2. With the first detection layer closer to the interaction point (18 mm instead of 23 mm) and the material budget close to the interaction point reduced by a factor of six, the new vertex detector will further improve the tracking precision and efficiency at low transverse momentum.
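The size of that gain can be estimated with a standard scaling argument: the multiple-scattering term of the pointing resolution grows with the radius of the first layer and with the square root of its material thickness. The sketch below applies this textbook scaling to the two numbers quoted above; it is an illustrative estimate, not an official ALICE performance projection:

```python
from math import sqrt

# The multiple-scattering term of the pointing resolution scales roughly as
#   sigma_MS ~ r_first * sqrt(x / X0) / p
# (textbook scaling; an illustration, not an ALICE projection).

r_its2_mm, r_its3_mm = 23.0, 18.0   # first-layer radius, from the text
material_reduction = 6.0            # material budget reduced by a factor of six

ratio = (r_its3_mm / r_its2_mm) / sqrt(material_reduction)
print(f"MS pointing-resolution term: x{ratio:.2f} of the ITS2 value")
# ~0.32, i.e. roughly a threefold improvement at low transverse momentum
```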
The technologies developed by ALICE for the ITS2 are now being used or considered for several other applications in high-energy physics, including the vertex detector of the sPHENIX experiment at RHIC and the inner tracking system of the NICA MPD experiment at JINR. The technology is also being applied outside the field, notably in medical and space applications. The Bergen pCT collaboration and INFN Padova’s iMPACT project, for example, are developing novel ALPIDE-based devices for clinical particle therapy that reconstruct 3D images of the human body. The HEPD02 detector for the Chinese–Italian CSES-02 mission, meanwhile, includes a charged-particle tracker made of three layers of ALPIDE sensors, a pioneering test for next-generation space missions. Though driven by the desire to learn more about the fundamental laws of nature, advanced silicon-tracker technology clearly continues to make an impact on wider society, too.
CERN technologies and personnel make it a hub for so much more than exploring the fundamental laws of the universe. In an event organised by the CERN Alumni Relations team on 30 April, five CERN alumni who now work in the environmental industry discussed how their high-energy physics training helped them to get to where they are today.
One panellist, Zofia Rudjor, used to work on the ATLAS trigger system and the measurement of Higgs-boson decays to tau leptons. Having spent 10 years at CERN, and with the discovery of the Higgs still fresh in the memory, she now works as a data scientist for the Norwegian Institute for Water Research (NIVA). “For my current role, a lot of the skills that I acquired at CERN, from solving complex problems to working with real-time data streams, turned out to be very key and useful,” she said at the virtual April event. Similar sentiments were shared by fellow panellist Manel Sanmarti, a former cryogenic engineer who is now the co-founder of Bamboo Energy Platform: “CERN is kind of the backbone of my career – it’s really excellent. I would say it’s the ‘Champions League’ of technology!”
However, much learning and preparation is also required to transition from particle physics to the environmental sector. Charlie Cook began his career as an engineer at CERN and is now the founder of Rightcharge, a company that helps electric-car drivers reduce the cost of charging and use cleaner energy sources. Before taking the plunge into the environmental industry, he first completed a course at Imperial College Business School on climate-change management and finance, which helped him “learn the lingo” of the finance world. A stint at Octopus Electric Vehicles followed, during which he drove Powerloop, a domestic vehicle-to-grid demonstration project launched at the beginning of 2018. “Sometimes it’s too easy to start talking in abstract terms about sustainability, but to really understand things I like to see the numbers behind everything,” he said.
Mario Michan, CEO of Daphne Technology (a company focused on enabling industries to decarbonise), and a former investigator of antihydrogen at CERN’s Antiproton Decelerator, also stressed the importance of being familiar with how the sector works, pointing out the large role that policymakers take in the field: “Everything that is happening in the environmental field today is all because of policymakers,” he remarked.
Another particle physicist who made the change is Giorgio Cortiana, who now works in E.ON’s global advanced-analytics and artificial-intelligence group, leading several data-science projects. His scientific background in complex physics data analysis, statistics, machine learning and object-oriented programming is ideal for extracting meaningful insights from large datasets, and for coping with everyday problems that need quick and effective solutions, he explained, noting the different mentality from academia. “At CERN you have the luxury to really focus on your research, down to the tiny details – now, I have to be a bit more pragmatic,” he said. “Here [at E.ON] we are instead looking to try and make an impact as soon as we can.”
Leaving the field
The decision to leave the familiar surroundings of high-energy physics requires perseverance, stressed Rudjor, who said it is important to pick up the phone to find out what type of position is really on offer. Other panellists noted that it is vital to spend some time considering what skills you can bring to a specific posting. “I think there are many workplaces which don’t really know how to recruit people with our skills – they would like the people, but they typically don’t open positions because they don’t know exactly how to specify the job.”
The CERN Alumni Network’s “Moving Out of Academia” events provide a rich source of candid advice for those seeking to make the change, while also demonstrating the impact of high-energy physics on broader society. The latest environment-industry events follow others dedicated to careers in finance, industrial engineering, big data, entrepreneurship and medical technologies. More are in store, explains head of CERN Alumni Relations Rachel Bray: “One of our goals is to support those in their early careers – if and when they decide to leave academia for another sector. In addition to the Moving Out of Academia events, we have recently launched a new series which brings together early-career scientists and the companies seeking the talents and skills developed at CERN.”