Sylvie Rosier-Lees left us on 14 March 2022 following a long illness, which she endured with immense courage. After her studies at the École normale supérieure de Fontenay-Saint-Cloud, Sylvie began her research career in 1985 with a thesis on the L3 experiment at LEP. Several of us – newly arrived at the Laboratoire d’Annecy de Physique des Particules (LAPP) at the time – were looking to strengthen the existing L3 team, and Sylvie was our first student. Her inquisitive mind, tenacity and ability to tackle experimental problems – in particular the calibration of the BGO crystals – quickly made her stand out within the collaboration.
Before becoming a highly regarded specialist in supersymmetry, she studied the identification of B mesons produced in Z decays, which enabled contributions to the first measurements of the B⁰–B̄⁰ mixing parameter as well as the forward–backward asymmetry. Supersymmetry and the search for the neutralino set Sylvie on the quest for dark matter, to which she dedicated the rest of her career. In 2000 she joined the Alpha Magnetic Spectrometer (AMS) collaboration – a particle-physics detector installed on the International Space Station to identify and measure fluxes of cosmic rays. She took over responsibility for the readout electronics of the electromagnetic calorimeter, introducing an independent fast trigger based solely on calorimetry. Resistance to radiation, an extended temperature range, low power consumption and operation in vacuum were all technical challenges that were met thanks to Sylvie’s scientific rigour and exceptional human qualities. Her enthusiasm and leadership motivated everyone in her team and drove its success. More than 15 years after its completion in 2005 and more than 10 years after its first signal on 19 May 2011, the calorimeter’s electronics are still smoothly providing data. With AMS, she searched for dark matter via deformations of antiparticle fluxes – for example in the positron fraction, the subject of one of the first AMS publications.
In 2005 Sylvie created a HESS group at LAPP. The Namibia-based gamma-ray observatory had entered its second phase with the construction of a fifth telescope, the largest, with a focal length of 36 m. Sylvie took up the challenge of an ambitious mechanical project to load and unload the camera – a cube 2.5 m high and weighing 2.6 tonnes – between its data-taking position 5 m above the ground and a shelter at ground level. She then explored the potential of the facility for dark-matter searches, since the size of the new telescope allowed the photon detection threshold to be lowered to 50 GeV.
Throughout her career, Sylvie maintained close ties with phenomenologists and theorists. This collaboration began at LEP within the framework of a national supersymmetry group, in which she coordinated an influential working group on the Minimal Supersymmetric Standard Model. Subsequently, with theorists, she obtained a lower bound on the mass of the lightest neutralino, a candidate for dark matter, by combining astrophysical, cosmological and collider observables. She also made notable contributions to the development of an extension of the still widely used micrOMEGAs code, enabling the calculation of the spectra of positrons and antiprotons from the annihilation of dark-matter particles in the galaxy.
Always positive with students, Sylvie supervised or co-supervised around 10 theses. These achievements earned her the 2017 Joliot-Curie Prize of the French Physical Society. Her enthusiasm, energy and good humour are sorely missed. Our thoughts are with Jean-Pierre, her husband, and her sons Edouard and Arthur.
State-of-the-art particle accelerators and detectors cannot be bought off the shelf. They come to life in workshops staffed by teams of highly skilled engineers and technicians – such as heavy-machinist Florian Hofmann from Austria, who joined CERN in October 2019.
Florian is one of several hundred engineers and technicians employed by CERN to develop, build and test equipment, and keep it in good working order. He works in the machining and maintenance workshop of the mechanical and materials engineering (MME) group, which acts as a partner to many projects and experiments at CERN. “We tightly collaborate with all CERN colleagues and we offer our production facility and knowledge to meet their needs,” he explains. “Sometimes the engineers, the project leaders or even the scientists come to see how the parts of their work come together. It is a nice and humbling experience for me because I know they have been conceiving components for a very long time. Our doors are open and you don’t need special permission – everyone can come round!”
Before joining CERN, Florian began studying atmospheric physics at the University of Innsbruck. After two semesters he realised that, even though he liked science, he preferred not to practise it, so he decided to switch to engineering and programming. After completing his studies and working in fields as diverse as automotive, tool-making and water power plants, he joined CERN. Like many of his colleagues, Florian draws on his expertise and genuine curiosity to find tailor-made solutions for CERN’s challenging projects, every one of which is different, he explains. “Years ago the job used to be a traditional mechanics job, but today the cutting-edge technologies involved make this the Formula One of production.”
Heavy metal
Florian is currently working on aluminium joints for the vacuum tank of the kicker magnets for the Proton Synchrotron, a fundamental component on which the technicians collaborate with many other groups. The workshop is also contributing to numerous important projects such as the FRESCA2 cryostat, which is visible at the entry of the workshop, and the crab cavities for the High-Luminosity LHC upgrade. The radio-frequency quadrupole for Linac4, which now drives all proton production at CERN, was built here, as was the cryostat for the ICARUS neutrino detector now taking data at Fermilab and parts of the AMS-02 detector operating on the International Space Station. In the 1960s, the workshop was responsible for the construction of the Big European Bubble Chamber, now an exhibit in the CERN Microcosm.
Today, the cutting-edge technologies involved make this the Formula One of production
Before any heavy-machinery work begins, the machining team simulates the machining process to avoid failures or technical issues during fabrication. Although the software is highly reliable, Florian and his co-workers have to stand by to control and steer the machine, modifying commands when needed and ensuring that the activity is carried out as required. Every machine has one person in charge, the so-called technical referent, but the team receives basic training on multiple machines to allow them to jump onto a different one if necessary. The job stands out for its dynamism, Florian explains. “At the MME workshop, we perform many diverse manufacturing processes needed for accelerator technologies, not only milling and turning of the machine but also welding of exotic materials, among others. The possibilities are countless.”
Florian’s enthusiasm reflects the mindset of the MME workshop team, where everyone is aware of their contributions to the broader science goals of CERN. “This is a team sport. When you join a club you need it to have good management, and I think that here, because of our supervisors and our group responsibility, you are made to feel like everyone is pushing in the same direction.” Being curious, eager to learn and open-minded are important skills for CERN technicians, he adds.
“When you come to CERN you always leave with more than you can bring, because the experience of contributing to science, to bring nations together towards a better world, is really rewarding. I think everybody needs to ask themselves what they want and what kind of world they want to live in.”
The famous “Livingston diagram”, first presented by cyclotron co-inventor Milton Stanley Livingston in 1954, depicts the rise in energy of particle accelerators as a function of time. To assess current and future facilities, however, we need complementary metrics suited to the 21st century. As the 2020 update of the European strategy for particle physics demonstrated, such metrics exist: instead of weighing up colliders solely on the basis of collision energy, they consider the capital cost or energy consumption with respect to the luminosity produced.
Applying these metrics to the LHC shows that the energy used during the upcoming Run 3 will be around three times lower than during Run 1 for similar luminosity performance (see “Greener physics” figure). The High-Luminosity LHC (HL-LHC) will operate with even greater efficiency. In fact, CERN’s accelerators have drawn a similar amount of energy for a period of 40 years despite their vastly increased scientific output: from 1 TWh for LEP2 to 1.2 TWh for the LHC and possibly 1.4 TWh at the HL-LHC.
The GWh/fb⁻¹ metric has now been adopted by CERN as a key performance indicator (KPI) for the LHC, as set out in CERN’s second environmental report, published last year. It has also been used to weigh up the performance of various Higgs factories. In 2020, for example, studies showed that an electron–positron Future Circular Collider is the most energy-efficient of all proposed Higgs factories in the energy range of interest. But this KPI is only part of a larger energy-management effort in which the whole community has an increasingly important role to play.
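The arithmetic behind this KPI is simple: energy consumed divided by integrated luminosity delivered. A minimal sketch in Python, using assumed round numbers chosen only to mirror the “around three times lower” comparison above, not official CERN figures:

```python
# The GWh/fb^-1 KPI: energy consumed divided by integrated luminosity.
# All run figures below are assumed round numbers for illustration only,
# not official CERN data.
def energy_per_luminosity(energy_gwh: float, luminosity_fb: float) -> float:
    """Return the KPI in GWh per inverse femtobarn of delivered luminosity."""
    return energy_gwh / luminosity_fb

# A Run 1-like period vs a Run 3-like period at similar integrated luminosity,
# with roughly a third of the energy use (matching the comparison above):
run1 = energy_per_luminosity(energy_gwh=1200.0, luminosity_fb=30.0)  # 40 GWh/fb^-1
run3 = energy_per_luminosity(energy_gwh=400.0, luminosity_fb=30.0)   # ~13 GWh/fb^-1
print(f"Run 1: {run1:.0f} GWh/fb^-1   Run 3: {run3:.0f} GWh/fb^-1")
```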
In 2011, with the aim of sharing best practices among scientific facilities, CERN initiated the Energy for Sustainable Science at Research Infrastructures workshop series. A few years later, prompted by the need for CERN to move from protected-tariff to market-based electricity contracts, the CERN energy-management panel was created to establish solid forecasts and robust monitoring tools. Each year since 2017 we have sent virtual “electricity bills” to all group leaders, department heads and directors, which has contributed to a change of culture in the way CERN views energy management.
Best practice
Alongside the market-based energy contract, energy suppliers have a legal duty (with tax-incentive mechanisms) to help their clients consume less. A review of energy consumption and upgrades conducted by CERN and its electricity supplier EDF in 2017 highlighted best practices for operation and refurbishment, leading to the launch of the LHC-P8 (LHCb) heat-recovery project for the new city area of Ferney-Voltaire. Similar actions were proposed for LHC-P1 (ATLAS) to boost the heating plant at CERN’s Meyrin site, and heat recovery has been considered as a design and adjudication parameter for the new Prévessin Computer Centre. Besides an attractive payback time of 5–10 years, such programmes make an important contribution to reducing CERN’s carbon footprint.
Energy efficiency and savings are an increasingly important element of each CERN accelerator infrastructure. Completed during Long Shutdown 2, the East Area renovation project led to an extraordinary 90% reduction in energy consumption, while the LHC Injectors Upgrade project also offered an opportunity to improve the injectors’ environmental credentials. Energy economy was also the primary motivation for CERN to adopt new regenerative power converters for its transfer lines. These efforts build on energy savings of up to 100 GWh per year achieved since 2010, for example by introducing free cooling and air-flow optimisation in the CERN Computer Centre, and by operating the SPS and LHC cryogenics with the minimum number of machines necessary. CERN’s buildings are also being aligned with energy-efficiency standards, with the renovation of up to two buildings per year planned over the next 10 years.
There will be no future large-scale science projects without major energy-efficiency and recovery objectives
This year a dedicated team is being put together at CERN to work towards alignment with the ISO 50001 energy-management standard, which could bring significant subsidies. A preliminary evaluation conducted in November 2021 showed that 54% of the ISO’s expectations are already in place and a further 15% are easily within reach.
The mantra of CERN’s energy-management panel is “less, better, recover”. We also have to add “credible” to this list, as there will be no future large-scale science projects without major energy-efficiency and recovery objectives. Today and in the future, we must therefore all work to ensure that every MWh of energy consumed brings demonstrable scientific advances.
Theoretical physicist Claude Bouchiat, who was born in Saint-Matré (southern France) on 16 May 1932, passed away in Paris on 25 November. He was a frequent visitor to the CERN theory group.
Bouchiat studied at the École Polytechnique in 1953–1955, and discovered theoretical high-energy physics after listening to a seminar by the late Louis Michel. In 1957, having been impressed by a conference talk given by C N Yang during a short visit to Paris, he decided to extend Michel’s results on the electron spectrum in muon decays to include the effects of parity violation. This work led to the Bouchiat–Michel formula. He then joined the theoretical physics laboratory (newly created by Maurice Lévy) at the University of Orsay where, together with Philippe Meyer, he founded a very active group in theoretical particle physics. In the 1960s, during several visits to CERN, he collaborated with Jacques Prentki. In 1974 Bouchiat and Meyer moved to Paris and established the theoretical physics laboratory at the École Normale Supérieure (ENS).
Bouchiat’s research covered a large domain that extended beyond particle physics. With Prentki and one of us (JI) he studied the leading divergences of the weak interactions, which was a precursor to the introduction of charm, and with Daniele Amati and Jean-Loup Gervais showed how to build dual diagrams satisfying the unitarity constraints. The trio also extended the anomaly equations in the divergence of the axial current to non-abelian theories. In the early 1970s, Bouchiat and collaborators used quantum field theory in the infinite momentum frame to shed light on the parton model. In 1972, with Meyer and JI, he formulated the anomaly cancellation condition for the Standard Model, establishing the vanishing sum of electric charges for quarks and leptons as essential for the mathematical consistency of the theory.
Probably his most influential contribution, carried out with his wife Marie-Anne Bouchiat, was the precise computation of parity-violation effects resulting from virtual Z-boson exchange between electrons and nuclei. They pointed out an enhancement in heavy atoms that rendered the tiny effect amenable to observation. This work opened a new domain of experimental research, starting first at ENS, which played an important role alongside the high-energy experiments at SLAC in confirming the structure of the weak neutral current. Examples of Bouchiat’s contributions outside particle physics include his studies of the elasticity properties of DNA molecules and of the geometrical phases generated by non-trivial space topology in various atomic and solid-state physics systems.
During his 60-year-long career, Claude Bouchiat had a profound influence on the development of French theoretical high-energy physics. He helped nurture generations of young theorists, and many of his former students are well-known physicists today.
With this ambitious book, the authors have produced a unique and excellent account of particle physics that goes well beyond a description of the LHC project. Its 600 pages are a very pleasant, although in places demanding, read. The book serves as a highly valuable refresher on modern concepts of particle physics, recalling theoretical ideas as well as explaining the advanced detector technologies and analysis methods that set the stage for the LHC experiments and the Higgs-boson discovery. Even though the focus converges on the Higgs boson, the full LHC project and its rich physics playground are well covered, and furthermore embedded in the broader context of particle physics and cosmology, as the subtitle indicates.
In a way it is a multi-layered book, which makes it appealing for the selective reader. Each layer is in itself of great value and highly recommendable. The overarching presentation is attractive, with great photos, nicely prepared graphics and diagrams, and a clear structure guiding readers through the many chapters. A distinctive feature is the more than 50 inserted text boxes, typically one to three pages long, which explain in a concise way the concepts used in the main text. Experts may wish to skip some of them, but they are very educational (at least as a refresher) for most readers, as they were for me. The text boxes are ideal for students and science enthusiasts of all ages, although some are more demanding than others.
To start, the authors take the reader on a substantial 170-page introduction to particle physics in general, and to the Standard Model (SM) in particular. Its theoretical ideas and their mathematical formulations, as well as its key experimental foundations, are clearly presented. The authors also explore with a broad view what the SM cannot explain. Some of the material in these introductory chapters is the most demanding in the book. The theoretical text boxes are a good opportunity for physics students to recall previously acquired mathematical notions, but they are clearly not meant for non-experts, who can readily skip them and concentrate on the very nicely documented historical accounts. A short and accessible chapter, “Back to the Big Bang”, concludes the introduction by embedding particle physics in the broader picture of cosmology.
Next, the LHC and the ATLAS and CMS experiments enter the stage. The LHC project and its history are introduced with a brief reminder of previous hadron colliders (the ISR, Sp̄pS and Tevatron). The presentation of the two general-purpose detectors comes with a short refresher on particle detection and collider experiments. Salient technical features, and collaboration aspects including some historical anecdotes, are covered for ATLAS and CMS. The book continues with the start-up of the machine, including the scary episode of the September 2008 incident, followed by the breathtaking LHC performance after the restart in November 2009 with Runs 1 and 2, until Long Shutdown 2, which began in 2019.
The story of the Higgs-boson discovery is set within a comprehensive framework of the basics of modern analysis tools and methods, a chapter again of special value for students. Ten years later, it is a pleasure to read from insiders how the discovery unfolded, illustrated with plenty of original physics plots and photographs conveying the excitement of the 4 July 2012 announcement. A detailed description of the rich physics harvest testing the Higgs sector as well as challenging the SM in general provides an up-to-date collection of results from the LHC’s first 10 years of physics operations.
A significant chapter “Quest for new physics” follows, giving the reader a good impression of the many searches hunting for physics beyond the SM. Their relations to, and motivations from, theoretical speculations and astroparticle-physics experiments are explained in an accessible and attractive way.
A book about the LHC wouldn’t be complete without an excursion into the physics and detectors of flavour and of hot and dense matter. With the dedicated experiments LHCb and ALICE, respectively, the LHC has opened exciting new frontiers in both fields. The authors cover these well in a lean chapter introducing the physics and commenting on the highlights so far.
A look ahead and a conclusion round off this impressive document about the LHC’s main mission, the search for the Higgs boson. Much more SM physics has since been extracted, as is amply documented. However, as the last chapter indicates, the journey to find directions to new physics beyond the SM must go on, first with the high-luminosity upgrades of the LHC and its experiments, and then in preparation for future colliders reaching either much higher precision on the Higgs-boson properties or higher energies to explore higher-mass particles. Current ideas for such projects that could follow the LHC are briefly introduced.
The authors are not science historians but central actors: experimental physicists fully immersed in the LHC adventure. They deliver lively first-hand and personal accounts, all while carefully respecting the historical facts. Furthermore, the book is preceded by a bonus track: the reader can enjoy an inspiring and substantial foreword by Carlo Rubbia, founding father and tireless promoter of the LHC project in the 1980s and early 1990s.
I can only enthusiastically recommend this book, which expands significantly on the French version published in 2014, to all interested in the adventure of the LHC.
“If you had to picture a scientist, what would they look like?” That is the question driving the documentary film Picture a Scientist, first released in April 2020 and screened on 10 February this year at the CERN Globe of Science and Innovation. Directed by Emmy-nominated Sharon Shattuck and Ian Cheney, whose previous productions include From This Day Forward (2016) and The Long Coast (2020), respectively, the 97-minute film tackles the difficulties faced by women in STEM careers. It is centred on the experiences of three US researchers – molecular biologist Nancy Hopkins (MIT), chemist Raychelle Burks (St. Edward’s University) and geologist Jane Willenbring (UC San Diego) – among other scientists who have faced various forms of discrimination during their careers.
Hopkins talks about the difficulties she faced as a student in the 1950s and 1960s, when the education system didn’t offer many maths and science lessons to girls, and shares an experience of sexual harassment involving a famous biologist during a lab visit. Willenbring also experienced mistreatment, including inappropriate nicknames and harassment from a colleague during a 1999 field trip in Antarctica. The film describes how these two accounts are just the tip of an iceberg of discrimination that has historically affected female scientists and is still present today. Less visible examples include being ignored in meetings, being treated as a trainee, receiving inappropriate emails and not getting proper credit for work.
Burks, who is Black, explains how the situation is even worse for women from under-represented ethnic groups, as they are even more underrepresented in science. During her childhood, she recalls, most female Black scientists were fictional, such as Star Trek’s communications officer Nyota Uhura.
Being a scientist does not rely on race or gender but only on the love for science
The film highlights the importance of female scientists speaking out to help people see beyond the tip of the iceberg and to act on it. Hopkins recounts how she once wrote a letter to the president of MIT describing systemic and often invisible discrimination, such as office space being larger for men than for women. Supported and encouraged by female colleagues, she took the matter to the dean of MIT with a request for greater equality. In another case, after receiving many reports of gender harassment, the president of Boston University ultimately dismissed the male researcher who had bullied Willenbring.
However, even though progress has been made, the film makes it clear – for example through graphs showing the considerable underrepresentation of women in science – that there is still much to do. “By its own nature science itself should be always evolving,” says Burks. We should come to picture a scientist simply as someone fascinated by research, rather than relying on a stereotype.
Videos recreating the scenes of bullying described, together with footage from old TV shows illustrating the historical mistreatment of women, complement candid accounts from those who have experienced discrimination, allowing the viewer to grasp their experiences in a powerful way. Some scenes are hard to watch, but they are necessary to understand the problem and thus to take steps to increase the recognition of women in STEM careers.
This film raises the often silenced voice of female scientists who have been discriminated against, and makes it clear that being a scientist does not rely on race or gender but only on the love for science. “If you believe that passion and ability for science is evenly distributed among the sexes, then if you don’t have women, you have lost half of the best people,” states Hopkins. “Can we really afford to lose those top scientists?”
Yulian Aramovich Budagov, a world-class experimental physicist and veteran JINR researcher, passed away on 30 December. Born in Moscow on 4 July 1932, he graduated from the Moscow Engineering Physics Institute in 1956 and joined the staff of the Joint Institute for Nuclear Research (JINR), to which his entire scientific career remained connected. He made a significant contribution to the development of large experimental facilities and achieved fundamentally important results, including: measurements of the properties of the top quark; the observation of new meson decay modes; measurements of CP-violating and rare-decay branching ratios; the determination of νN scattering form factors; the observation of QCD colour screening; the verification of the analytical properties of πp interaction amplitudes; and the observation of scaling regularities in the previously unstudied field of multiparticle production.
The exceptionally wide creative range of his activities was most prominently displayed in the preparation of experiments at TeV-scale accelerators. In 1991–1993 he initiated and directly supervised cooperation between JINR and domestic heavy-industry enterprises for the Superconducting Super Collider, and in 1994 became involved in the preparation of experiments for the Tevatron at Fermilab and for the LHC, then under construction at CERN. He fostered a culture of laser-based metrology for the precision assembly of large detectors, and led the meticulous construction of the large calorimetric complex for the ATLAS experiment. Budagov also devised a system of scintillation detectors with wavelength-shifting fibres for heavy-quark physics at the Tevatron’s CDF experiment, which helped measure the top-quark mass with a then-record accuracy. He was a leading contributor to JINR’s participation in the physics programme for the ILC, and initiated unique work on the use of explosion welding to make cryogenic modules for the proposed collider.
In his later years, Budagov focused on the development of next-generation precision laser metrology, which has promising applications such as the stabilisation of luminosity at future colliders and the prediction of earthquakes. Precision laser inclinometers developed under his supervision allowed the time dependence of angular oscillations of Earth’s surface to be measured with unprecedented accuracy in a wide frequency range, and are protected by several patents.
Yulian Aramovich Budagov successfully combined multifarious scientific and organisational activities with the training of researchers at JINR and in its member states. Sixty dissertations were defended on topics of research performed under his leadership, 23 of them prepared under his direct supervision. His research was published in major scientific journals of the Soviet Union, Russia, western Europe and the US, and in the proceedings of large international conferences. His work earned several JINR prizes, and he received medals of the highest order in Russia and beyond.
His memory will always remain in the hearts of all those who worked alongside him.
Thomas K Gaisser of the University of Delaware passed away on 20 February at the age of 81, after a short illness.
Tom was born in Evansville, Indiana, and graduated from Wabash College in 1962. He won a Marshall Scholarship that took him to the University of Bristol in the UK, where he received an MSc in 1965. He then went on to study theoretical particle physics at Brown University, receiving his PhD in 1967. After postdoctoral positions at MIT and the University of Cambridge, he joined the Bartol Research Institute in 1970, where his research interests tilted toward cosmic-ray physics.
Tom was a pioneer in gamma-ray and neutrino astronomy, and then in the emerging field of particle astrophysics. He was a master of extracting science from the indirect information collected by air-shower arrays and other particle-astrophysics experiments. Early on, he studied the extensive air showers that are created when high-energy cosmic rays reach Earth. His contributions included the Gaisser–Hillas profile of longitudinal air-shower development and the SIBYLL Monte Carlo model for simulating air showers. He laid much of the groundwork for large experiments, such as Auger and IceCube, that provide high-statistics data on the high-energy particles that reach Earth, and for how those data can be used to probe fundamental questions in particle physics.
Tom’s work was also vital in interpreting data from lower-energy neutrino experiments, such as IMB and Kamioka. He provided calculations of atmospheric neutrino production that were important in establishing neutrino oscillations and, later, for searching for neutrino phenomena beyond the Standard Model.
Tom also contributed to experimental efforts. He was a key member of the Leeds–Bartol South Pole Air Shower Experiment (SPASE), which studied air showers as well as the muons these produce in the Antarctic Muon and Neutrino Detector Array (AMANDA). The combined observations were critical for calibrating AMANDA, and were important data for understanding the cosmic-ray composition. This work evolved into a leading role for Tom in the IceCube Neutrino Observatory, where he served as spokesperson between 2007 and 2011.
In IceCube, Tom focused on the IceTop surface array. Built, like SPASE, as a calibration tool and a veto detector, its observations contributed to cosmic-ray physics over a wide and unique energy range, from 250 TeV up to EeV energies. It also made the first map of the high-energy cosmic-ray anisotropy in the Southern Hemisphere. Tom took to the task of building IceTop with gusto. For several summer seasons he travelled to Antarctica, staying for weeks at a time to work on the surface array, which consisted of frozen Auger-style water–Cherenkov detectors. He delighted in the hard physical labour and the camaraderie of everyone engaged in the project, from bulldozer drivers to his colleagues and their students. Tom became an ambassador of Antarctic science, in large part through a blog documenting his and his team’s expeditions to the South Pole.
Tom may be best known to physicists through his book Cosmic Rays and Particle Physics. Originally published in 1990, it was updated to a second edition in 2016, coauthored with Ralph Engel and Elisa Resconi. It sits on the shelves of researchers in the field around the globe.
Throughout his career, Tom received many scientific awards. He became a fellow of the American Physical Society in 1984 and was internationally recognised with the Humboldt Research Award, the O’Ceallaigh Medal and the Homi Bhabha Medal and Prize, among others. His Antarctic contributions were recognised when a feature on the continent was named Gaisser Valley.
Yes, having trained as a high-energy physics experimentalist with a focus on detector R&D, I joined ATLAS in 1998 and began working on the liquid-argon (LAr) calorimeter. I got involved in the LAr calorimeter upgrade programme when we were looking at the possible replacement of the on-detector electronics, and later served as leader of the trigger and data-acquisition upgrade project, before being elected upgrade coordinator by the ATLAS collaboration in October 2018, with a two-year mandate starting in March 2019 and a second term lasting until February 2023. Because of my new appointment to the Neutrino Platform, I will step down and enter a transition period until around October.
What are the key elements of the ATLAS upgrade?
The full Phase-II upgrade comprises seven main projects. The largest is the new inner tracker, the ITk, which will replace the entire inner detector (Pixel, SCT and TRT) with an all-silicon detector (five layers of pixels and four of strip sensors) significantly extended in the forward region to exploit the physics reach of the High-Luminosity LHC. The ITk has been the most challenging project because of its technical complexity, but also due to the pandemic. Some components, such as the silicon-strip sensors, are already in production, and we are currently steering the whole project to complete pre-production by the end of the year or early 2023. The other projects concern the LAr and scintillating-tile calorimeters, the muon spectrometer, the trigger and data acquisition, and the high-granularity timing detector. The Phase-II upgrades are equivalent in scope to half of the original detector construction, and despite the challenges ATLAS can rely on a strong and motivated community to successfully complete this ambitious programme.
What are the stand-out activities during your term?
The biggest achievement is that we were able to redefine the scope of the trigger-system upgrade. Until the end of 2020 we were planning a system based on a level-0 hardware trigger using calorimeter and muon information, followed by an event filter in which tracks were reconstructed by associative-memory-based processing units (HTT). The system had been designed to be capable of evolving into a dual-hardware-trigger system, with a level-0 trigger able to run at up to 4 MHz and the HTT system reconfigured as a level-1 track trigger to reduce the output rate to less than 1 MHz. We reduced this to a single level by removing the evolution requirements and replacing the HTT processors with commodity servers. This was a complex and difficult process that took approximately two years to reach a final decision. Let me take this opportunity to express my sincere appreciation to those colleagues who carried the development of the HTT for many years: their contribution has been essential for ATLAS, even if the system was eventually not chosen. The main challenge of the ATLAS upgrade has been, and will be, the completion of the ITk in the available timescale, even after the new schedule for Long Shutdown 3.
What led you to apply for the position of Neutrino Platform leader?
Different factors, personal and professional. From a scientific point of view, I have been interested in LAr time-projection chambers (TPCs) for neutrino physics for many years, and in the challenge of scaling the detector technology to the required sizes. Before becoming ATLAS upgrade coordinator, I had a small R&D programme at Brookhaven for developing LAr TPCs, and I worked for a couple of years in the MicroBooNE collaboration on the electronics, which had to work at LAr temperatures. So I have some synergistic work behind me. On a personal level, I’m obviously thrilled to formally become part of the CERN family. However, it was also a difficult decision to move away from ATLAS, where I have spent more than 20 years collaborating with excellent colleagues and friends.
I am still planning to be hands-on – that is the fun part
What have been the platform’s main achievements so far?
Overall I would highlight the fact that the Neutrino Platform was put together in a very short time following the 2013 European strategy update. This was made possible by the leadership of my predecessor Marzio Nessi, a true force of nature, and the constant support of the CERN management. The refurbishment of ICARUS has been a significant technical success, as has the development and construction of the huge protoDUNE modules for the two far detectors of LBNF/DUNE in the US.
What’s the status of the protoDUNE modules?
The first protoDUNE module, based on the standard horizontal-drift (“single-phase”) technology, has been successfully completed, with series production of the anode plane assemblies now starting. Lately, the CERN group has contributed significantly to the vertical-drift concept, which is the baseline technology for the second DUNE far detector. This was initially planned to adopt “dual-phase” detection but has now been adapted so that the full ionisation charge is collected in liquid argon after a long vertical drift. Recently, before I came on board, the team demonstrated the ability to drift and collect ionisation charge over a distance of 6 m, which requires the high voltage to be extremely stable and the liquid argon to be very pure so that enough charge is collected to properly reconstruct the neutrino event. There is still work to be done, but we have demonstrated that the technology is already able to meet the requirements. The first, single-phase DUNE detector has to be closed and cooled down in 2028, and the second, based on vertical drift, in 2029. For an experiment at such a scale, this is non-trivial.
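To see why stability and purity matter so much for a 6 m drift, note that the surviving ionisation charge falls exponentially with drift time, Q = Q₀ exp(−t_drift/τ), where the electron lifetime τ is set by electronegative impurities. A rough, illustrative sketch, assuming a typical textbook drift velocity of about 1.6 mm/µs rather than the actual DUNE operating point:

```python
import math

# Charge attenuation over a long vertical drift: Q = Q0 * exp(-t_drift / tau).
# The drift velocity is an assumed textbook value (~1.6 mm/us at ~500 V/cm),
# not the DUNE design figure; the electron lifetimes are likewise assumed.
DRIFT_VELOCITY_MM_PER_US = 1.6
DRIFT_LENGTH_MM = 6000.0                                  # the 6 m drift demonstrated

t_drift_us = DRIFT_LENGTH_MM / DRIFT_VELOCITY_MM_PER_US   # ~3750 us

for tau_ms in (1.0, 3.0, 10.0):                           # assumed electron lifetimes
    surviving = math.exp(-t_drift_us / (tau_ms * 1000.0))
    print(f"electron lifetime {tau_ms:4.1f} ms -> {surviving:5.1%} of charge survives")
```

Millisecond-scale electron lifetimes, corresponding to sub-ppb oxygen-equivalent impurity levels, are therefore needed before signals from the far end of the drift volume become usable.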
What else is on the agenda?
The construction of the LBNF/DUNE cryostats is a major activity. CERN has agreed to provide two cryostats, which is a large commitment. The cryostat technology has been adapted from the natural-gas industry and the R&D phase should be completed soon, while we start the process of looking for manufacturers. We are also completing a project together with European collaborators involving the upgrade of the near detector for the T2K experiment in Japan, and are supporting other neutrino experiments closer to home, such as FASER at the LHC. Another interesting project is ENUBET, which has achieved important results demonstrating superior control of neutrino fluxes for cross-section measurements.
What are the platform’s long-term prospects?
One of the reasons I was interested in this position was to help understand and shape the long-term perspective for neutrino physics at CERN. The Neutrino Platform is a kind of tool with a self-contained mandate. The question is whether and how it should or could continue beyond, say, 2027, and whether we will need the full EHN1 facility, given that we have other labs on site for smaller-scale tests of innovative detector R&D. Addressing these issues is one of my primary goals. There is also interest from Gran Sasso’s DarkSide experiment, which will use essentially the same cryostat technology as DUNE to search for dark matter. As well as taking care of the overall management and budget of the Neutrino Platform, I am still planning to be hands-on – that is the fun part.
What do you see as the biggest challenges ahead?
For the next two years the biggest challenge is the delivery of the two cryostats, which is both technical and subject to external constraints, for instance the increase in the costs of materials and other factors. From the management perspective, one has to acknowledge that the previous leadership created a fantastic team. It is relatively small but very motivated and competent, so it needs to be praised and maintained.
The ever-maturing technology of silicon photomultipliers (SiPMs) has a range of advantages over traditional photomultiplier tubes (PMTs), and SiPMs are quickly replacing PMTs in many physics experiments. The technology is already used in the LHCb SciFi tracker and is foreseen for the CMS HGCAL, as well as for detectors at proposed future colliders. For these applications the important advantages of SiPMs over PMTs are their higher photo-detection efficiency (roughly a factor of two), their lower operating voltage (30–70 V compared with kilovolts) and their small size, which allows them to be integrated into compact calorimeters. For space-based instruments – such as the POLAR-2 gamma-ray mission, which aims to use 6400 SiPM channels (see image) – a further advantage is the lack of a glass window, which gives SiPMs the mechanical robustness required during launch. There is, however, a disadvantage: dark current, which flows even when the device is not illuminated and is greatly aggravated by exposure to radiation.
In order to strengthen the community and make progress on this technological issue, a dedicated workshop was held at CERN in a hybrid format from 25 to 29 April. Organised by the University of Geneva and funded by the Swiss National Science Foundation, the event attracted around 100 experts from academia and industry. Participants included experts in silicon radiation damage from the University of Hamburg, who showed both the complexity of the problem and the need for further studies. Whereas the non-ionising energy loss (NIEL) concept used to predict radiation damage in silicon is linearly correlated with the degradation of semiconductor devices in a radiation field, this linearity appears to be violated for SiPMs. Instead, dedicated measurements of different types of SiPM in a variety of radiation fields are required to understand the types of damage and their consequences for SiPM performance. Several such measurements, performed using both proton and neutron beams, were presented at the workshop, and plans were made to coordinate such efforts in the future, for example by testing one type of SiPM at different facilities followed by identical analyses of the irradiated samples. In addition, an online platform to discuss upcoming results was established.
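For readers unfamiliar with the concept, the sketch below spells out the linear NIEL scaling that SiPMs appear to violate: any radiation field is summarised by a 1 MeV-neutron-equivalent fluence via a hardness factor, and damage observables such as the dark-current increase are assumed to scale linearly with it. The hardness factors and damage constant here are assumed placeholder values for illustration, not measured SiPM parameters:

```python
# The NIEL hypothesis in two lines: phi_eq = kappa * phi, and
# delta_I_dark = alpha * phi_eq * volume. The workshop's point is that
# this linear picture appears to break down for SiPMs.
HARDNESS_FACTORS = {"proton_24GeV": 0.6, "neutron_1MeV": 1.0}  # assumed kappas
ALPHA = 4e-17  # assumed damage constant in A/cm (placeholder value)

def dark_current_increase(particle: str, fluence_cm2: float, volume_cm3: float) -> float:
    """Expected dark-current increase (A) under linear NIEL scaling."""
    phi_eq = HARDNESS_FACTORS[particle] * fluence_cm2  # 1 MeV-n-equivalent fluence
    return ALPHA * phi_eq * volume_cm3

# e.g. 1e12 protons/cm^2 on a 1 mm^3 device -> a few tens of nA
print(f"{dark_current_increase('proton_24GeV', 1e12, 1e-3):.1e} A")
```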
The lack of a glass window gives SiPMs the mechanical robustness required during launch
Radiation damage manifests itself mainly as an increased dark current. As presented at the workshop, this increase can set off a vicious cycle: the higher current causes self-heating, which further increases the strongly temperature-dependent dark current. These issues are of great importance for future space missions, as they affect the power budget and cause the scientific performance to degrade over time. Data from the first SiPM-based in-orbit detectors – such as the SIRI mission of the US Naval Research Laboratory, the Chinese-led GECAM and GRID detectors, and the Japanese–Czech GRBAlpha payload – were presented. It is clear that although SiPMs have advantages over PMTs, radiation, which depends strongly on the satellite’s orbit, can cause a degradation in performance severe enough to limit low-Earth-orbit missions to several years in space. Based on these results, one future Moon mission has decided against the use of SiPMs and reverted to PMTs.
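The vicious cycle can be made concrete with a toy feedback model. The doubling temperature and thermal coupling below are assumed rules of thumb for illustration, not measured device properties, but they show how a modest baseline settles slightly above its nominal dark current while a higher one is amplified much more strongly, or runs away entirely:

```python
# Toy model of the dark-current/self-heating feedback loop.
# Assumptions (illustrative only): dark current doubles per ~10 C of warming,
# and each mA of dark current heats the sensor by ~1 C.
DOUBLING_TEMP_C = 10.0   # assumed doubling temperature
HEATING_C_PER_MA = 1.0   # assumed thermal coupling

def settled_dark_current(i_baseline_ma: float, steps: int = 50) -> float:
    """Iterate current -> self-heating -> higher current until it settles."""
    i = i_baseline_ma
    for _ in range(steps):
        delta_t_c = HEATING_C_PER_MA * i               # self-heating above ambient
        i = i_baseline_ma * 2 ** (delta_t_c / DOUBLING_TEMP_C)
        if i > 1e3:                                    # diverged: thermal runaway
            return float("inf")
    return i

print(f"2 mA baseline settles near {settled_dark_current(2.0):.2f} mA")
print(f"5 mA baseline settles near {settled_dark_current(5.0):.2f} mA (doubled)")
```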
Solutions to radiation damage in SiPMs were also discussed at length. These mainly involve speeding up the annealing of the damage by exposing SiPMs to elevated temperatures for short periods. In addition, cooling the SiPMs during data taking not only decreases the dark current directly, but could also reduce the radiation damage itself, although further research on this topic is required.
Overall, the workshop indicated that significant further studies are required to predict the impact of radiation damage on future experiments.