Open science has become a pillar of the policies of national and international research-funding bodies. The ambition is to increase scientific value by sharing data and transferring knowledge within and across scientific communities. To this end, in 2015 the European Union (EU) launched the European Open Science Cloud (EOSC) to support research based on open-data science.
To help European research infrastructures adapt to this future, in 2019 the domains of astrophysics, nuclear and particle physics joined efforts to create an open scientific analysis infrastructure to support the principles of data “FAIRness” (Findable, Accessible, Interoperable and Reusable) through the EU Horizon 2020 project ESCAPE (European Science Cluster of Astronomy & Particle physics ESFRI research infrastructures). The ESCAPE international consortium brings together ESFRI projects (CTA, ELT, EST, FAIR, HL-LHC, KM3NeT and SKA) and other pan-European research infrastructures (RIs) and organisations (CERN, ESO, JIVE and EGO), linking them to EOSC.
Launched in February 2019, the €16M ESCAPE project recently passed its mid-point, with less than 24 months remaining to complete the work programme. Several milestones have already been achieved, with much more in store.
Swimming in data
ESCAPE has implemented the first functioning pilot ‘Data Lake’ infrastructure, a new model for federated computing and storage designed to address the exabyte-scale data volumes expected from the next generation of RIs and experiments. The Data Lake consists of several components that work together to provide a unified namespace to users who wish to upload, download or access data. Its architecture is based on existing and proven technologies: the Rucio platform for data management; the CERN-developed File Transfer Service for data movement; and connections to the heterogeneous storage systems in use across scientific data centres. These components are deployed and integrated in a service that functions seamlessly regardless of which RI the data belong to.
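The unified-namespace idea can be illustrated with a minimal conceptual sketch (plain Python, not the actual Rucio or FTS APIs; the class, logical file names and site names are invented for illustration): a logical file name resolves to whichever storage sites currently hold a replica, with no reference to which RI owns the data.

```python
# Conceptual sketch of a federated "Data Lake" namespace: logical file
# names map to replicas spread over heterogeneous storage sites.
# Illustrative only -- the real system uses Rucio for data management
# and FTS for transfers; all names below are hypothetical.

class DataLake:
    def __init__(self):
        self.replicas = {}  # logical file name -> set of site names

    def upload(self, lfn, site):
        """Register a new replica of logical file `lfn` at `site`."""
        self.replicas.setdefault(lfn, set()).add(site)

    def replicate(self, lfn, dest):
        """Copy an existing file to another site (the role FTS plays)."""
        if lfn not in self.replicas:
            raise KeyError(f"{lfn} not in namespace")
        self.replicas[lfn].add(dest)

    def locate(self, lfn):
        """Resolve a logical name to the sites holding it, regardless
        of which RI the data belong to."""
        return sorted(self.replicas.get(lfn, set()))

lake = DataLake()
lake.upload("ska/obs-001.ms", "SARA-dCache")
lake.replicate("ska/obs-001.ms", "CERN-EOS")
print(lake.locate("ska/obs-001.ms"))  # ['CERN-EOS', 'SARA-dCache']
```

The point of the sketch is the separation of the logical namespace from physical placement: users address data by logical name only, while the infrastructure decides (and can change) where replicas live.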
ESCAPE aims to deploy an integrated open “virtual research environment”
The Data Lake is an evolution of the current Worldwide LHC Computing Grid model for the advent of the HL-LHC. For the first time, thanks to ESCAPE, it is the product of a cross-domain and cross-project collaboration, in which scientists from HL-LHC, SKA, CTA, FAIR and others co-develop and co-operate from the beginning. The first data-orchestration tests have been successfully accomplished, and the pilot phase demonstrated a robust architecture that serves the needs and use cases of the participating experiments and facilities. Monitoring and dashboard services have enabled user access and selection of datasets. A new data challenge, also including scientific data-analysis workflows in the Data Lake, is planned for later this year.
ESCAPE is also setting up a sustainable open-access repository for deployment, exposure, preservation and sharing of scientific software and services. It will house software and services for data processing and analysis, as well as test datasets of the partner ESFRI projects, and provide user-support documentation, tutorials, presentations and training.
Open software
The collaborative, open-innovation environment and training actions provided by ESCAPE have already enabled the development of original open-source software. High-performance programming methods and deep-learning approaches have been developed, benchmarked and in some cases included in the official analysis pipelines of partner RIs. The definition of data formats has been pursued, as has the harmonisation of approaches for innovative workflows. A common metadata description of the software packages, implemented across the community on the existing CodeMeta standard, together with standard guidelines (including licensing) covering the full software-development lifecycle, has been put in place to enable interoperability and re-use.
Following the lead of the HEP Software Foundation (HSF), ESCAPE’s community-based approach embraces a broad membership. Establishing a cooperative framework with the HSF will enable HSF packages to be added to the ESCAPE catalogue, and allow the two efforts to align.
From the user-access point of view, ESCAPE aims to build a prototype ‘science analysis platform’ that supports data discovery and integration, provides access to the repository, enables user-customised processing and workflows, interfaces with the underlying distributed Data Lake and links to existing infrastructures such as the Virtual Observatory. It also enables researchers’ participation in large citizen-powered research projects such as Zooniverse. Each ESFRI project customises the analysis platform for its own users on top of common lower-level services that ESCAPE is building, such as JupyterHub with a pre-defined Jupyter-notebook environment and a Kubernetes deployment application. First prototypes are under evaluation for SKA, CTA and the Vera C. Rubin Observatory.
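As a flavour of what such a common lower-level service looks like, here is a hedged sketch of a JupyterHub configuration that spawns each user’s notebook session as a Kubernetes pod, using the publicly documented KubeSpawner options. The container image name and resource limits are placeholders, not ESCAPE’s actual settings.

```python
# jupyterhub_config.py -- sketch of a Kubernetes-backed JupyterHub, the
# kind of lower-level service an ESFRI project could customise on top of.
# Image name, limits and volume template are illustrative placeholders.

c.JupyterHub.spawner_class = "kubespawner.KubeSpawner"

# Pre-defined notebook environment shipped to every user
c.KubeSpawner.image = "example.org/escape-datalab:latest"

# Per-user resource caps so a shared cluster stays responsive
c.KubeSpawner.cpu_limit = 2
c.KubeSpawner.mem_limit = "4G"

# Persistent per-user home directory backed by a Kubernetes volume claim
c.KubeSpawner.pvc_name_template = "claim-{username}"
c.KubeSpawner.storage_pvc_ensure = True
```

A project-specific platform would then layer its own authentication, data-access credentials and domain software images on top of this shared base.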
In summary, ESCAPE aims to deploy an integrated open “virtual research environment” through its services for multi-probe data research, guaranteeing and boosting scientific results while providing a mechanism to acknowledge and reward researchers who commit to open science. In this respect, together with four other thematic clusters (ENVRI-FAIR, EOSC-Life, PaNOSC and SSHOC), ESCAPE is a partner in a new EU-funded project, ‘EOSC Future’, which aims to gather the efforts of more researchers in cross-domain open-data ‘Test Science Projects’ (TSPs). TSPs are collaborative projects, including two named Dark Matter and Extreme Universe, in which data, results and potential discoveries from a wealth of astrophysics, particle-physics and nuclear-physics experiments, combined with theoretical models and interpretations, will increase our understanding of the universe. This requires the engagement of all scientific communities, as already recommended by the 2020 update of the European Strategy for Particle Physics.
Open-data science projects
In particular, the Dark Matter TSP aims to further our understanding of the nature of dark matter by performing new analyses within the experiments involved, and by collecting all the digital objects related to those analyses (data, metadata and software) on a broad open-science platform that will allow the analyses to be reproduced by the entire community wherever possible.
The Extreme Universe TSP, meanwhile, intends to develop a platform to enable multi-messenger/multi-probe astronomy (MMA). Many studies of transient astrophysical phenomena benefit from the combined use of multiple instruments at different wavelengths and with different probe types. Many of these are based on a trigger from one instrument generating follow-ups from others on timescales from seconds to days. Such observations could lead, for example, to images of the strong gravitational effects expected near a black hole. Extreme pulsing phenomena such as gamma-ray bursts, active galactic nuclei and fast radio bursts are high-energy processes that are also not yet fully understood. The intention within ESCAPE is to build such a platform for MMA science in a way that makes it sustainable.
ESCAPE is also setting up a sustainable open-access repository for deployment, exposure, preservation and sharing of scientific software and services
The idea in both of these TSPs is to exploit, for validation purposes, all the prototype services developed by ESCAPE and the uptake of its virtual research environment. At the same time the TSPs aim to promote the innovative impact of data analysis in open science, validate the reward scheme acknowledging scientists’ participation, and demonstrate the increased scientific value brought by sharing data. This approach was discussed at the JENAS 2019 workshop and will be linked to two related joint ECFA-NuPECC-APPEC actions (iDMEu and gravitational-wave probes of fundamental physics).
Halfway through, ESCAPE is clearly proving itself a powerful catalyst in making the world’s leading research infrastructures in particle physics and astronomy as open as possible. The next two years will see the consolidation of the cluster programme and the inclusion of further world-class RIs in astrophysics, nuclear and particle physics. Through the TSPs and further science projects, the ESCAPE community will continue to engage in building within EOSC the open-science virtual research environment of choice for European researchers. In the longer term, ESCAPE and the other science clusters are exploring how to evolve into sustained “Platform Infrastructures” federating large domain-based RIs. These platforms would study, define and set up a series of new focus areas around which they engage with the European Commission and national research institutes to take part in the European data strategy at large.
The Higgs boson was hypothesised to explain electroweak symmetry breaking nearly 50 years before its discovery. Its eventual discovery at the LHC took half a century of innovative accelerator and detector development, and extensive data analysis. Today, several outstanding questions in particle physics could be answered by higgsinos – theorised supersymmetric partners of an extended Higgs field. The higgsinos are a triplet of electroweak states, two neutral and one charged. If the lightest neutral state is stable, it can provide an explanation of astronomically observed dark matter. Furthermore, an intimate connection between higgsinos and the Higgs boson could explain why the mass of the Higgs boson is so much lighter than suggested by theoretical arguments. While higgsinos may not be much heavier than the Higgs boson, they would be produced more rarely and are significantly more challenging to find, especially if they are the only supersymmetric particles near the electroweak scale.
Higgsinos mix with other supersymmetric electroweak states, the wino and the bino, to form the physical particles that would be observed
The ATLAS collaboration recently released a set of results based on the full LHC Run 2 dataset that explore some of the most challenging experimental scenarios involving higgsinos. Each result tests different assumptions. Owing to quantum degeneracy, the higgsinos mix with other supersymmetric electroweak states, the wino and the bino, to form the physical particles that would be observed by the experiment. The mass difference between the lightest neutral and charged states, ∆m, depends on this mixing. Depending on the model assumptions, the phenomenology varies dramatically, requiring different analysis techniques and stimulating the development of new tools.
If ∆m is only a few hundred MeV, the small phase space suppresses the decay from the heavier states to the lightest one. The long-lived charged state flies partway through the inner tracker before decaying, and its short track can be measured. A search targeting this anomalous “disappearing track” signature was performed by exploiting novel requirements on the quality of the signal candidate and the ability of the ATLAS inner detectors to reconstruct short tracks. Finding that the number of short tracks is as expected from background processes alone, this search rules out higgsinos with lifetimes of a fraction of a nanosecond for masses up to 210 GeV.
If higgsinos mix somewhat with other supersymmetric electroweak states, they will decay promptly to the lightest stable higgsino and low-energy Standard Model particles. These soft decay products are extremely challenging to detect at the LHC, and ATLAS has performed several searches for events with two or three leptons to maximise the sensitivity to different values of ∆m. Each search features innovative optimisation and powerful discriminants to reject background. For the first time, ATLAS has performed a statistical combination of these searches, constraining higgsino masses to be larger than 150 GeV for ∆m above 2 GeV.
A final result targets higgsinos in models in which the lightest supersymmetric particle is not stable. In these scenarios, higgsinos may decay to triplets of quarks. A search designed around an adversarial neural network and employing a completely data-driven background estimation technique was developed to distinguish these rare decays from the overwhelming multi-jet background. This search is the first at the LHC to obtain sensitivity to this higgsino model, and rules out scenarios of the pair production of higgsinos with masses between 200 and 320 GeV (figure 1).
Together, these searches set significant constraints on higgsino masses, and for certain parameters provide the first extension of sensitivity since LEP. With the development of new techniques and more data to come, ATLAS will continue to seek higgsinos at higher masses, and to test other theoretical and experimental assumptions.
On 1 May, experimental astroparticle physicist Ignacio Taboada of the Georgia Institute of Technology began his two-year term as spokesperson of the IceCube collaboration. He replaces Darren Grant who has served as spokesperson for the South Pole neutrino observatory since 2019, during which time the collaboration made the first measurements of tau-neutrino appearance with IceCube DeepCore and reported the first observation of high-energy astrophysical tau neutrinos.
Taboada currently leads a research group at the Center for Relativistic Astrophysics at Georgia Tech, which has made significant contributions to IceCube by using data to search for neutrinos from transient sources, including blazar flares. Among his goals as spokesperson is to help consolidate the potential future of IceCube, IceCube-Gen2 – a proposed $350M upgrade that would increase the annual rate of cosmic neutrino observations by an order of magnitude, while increasing the sensitivity to point sources by a factor of five.
I want to make sure that everybody that is related to IceCube in one way or another feels welcome
Ignacio Taboada
“IceCube was initially conceived to study astrophysical neutrinos and to search for the sources of astrophysical neutrinos. However, the breadth of science that it can do in other areas — glaciology, cosmic rays, PeV gamma-ray sources, searches for dark matter, etc. — has allowed IceCube to produce really good scientific results for a decade or longer,” says Taboada. “Because Gen2 is standing on similar premises, I think it has a really bright future.”
Another goal is to make every IceCube member feel welcome, he explains. “There are 350 authors whose names go into papers, but I want to make sure that everybody that is related to IceCube in one way or another feels welcome within IceCube. When I joined the AMANDA collaboration, the predecessor of IceCube, in the late 1990s it was maybe 25 people. Now that it’s a gigantic enterprise, it is very easy, for example, for new PhD students to feel intimidated by professors, the analysis coordinator, the spokesperson. That’s not what I want—what I want is for everybody to feel welcome, because every single one of these people has tremendous potential to contribute to the experiment.”
Theorist Tord Riemann, who made key contributions to e+e– collider phenomenology, left us on 2 April.
Tord was born in 1951 in East Berlin, educated at the Heinrich-Hertz-Gymnasium specialist mathematics school in Berlin and studied physics at Humboldt University in Berlin from 1970. He graduated in 1977 with a doctorate devoted to studies of the lattice approach to quantum field theory. He obtained a research position in the theory group of the Institute of High Energy Physics of the Academy of Sciences of the GDR in Zeuthen (later DESY Zeuthen), and in 1983–1987 worked at JINR, then in the Soviet Union, in the group of Dmitry Bardin.
In 1989/1990 Tord visited the L3 experiment at CERN, starting a fruitful collaboration on the application of the ZFITTER project at the Large Electron–Positron (LEP) collider. In 1991–1992 he was a research associate in the CERN theory division, working out the so-called S-matrix approach to the Z resonance. This was a profound contribution to the field, and a breakthrough for the interpretation of LEP data. Tord was one of the first to realise the great potential of a new e+e– “Tera-Z” factory at the proposed Future Circular Collider, FCC-ee, and led the charge in reviving precision calculations for it.
Tord’s scientific fields of interest were broad
Tord’s scientific fields of interest were broad, and aimed at predicting observables measured at accelerators. His research topics included linear-collider physics; Higgs, WW, ZZ, 2f and 4f production in e+e– scattering; physics at LEP and FCC-ee; methods in the calculation of multi-loop massive Feynman integrals; NNLO Bhabha scattering in QED; higher-order corrections in the electroweak Standard Model and some extensions; and electroweak corrections for deep inelastic scattering at HERA. Apart from ZFITTER, he co-authored several programs, including topfit, GENTLE/4fan, HECTOR, SMATASY, TERAD91, DISEPNC, DISEPCC, DIZET, polHeCTOR and AMBRE.
While remaining an active research scientist throughout his career, Tord will also be warmly remembered as a great mentor to many of us. He was a thesis advisor for two diploma and seven PhD students, and was actively engaged in supporting many postdoctoral researchers. He was co-founder and organiser of the biennial workshop series Loops and Legs in Quantum Field Theory and of the biennial DESY school Computer Algebra and Particle Physics.
In 2000, Tord and the ZFITTER collaboration were awarded the First Prize of JINR, and in 2014 the article “The ZFITTER Project” was awarded the JINR prize for the best publication of the year in Physics of Elementary Particles and Nuclei. In 2015 Tord was awarded an Alexander von Humboldt Polish Honorary Research Fellowship.
Tord Riemann cared about high standards in scientific research, including ethical issues. He was a true professional of the field. Despite illness, he continued working until his last day.
Tord was an outstanding scientist, a just person of great honesty, a reliable friend, colleague and family man. We feel a great loss, personally and as a scientific community, and remain thankful for his insights, dedication and all the precious moments we have shared.
After many years of research and development, the ALPHA collaboration has succeeded in laser-cooling antihydrogen – opening the door to considerably more precise measurements of antihydrogen’s internal structure and gravitational interactions. The seminal result, reported on 31 March in Nature, could also lead to the creation of antimatter molecules and the development of antiatom interferometry, explains ALPHA spokesperson Jeffrey Hangst. “This is by far the most difficult experiment we have ever done,” he says. “We’re over the moon. About a decade ago, laser cooling of antimatter was in the realm of science fiction.”
The ALPHA collaboration synthesises antihydrogen from cryogenic plasmas of antiprotons and positrons at CERN’s Antiproton Decelerator (AD), storing the antiatoms in a magnetic trap. Lasers with particular frequencies are then used to measure the antiatoms’ spectral response. Finding any slight difference between spectral transitions in antimatter and matter would challenge charge–parity–time symmetry, and perhaps cast light on the cosmological imbalance of matter and antimatter.
Historically, researchers have struggled to laser-cool normal hydrogen, so this has been a bit of a crazy dream for us for many years.
Makoto Fujiwara
Following the first antihydrogen spectroscopy by ALPHA in 2012, in 2017 the collaboration measured the spectral structure of the antihydrogen 1S–2S transition with an outstanding precision of 2 × 10⁻¹² – marking a milestone in the AD’s scientific programme. The following year, the team determined antihydrogen’s 1S–2P “Lyman-alpha” transition with a precision of a few parts in a hundred million, showing that it agrees with the prediction for the equivalent transition in hydrogen to a precision of 5 × 10⁻⁸. However, to push the precision of spectroscopic measurements further, and to allow future measurements of the behaviour of antihydrogen in Earth’s gravitational field, the kinetic energy of the antiatoms must be lowered.
In their new study, the ALPHA researchers were able to laser-cool a sample of magnetically trapped antihydrogen atoms by repeatedly driving the antiatoms from the 1S to the 2P state using a pulsed laser with a frequency slightly below that of the transition between them. After illuminating the trapped antiatoms for several hours, the researchers observed a more than 10-fold decrease in their median kinetic energy, with many of the antiatoms attaining energies below 1 μeV. Subsequent spectroscopic measurements of the 1S–2S transition revealed that the cooling resulted in a spectral line about four times narrower than that observed without laser cooling – a proof-of-principle of the laser-cooling technique, with further statistics needed to improve the precision of the previous 1S–2S measurement (see figure).
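The mechanism at work is standard Doppler cooling; as a hedged textbook sketch (generic relations, not ALPHA’s specific parameters): because the pulsed laser is tuned slightly below the 1S–2P resonance, an antiatom moving towards the light sees it Doppler-shifted onto resonance and preferentially absorbs photons that oppose its motion.

```latex
% Momentum removed per absorbed photon of wavevector k (opposing the motion):
\Delta p = \hbar k
% Spontaneous re-emission is isotropic, so its recoil averages to zero and
% repeated absorption--emission cycles drain kinetic energy, down towards
% the Doppler limit set by the natural linewidth \Gamma of the transition:
k_B T_D \simeq \frac{\hbar \Gamma}{2}
```

For hydrogen’s broad 1S–2P (Lyman-alpha) line this limit is of order a millikelvin, consistent with the sub-μeV energies quoted above (1 μeV corresponds to roughly 12 mK).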
“Historically, researchers have struggled to laser-cool normal hydrogen, so this has been a bit of a crazy dream for us for many years,” says Makoto Fujiwara, who proposed the use of a pulsed laser to cool trapped antihydrogen in ALPHA. “Now, we can dream of even crazier things with antimatter.”
CERN’s international relationships are central to its work, and a perfect example of nations coming together for the purpose of peaceful research, regardless of external politics. Having worked in China during the 1980s and in the Soviet Union/Russia during the early 1990s, physicist Paul Lecoq has enjoyed a long career that is testament to CERN’s influence and standing.
Originally interested in astrophysics, Lecoq completed a PhD in nuclear physics in Montreal in 1972. After finishing his military service, during which he taught nuclear physics at the French Navy School, he came across an advertisement for a fellowship position at CERN. It was the start of a 47-year-long journey with the organisation. “I thought, why not?” Lecoq recalls. “CERN was not my initial target, but I thought it would be a very good place to go. Also, I liked skiing and mountains.”
Royal treatment
During his third year as a fellow, a staff position opened for the upcoming European Hybrid Spectrometer (EHS), which would test CERN’s potential for collaboration beyond its core member states. “The idea was to make a complex multi-detector system, which would be a multi-institute collaboration, with each institute having the responsibility to build one detector,” says Lecoq. One of these institutes was based in Japan, allowing the exchange of personnel. Lecoq was one of the first to benefit from this agreement and, thanks to CERN’s already substantial image, he was very well-received. “At the time, people were travelling much less than now, and Japan was more isolated. I was welcomed by the president of the university and had a very nice reception almost every day.” It was an early sign of things to come for Lecoq.
During the lifetime of the EHS, a “supergroup” of CERN staff was formed whose main role was to support partners across the world while also building part of the experiment. By the time the Large Electron–Positron Collider (LEP) came to fruition it was clear that it would also benefit from this successful approach. At that time, Sam Ting had been asked to propose an experiment for LEP by then Director-General Herwig Schopper, which would become the L3 experiment, and with the EHS coming to an end, says Lecoq, it was natural that the EHS supergroup was transferred to Ting. Through friends working in material science, Lecoq caught wind of the new scintillator crystal, bismuth germanate (BGO), that was being proposed for L3 – an idea that would see him link up with Ting and spend much of the next few years in China.
BGO crystals had not yet been used in particle physics, and had only existed in a few small samples, but L3 needed more than 1 m3 of coverage. After sampling and testing the first crystal samples, Lecoq presented his findings at an L3 collaboration meeting. “At the end of the meeting, Ting pointed his finger in my direction and asked if I was free on Saturday. I responded, ‘yes sir’. Then he turned to his secretary and said, ‘book a flight ticket to Shanghai – this guy is coming with me!’”
This is something unique about CERN, where you can meet fantastic people that can completely change your life
Unknown to Lecoq before his arrival in China, Ting had already laid the groundwork to develop the technology for the mass production of BGO crystals there, and wanted Lecoq to oversee this production. BGO was soon recognised as a crystal that could be produced in large quantities in a reliable and cost-effective way, and it has since been used in a generation of PET scanners. Lecoq was impressed by the authority Ting held in China. “The second day we were in China, we, well Ting, had been invited by the mayor of Shanghai for a dinner to discuss the opportunity for the experiment.” The mayor was Jiang Zemin, who only a few years later became China’s president. “I have been very lucky to have several opportunities like this in my career. This is something unique about CERN, where you can meet fantastic people that can completely change your life. It was also an interesting period when China was slowly opening up to the world – on my first trip everyone was in Mao suits, and in the next three to five years I could see a tremendous change that was so impressive.”
Lecoq’s journeyman career did not stop there. With LEP finishing towards the turn of the millennium and LHC preparations in full swing, his expertise was needed for the production of lead tungstate (PWO) crystals for CMS’s electromagnetic calorimeter. This time, however, Russia was the base of operations, and the 1.2 m3 of BGO crystal for L3 became more than 10 m3 of PWO for CMS. As with his spell in China, Lecoq was in Russia during a politically uncertain time, with his arrival shortly following the fall of the Berlin Wall. “There was no system anymore. But there was still very strong intellectual activity, with scientists at an incredible level, and there was still a lot of production infrastructure for military interest.”
It was interesting not only at the scientific level, but on a human level too
At the time, lithium niobate, a crystal very similar to PWO, was being exploited for radar communication and missile guidance, says Lecoq, and the country had a valuable (but unknown to the public) production infrastructure in place. With the disarray at the end of the Cold War, the European Commission set up a system, along with Canada, Japan and the US, called the International Science and Technology Center (ISTC), whose role was to transfer the Soviet Union’s military industry into civil application. Lecoq was able to meet with the ISTC and gain €7 million in funding to support PWO crystal production for CMS. Again, he stresses, this only happened due to the stature of CERN. “I could not have done that if I had been working only as a French scientist. CERN has the diplomatic contact with the European Commission and different governments, and that made it a lot easier.” Lecoq was responsible for choosing where the crystal production would take place. “These top-level scientists working in the military areas felt isolated, especially in a country that was in a period of collapse, so they were more than happy not only to have an opportunity to do their job under better conditions, but also to have the contacts. It was interesting not only at the scientific level, but on a human level too.”
Crystal clear
Back at CERN, Lecoq realised that introducing a new scintillating crystal, optimising its performance for the harsh operating conditions of the LHC, and developing mass-production technologies to produce large amounts of crystal in a reliable and cost-effective way, was a formidable challenge that could not be dealt with by particle physicists alone. Therefore, in 1991, he decided to establish the Crystal Clear multidisciplinary collaboration, gathering experts in material science, crystal growth, luminescence, solid-state physics and beyond. Here again, he says, the attractiveness of CERN as an internationally recognised research centre was a great help in convincing institutes all over the world, some not connected to particle physics at all, to join the collaboration. Crystal Clear is still running today, celebrating its 30th anniversary.
Through developing international connections in unexpected places, Lecoq’s career has helped build sustained links between CERN and some of the world’s largest and most scientifically fruitful nations. Now retired, he is a distinguished professor at the Polytechnic University of Valencia, where he has set up a public–private partnership laboratory for metamaterial-based scintillators and photodetectors, to aid a new generation of ionising-radiation detectors for medical imaging and other applications. Even now, he is able to flex the muscles of the CERN model by keeping in close contact with the organisation.
“My career at CERN has been extremely rich. I have changed so much in the countries I’ve worked with and the scientific aspect, too. It could only have been possible at CERN.”
How did you get into accelerator physics?
Somewhat accidentally, because I didn’t even know that being a researcher in physics was a thing you could be until my second year of university. It was around then that I realised that someone like me could ask questions that didn’t have answers. That hooked my interest. My first project was in nuclear physics, and it involved using a particle accelerator for an experiment. I then attended the CERN summer student programme, working on ATLAS, which was my first proper exposure to the technology of particle physics. When it came time to do my PhD in around 2006, I had the choice to either stay in Melbourne to do particle physics, or go to Oxford, which had a strong accelerator programme. When I learned they were designing accelerators for cancer treatment, it blew my mind! I took the leap and decided to move to the other side of the world.
What did you do as a postdoc?
I was lucky enough to get an 1851 Royal Commission Fellowship, which allowed me to start an independent research programme. It was a bit of a baptism of fire, as I had been working on medical machines but then moved to high-intensity proton accelerators. I was looking at fixed-field alternating-gradient accelerators and their application to things like accelerator-driven reactors. After a while I found myself spending a lot of time building sophisticated simulations, and was getting a bit bored of computing. So I started a couple of collaborations with some teams in Japan – one of which was using ion traps to mimic the dynamics of particle beams at very high intensity. What I find really interesting is how beams behave at a fundamental level, and I am currently working on upgrading a small experiment called IBEX to test a new type of optics called nonlinear integrable optics, which is a focus of Fermilab at the moment.
And now you’re back in the medical arena?
Yes – a few years ago I started working with people from CERN and the UK on compact medical accelerators for low- and middle-income countries. Then in 2019 I felt the pull to return to Australia to grow accelerator physics there. Australia has accelerators and facilities but lacked a strong academic accelerator community, so I am building up a group at Melbourne University that has a medical-applications focus, but also looks at other areas. After 20 years of pushing for a proton-therapy centre here, the first one is now being built.
How and when did your career in science communication take off?
I was doing things like stage shows for primary-school children when I was a first-year undergraduate. I have always seen it as part of the process of being a scientist. Before my PhD I worked in a science museum and, while at Oxford, I started an outreach programme called Accelerate! that took live science shows to some 30,000 students in its first two years and is still running. From there, it sort of branched out. I did more public lectures, but also a little bit of TV, radio and some writing.
Any advice for physicists who want to get into communication?
You need to build a portfolio, and demonstrate that you have a range of different styles and delivery modes, and that you can use language that people understand. The other thing that really helped me was working with professional organisations such as the Royal Institution in London. It does take a lot of time to do both your research and academic job well, and also do the communication well. A lot of my communication is about my research field – so luckily they enrich each other. I think my communication has the potential to have a much bigger societal impact than my research, so I am very serious about it. The first time someone pointed a video camera at me I was terrified. Now I can say what I want to say. We shouldn’t underestimate how much the public wants to hear from real working scientists, so keeping a very strong research base keeps my authenticity and credibility.
What is your work/life balance like?
I am not a fan of the term “work/life balance” as it tends to imply that one is necessarily in conflict with the other. I think it’s important to set up a kind of work/life integration that supports well-being while allowing you to do the work you want to do. When I was invited back to Melbourne to build an accelerator group, I’d just started a new research group in Oxford. I stepped down my teaching and we agreed that I would take periods of sabbatical to spend time in Melbourne until I finished my experiment. I have been so incredibly grateful to everyone on both sides for their understanding. Key to that has been learning how other people’s expectations affect you and finding a way to filter them out and drive your own goals. Working in two completely different time zones, it would be easy to work ridiculously long days, so I have had to learn to protect my health. The hardest thing, and I think a lot of early/mid-career researchers will relate to this, is that academia is an infinite job: you will never do enough for someone to tell you that you have done enough. The pressure always feels like it’s increasing, especially when you are a post-doc or on tenure track, or in the process of establishing a new group or lab. You have to learn how to take care of your mental health and well-being so that you don’t burn out. With everything else that’s going on in the world right now, this is even more important.
You are active in trying to raise the profile of women in physics. What does this involve on a practical level?
There has been a lot of focus for many years on getting more women into subjects like physics. My view is that whenever I meet young people they’re interested already. In many countries the gender balance at undergraduate level is similar. So what’s happening instead is that we are pushing women and minorities out. My focus, within my sphere of influence, is to make sure that the culture that I am perpetuating and the values that I hold within my research groups are well defined and communicated.
I kind of pulled back from active engagement in panel sessions and things like that a number of years ago, because I realised that the most important way I can contribute is by being the best scientist that I can be. The fact that I happen to have a public profile is great in that it makes people aware that people like me exist. One of the things that has helped me the most is to build a really great community of peers of other women in physics. I think for the first seven or eight years of my career, when imposter syndrome was strong and I questioned if I fitted in, I realised that I didn’t have a single direct female colleague. With most people in my field being men, it’s likely that when choosing a speaker, for example, the first person we think of is male. Taking time to be well-networked with women in the field is incredibly important in that regard. Today, I find that creating the right environment means that people will seek out my research group because they hear it’s a nice place to be. Students today are much savvier with this stuff – they can tell toxic professors a mile away. I am trying to show them that there is a way of doing research that doesn’t involve the horrible sides to it. Research is hard enough already, so why make it harder?
Tell us about your debut book The Matter of Everything?
It’s published by Bloomsbury (UK/Commonwealth) and Knopf (US) and is due out in early 2022. Its subtitle is “The 12 experiments that made the modern world”, starting with the cathode-ray tube and going all the way through to the LHC and what might come next. It’s told from the perspective of an experimental physicist. What isn’t always captured in popular physics books is how science is actually done, but it’s very human to feel like you’re failing in the lab. I also delve into what first interested me in accelerators, specifically the things that have emerged unexpectedly from these research areas. People think that Apple invented everything in the iPhone, but if it wasn’t for curiosity-driven physics experiments then it wouldn’t be possible. On a personal note, as I went through these stories in the field, often in the biographies and the acknowledgments, I would end up going down these rabbit holes of women whose careers were cut short because they got married and had to quit their job. It’s been lovely to have the opportunity to learn that these women were there, and it wasn’t just white men.
Do you have a preference as to which collider should come next after the LHC?
I think it should be one of the linear ones. The size of future circular colliders and the timescales involved are quite overwhelming, and you have to wonder if the politics might change throughout the project. A linear machine such as the ILC is more ready to go, if the money and will were there. But I also think there is value in the diversity of the technology. The scaling of SLAC’s linear electron machine, for example, really pushed the industrialisation of that accelerator technology – which is part of the reason why we have 3 GHz electron accelerators now in every hospital. There will be other implications to what we build, other than physics results – even though the decisions will be made on the physics.
What do you say to students considering a career in particle physics?
I will answer that from the perspective of the accelerator field, which is very exciting. If you look historically, new technologies have always driven new discoveries. The accelerator field is going through an interesting “technology discovery phase”, for example with laser-driven plasma accelerators, so there will be huge changes to what we are doing in 10–15 years’ time that could blow the decisions surrounding future colliders out of the water. This happened in the 1960s in the era of proton accelerators, where suddenly there was a new technology and it meant you could build machines with a much higher energy with smaller magnets, and suddenly the people who took that risk were the ones who ended up pushing the field forward. I sometimes feel experimental and theoretical physicists are slightly disconnected from what’s going on with accelerator physics now. When making future decisions, people should attend accelerator conferences…it may influence their choices.
The Skobeltsyn Institute of Nuclear Physics (SINP) was established at Lomonosov Moscow State University (MSU) on 1 February 1946, in pursuance of a decree of the government of the USSR. SINP MSU was created as a new type of institute, in which the principles of integrating higher education and fundamental science were prioritised. Its initiator and first director was Soviet physicist Dmitri Vladimirovich Skobeltsyn, who was known for his pioneering use of the cloud chamber to study the Compton effect in 1923 – aiding the discovery of the positron less than a decade later.
It is no coincidence that SINP MSU was established in the immediate aftermath of the Second World War, following the first use of nuclear weapons in conflict. The institute was created on the basis that it would train personnel who would specialise in nuclear science and technology, after the country realised that there was a shortage of specialists in the field. Thanks to strong leadership from Skobeltsyn and one of his former pupils, Sergei Nikolaevich Vernov, SINP MSU quickly gained recognition in the country. As early as 1949, the government designated it a leading research institute. By this time a 72 cm cyclotron was already in use, the first to be used in a higher education institute in the USSR.
Skobeltsyn and Vernov continued with their high ambitions as they expanded the facility to the Lenin Hills, along with other scientific departments in MSU. Proposed in 1949 and opened in 1953, the new building in Moscow was granted approval to build a set of accelerators and a special installation for studying extensive air showers (EASs). The first accelerator built there was a 120 cm cyclotron, and its first outstanding scientific achievement was the discovery by A F Tulinov of the so-called “shadow effect” in nuclear reactions on single crystals, which makes it possible to study nuclear reactions at ultra-short time intervals. Significant scientific successes were associated with the commissioning of a unique installation, the EAS-MSU, at the end of the 1950s for the study of ultra-high-energy cosmic rays. Several results were obtained through a new method for studying EASs in the region of 1015–1017 eV, leading to the discovery of the famous “knee” in the energy spectrum of primary cosmic rays.
The space race
1949 marked SINP MSU’s entrance into astrophysics and, in particular, satellite technology. The USSR’s launch of Sputnik 1, Earth’s first artificial satellite, in 1957 gave Vernov, an enthusiastic experimentalist who had previously researched cosmic rays in the Earth’s stratosphere, the opportunity to study outer-atmosphere cosmic rays. This led to the installation of a Geiger counter on the Sputnik 2 satellite and a scintillation counter on Sputnik 3, to enable radiation experiments. Vernov’s experiments on Sputnik 2 enabled the first detection of the outer radiation belt. However, this was not confirmed until 1958 by the US’s Explorer 1, which carried an instrument designed and built by James Van Allen. Sputnik 3 confirmed the existence of an inner radiation belt, using data received from stations in Australia and South America, as well as from sea-based stations.
Vernov, who was Skobeltsyn’s successor as SINP director in 1960–1982, later worked on the “Electron” and “Proton” series of satellites, which studied the radiation-belt structure, energy spectra and temporal variations associated with geomagnetic activity. This led to pioneering results on the spectrum and composition of galactic cosmic rays, and to the first model of radiation distribution in near-Earth space in the USSR.
SINP MSU has carried on Vernov’s cosmic legacy by continuing to develop equipment for satellites. Since 2005 the institute has developed its own space programme through the university satellites Tatiana-Universitetsky and Tatiana-2, as well as the Vernov satellite. These satellites led to the discovery of new phenomena such as ultraviolet flashes from the atmosphere. In 2016 a tracking system for ultraviolet rays was installed on board the Lomonosov satellite (see “Vernov’s legacy” image), developed at SINP MSU under the guidance of former director Mikhail Igorevich Panasyuk. This allowed fluorescence light radiated by EASs of ultra-high-energy cosmic rays to be measured for the first time, and enabled prompt-emission observations of multi-wavelength gamma-ray bursts. The leading role in the entire Lomonosov satellite mission belongs to the current rector of MSU, Victor Sadovnichy.
High-energy exploration
In 1968, under strong endorsement by Vernov and the director of a new Russian accelerator centre in Protvino, Anatoly Alekseyevich Logunov (who went on to be MSU rector from 1977 to 1991), a department of high-energy physics was established under the leadership of V G Shevchenko at SINP MSU, and the following year it was decided that a high-energy laboratory would be established at MSU. Throughout the years to follow, collaborations with laboratories in the USSR and across the world, including CERN, Fermilab, DESY and the Joint Institute for Nuclear Research (JINR), led the department to the forefront of the field.
At the end of the 1970s a centre was created at SINP MSU for bubble-chamber film analysis. At the time it was one of the largest automated complexes for processing and analysing information from large tracking detectors in the country. In collaboration with other institutes worldwide, staff at the institute studied soft hadronic processes in the energy range 12–350 GeV at a number of large facilities, including the Mirabelle Hydrogen Bubble Chamber and European Hybrid Spectrometer.
Extensive and unique experimental data have been obtained on the characteristics of multiple hadron production, including fragmentation distributions. Throughout the years, exclusive reaction channels, angular and momentum correlations of secondary particles, resonance production processes and annihilation processes were also investigated. These results have made it possible to reliably test the predictions of phenomenological models, including the dual-parton model and the quark–gluon string model, based on the fundamental theoretical scheme of dual-topological unitarisation.
For the first time in Russia, an integrated system has now been created – together with a number of scientific and technical enterprises, with SINP MSU playing the leading role – for the development, design, mass production and testing of large solid-state silicon and microstrip detectors. On this basis, at the turn of the millennium a hadron–electron separator was built for the ZEUS experiment at HERA, DESY.
The institute delved into theoretical studies in 1983, with the establishment of the laboratory of symbolic computations in high-energy physics and, in 1990, the department of theoretical high-energy physics. One of its most striking achievements was the creation of the CompHEP software package, which has received global recognition for its ability to automate calculations of collisions between elementary particles and their decays within the framework of gauge theories. This is freely available and allows physicists (even those with little computer experience) to calculate cross sections and construct various distributions for collision processes within the Standard Model and its extensions. Members of the department later went on to make a significant contribution to the creation of a Tier-2 Grid computer segment in Russia for processing and storing data from the LHC detectors.
Over the past 35 years, research in the field of particle accelerators at SINP MSU has moved from the development of large accelerator complexes for fundamental research to the creation and production of applied accelerators for security systems, industry and medicine.
Teaching legacy
Throughout its 75 years, SINP MSU has also nurtured thousands of students. In 1961 a new branch of SINP MSU, the department for nuclear research, was established in Dubna. It became the basis for training students from the MSU physics faculty in nuclear physics using the capabilities of the largest international scientific centre in Russia – JINR. The department, which is still going strong today, teaches with a hands-on approach, with students attending lectures by leading JINR scientists and taking part in practical training held at the JINR laboratories.
The institute is currently participating in the upgrade of the LHC detectors (CMS, ATLAS, LHCb) for the HL-LHC project, as well as in projects within the Physics Beyond Colliders initiative (e.g. NA64, SHiP). These actions are under the umbrella of a 2019 cooperation agreement between CERN and Russia concerning high-energy physics and other domains of mutual interest. Looking even further ahead, SINP MSU scientists are also working on the development of research programmes for future collider projects such as the FCC, CLIC and ILC. Furthermore, the institute is involved in the upcoming NICA Complex in Russia, which plans to finish construction in 2022.
After 75 years, the institute is still as relevant as ever, and whatever the next chapter of particle physics will be, SINP MSU will be involved.
Imagine standing in the LHC tunnel when the machine is operating. Proton beams are circulating around the 27 km ring more than 11,000 times per second, colliding at four points to generate showers of particles that are recorded by ATLAS, CMS, ALICE, LHCb and other detectors. After a few hours of operation, the colliding beams need to be disposed of to allow a new physics fill. Operators in the CERN control centre instruct beam-transfer equipment to shunt the circulating beams into external trajectories that transport them away from the cryogenic superconducting magnets. Each beam exits the ring and travels for 600 metres in a straight line before reaching a compact cavern housing a large steel cylinder roughly 9 m long, 70 cm in diameter and containing about 4.4 tonnes of graphitic material. Huge forces are generated in the impact. If you could witness the event up close, you would hear a massive “bang” – like a bell – generated by the sudden expansion and successive contraction of the steel shell.
What you will have witnessed is a beam-intercepting system in action. Of course, experiencing a beam dump in person is not possible, due to the large amount of radiation generated in the impact, which is one of the reasons why access to high-energy accelerators is strictly forbidden during operation.
Beam-intercepting systems are essential devices designed to absorb the energy and power of a particle beam. Generally, they are classified in three categories depending on their use: particle-producing devices, such as targets; systems for beam cleaning and control, such as collimators or scrapers; and those with safety functions, such as beam dumps or beam stoppers. During the current long-shutdown 2 (LS2), several major projects have been undertaken to upgrade some of the hundreds of beam-intercepting systems across CERN’s accelerator complex, in particular to prepare the laboratory for the high-luminosity LHC era.
Withstanding stress
Beam-intercepting devices have to withstand enormous mechanical and thermally-induced stresses. In the case of the LHC beam dump, for example, upgrades of the LHC injectors will deliver beams whose stored kinetic energy will reach 560 MJ during LHC Run 3 – roughly corresponding to the energy required to melt 2.7 tonnes of copper. Released in a period of just 86 μs, this corresponds to a peak power of 6.3 TW or, put differently, 8.6 billion horsepower.
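These headline figures can be checked with back-of-envelope arithmetic. A minimal sketch, using standard handbook constants (not CERN design data); a naive uniform-release estimate lands close to the quoted power figures, and the copper comparison works out if only the latent heat of fusion is counted:

```python
# Back-of-envelope check of the quoted LHC Run 3 beam-dump figures:
# 560 MJ released over 86 microseconds.
E_beam = 560e6        # stored beam energy [J]
t_dump = 86e-6        # dump duration [s]

P_avg = E_beam / t_dump   # uniform-release average power [W]
hp = P_avg / 745.7        # mechanical horsepower (745.7 W each)

# "Energy required to melt 2.7 tonnes of copper": latent heat of
# fusion alone (~205 kJ/kg), ignoring the energy to heat it first.
L_fus_cu = 2.05e5     # latent heat of fusion of copper [J/kg]
m_cu = E_beam / L_fus_cu

print(f"average power ~ {P_avg/1e12:.1f} TW ({hp/1e9:.1f} billion hp)")
print(f"copper melted (latent heat only) ~ {m_cu/1e3:.1f} tonnes")
```

The uniform-release average comes out near 6.5 TW and 8.7 billion hp, in the same ballpark as the quoted peak values, and the latent-heat-only estimate reproduces the 2.7 tonne comparison.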
In general, the energy density deposited in a beam-intercepting device scales with the beam energy and intensity, inversely with the beam-spot size, and with the density of the absorbing material. From the point of view of materials, this energy is transformed into heat. In a beam dump, for example, the collision volume (which is usually much smaller than the beam-intercepting device itself) is heated to temperatures of 1500 °C or more. This heat causes the small volume to try to expand but, because the surrounding area is much cooler, there is no room for expansion. Instead, the hot volume pushes against the colder surroundings, risking breaking the structure. Due to the high energy of the beams in CERN’s accelerators, reaching sufficient attenuation requires devices that are in some cases several metres long.
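The constrained-expansion argument above can be quantified with the textbook estimate σ ≈ E·α·ΔT for a fully constrained heated volume. A rough sketch using typical handbook values (assumptions for illustration, not CERN design data) shows why low-modulus graphite tolerates temperature jumps that would destroy steel:

```python
# Quasi-static thermal stress for a heated volume that is fully
# prevented from expanding: sigma ~ E * alpha * dT.
def thermal_stress(E_mod, alpha, dT):
    """Fully constrained thermal stress [Pa]."""
    return E_mod * alpha * dT

# Isostatic graphite: low modulus and expansion coefficient
# (assumed E ~ 10 GPa, alpha ~ 4e-6 /K) -> modest stress.
sigma_graphite = thermal_stress(E_mod=10e9, alpha=4e-6, dT=1400)
# Stainless steel (assumed E ~ 200 GPa, alpha ~ 16e-6 /K): the same
# temperature jump gives a stress far beyond any steel's strength.
sigma_steel = thermal_stress(E_mod=200e9, alpha=16e-6, dT=1400)

print(f"graphite: {sigma_graphite/1e6:.0f} MPa")
print(f"steel:    {sigma_steel/1e6:.0f} MPa")
```

With these assumed values graphite stays in the tens of MPa, of the same order as its strength, while steel would see gigapascal-level stresses — one reason low-density, low-modulus graphitic materials dominate dump cores.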
Beam-intercepting devices must be able to withstand routine operation and also accident scenarios, where they serve to protect more delicate equipment such as cryomagnets. Amongst the many challenges that need to be faced are operation under ultra-high-vacuum conditions, and maintaining integrity and functionality when enduring energy densities up to several kJ/cm3 or power densities up to several MW/cm3. For physics applications, optimisation processes have led to the use of low-strength materials, such as pure lead for the generation of neutrons at the n_TOF facility or iridium and tantalum for the generation of antiprotons at the Antiproton Decelerator (AD) facility.
Preparing for HL-LHC
The LHC Injectors Upgrade (LIU) Project, which was launched in 2010 and for which the hardware was installed during LS2, will allow beams with a higher intensity and a smaller spot size to be injected into the LHC. This is a precondition for the full execution of the High-Luminosity LHC (HL-LHC), which will enable a large increase in the integrated luminosity collected by the experiments. To safely protect sensitive equipment in the accelerator chain, the project required a series of new devices in the injector complex from the PS Booster to the SPS, including new beam-intercepting devices. One example is the new SPS internal beam dump, the so-called TIDVG (Target Internal Dump Vertical Graphite), which was installed in straight section five of the SPS during 2020 (see “Structural integrity” image). The main challenge faced for this device was the need to dissipate a large amount of power from the device rapidly and efficiently to avoid reaching temperatures unacceptable for the beam-dump materials.
The TIDVG is used to dispose of the SPS circulating beam whenever necessary, for example in case of emergency during LHC beam-setup, filling or machine-development periods, and to dispose of the part of the beam dedicated to fixed-target experiments that remains after the slow-extraction process. To reduce the energy density deposited in the dump core’s absorbing material (and hence minimise the associated thermo-mechanical stresses), the beam is diluted by kicker magnets, producing a sinusoidal pattern on the front of the first absorbing block. The dump is designed to absorb all beam energies in the SPS, from 14 GeV (injection from the PS) to 450 GeV.
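The effect of such a sinusoidal sweep can be illustrated with a toy model. The sketch below uses invented parameters (amplitude, bunch count and sweep periods are placeholders, not the real TIDVG kicker settings) to show how spreading uniformly arriving bunches along a sine wave reduces the peak load on any one spot of the dump face:

```python
import numpy as np

# Toy model of sinusoidal beam dilution: bunches arriving uniformly
# in time are deflected by a sinusoidal kicker waveform, spreading
# their impact points vertically over the front absorbing block.
A = 0.03                     # assumed sweep half-amplitude [m]
n_bunches = 4000             # assumed number of bunches
phase = np.linspace(0.0, 4 * 2 * np.pi, n_bunches)  # four sweep periods
y = A * np.sin(phase)        # vertical impact positions

# Without dilution every bunch would hit the same spot; with the
# sweep, the most-loaded of 50 position bins sees only a fraction.
counts, _ = np.histogram(y, bins=50, range=(-A, A))
peak_fraction = counts.max() / n_bunches
print(f"peak bin holds {peak_fraction:.1%} of bunches (vs 100% undiluted)")
```

Note that the density still peaks near the turning points ±A, where the sweep velocity goes to zero — one reason real dilution systems give careful attention to the waveform and, in the LHC’s case, sweep in both planes.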
Compared with the pre-LS2 device, the beam power to be absorbed by the dump will be four times higher, with an average power of 300 kW. To reduce the local energy deposition whilst maintaining the total required beam absorption, the length of the new dump has been increased by 70 cm, leading to a 5 m-long dump. The dump blocks are arranged so that the density of the absorbing materials increases as the beam passes through the device: 4.4 m of isostatic graphite, 20 cm of a molybdenum alloy and 40 cm of pure tungsten. This ensures that the stresses associated with the resulting thermal gradients are kept within acceptable values. The core of the component, which receives the highest thermal load, is cooled directly by a dedicated copper-alloy jacket surrounding the blocks, which can only release their heat through contact with the jacket; to maximise the thermal conductivity at the interfaces between the stainless-steel cooling pipes and the copper alloy, these materials are diffusion-bonded by means of hot isostatic pressing. The entire core is embedded in an air-cooled, seamless 15 mm-thick stainless-steel hollow cylinder. Due to the high activation of the dump expected after operation, in addition to the first cast-iron shielding, the assembly is surrounded by a massive, multi-layered external shield comprising an inner layer of 50 cm of concrete, followed by 1 m of cast iron and an external layer of 40 cm of marble. Marble is used on the three sides accessible by personnel to minimise the residual dose rate in the vicinity after short cool-down times.
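The graded layout described above can be tabulated numerically. The sketch below uses typical handbook densities (assumptions — the exact alloy grades used at CERN are not specified here) to show how the mass thickness seen by the beam rises sharply in the final, denser layers:

```python
# Sketch of the TIDVG core's graded-density layout: lengths from the
# text, densities are typical handbook values (assumptions).
layers = [
    ("isostatic graphite", 4.40, 1.85e3),   # (name, length [m], density [kg/m^3])
    ("molybdenum alloy",   0.20, 10.2e3),
    ("pure tungsten",      0.40, 19.3e3),
]

total_len = sum(length for _, length, _ in layers)
for name, length, rho in layers:
    # mass thickness in g/cm^2 (1 kg/m^2 = 0.1 g/cm^2)
    areal = length * rho / 10
    print(f"{name:20s} {length:4.2f} m  {areal:6.0f} g/cm^2")
print(f"total length: {total_len:.1f} m")
```

The lengths sum to the quoted 5 m, and despite occupying only 12% of the length, the molybdenum and tungsten sections contribute a mass thickness comparable to the entire graphite section — the “density grading” that soaks up the shower tail without overstressing any one layer.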
Collimator system upgrades
Beam collimators and masks are essential components in accelerator systems. They act as intermediate absorbers and dilutors of the beam in case of beam losses, minimising the thermal energy received by components such as superconducting magnets (leading to quench) or delicate materials in the LHC experiments. The other function of the collimators is to clean up the halo of the beam, by removing particles moving away from the correct orbit. Collimators generally consist of two jaws – moveable blocks of robust materials – that close around the beam to clean it of stray particles. More than 100 of these vital devices are placed around the LHC in critical locations.
The jaw materials can withstand extreme temperatures and stresses (resulting in deposited energy densities up to 6 kJ/cm3), while maintaining – at least for the LHC collimators – good electrical conductivity to reduce the impedance contribution to the machine. Several developments were incorporated in the SPS-to-LHC transfer line collimators built in the framework of the LIU project, as well as in the LHC collimators for the HL-LHC. For the former, dedicated and extremely robust 3D carbon-composite materials were developed at CERN in collaboration with European industry, while for the latter, dedicated molybdenum carbide-graphite composites were developed, again in collaboration with European firms. For these cases, more than 30 new collimators have been built and installed in the SPS and LHC during LS2 (see “New collimators” image).
LHC beam-dump upgrades
Several challenges associated with the LHC beam dump system had to be overcome, especially on the dump block itself: it needs to be ready at any time to accept protons, from injection at 450 GeV up to top energy (6.5 TeV, with up to 7 TeV in the future); it must be reliable (~200 dump events per year); and it must accept fast-extracted beams, given that the entire LHC ring is emptied in just 86 μs. At 560 MJ, the projected stored beam energy during Run 3 will also be 75% higher than it was during Run 2.
The dump core (around 8 m long) consists of a sandwich of graphitic materials of sufficiently low density to limit the temperature rise – and therefore the resulting thermally-induced stresses – in the material (see “End of the line” image). The graphite is contained in a 12 mm-thick special stainless-steel grade (see “Dump upgrades” image) and the assembly is surrounded by shielding blocks. Roughly 75% (~430 MJ) of the energy, deposited by electromagnetic showers and by ionisation losses of hadrons and muons, ends up in the graphite, while around 5% (~25 MJ) is deposited in the thin steel vessel, and the remaining energy is deposited in the shielding assembly. Despite the very low density (1.1 g/cm3) employed in the middle section of the core, temperatures up to 1000 °C were reached during Run 2. From Run 3, temperatures up to 1500 °C will be reached. These temperatures could be much higher if it were not for the fact that the beam is “painted” on the face of the dump by means of dilution kickers situated hundreds of metres upstream. The dump must also guarantee its structural integrity even in the case of failures of these dilution systems.
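Multiplying out the quoted fractions gives a rough energy budget for a 560 MJ Run 3 dump. Note the quoted percentages and MJ values in the text are rounded, so the products land near, not exactly on, them:

```python
# Energy budget sketch for one 560 MJ dump event, using the
# approximate deposition fractions quoted in the text.
E_total = 560e6                          # total dumped energy [J]
frac = {"graphite core": 0.75, "steel vessel": 0.05}
frac["shielding and rest"] = 1.0 - sum(frac.values())  # remainder

for part, f in frac.items():
    print(f"{part:18s} {f:5.0%}  {f * E_total / 1e6:5.0f} MJ")
```

This lands at roughly 420 MJ in the graphite and 28 MJ in the vessel, consistent with the ~430 MJ and ~25 MJ quoted; the remaining ~20% (~110 MJ) goes into the shielding assembly.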
Although the steel vessel is responsible for absorbing just 5% of the deposited energy, the short timescales involved lead to a semi-instantaneous rise in temperature of more than 150 °C, generating accelerations up to 2000 g and forces of several hundred tonnes. Following the operational experience during LHC Run 1 and Run 2, several upgrades have been implemented on the dump during LS2. These include complex instrumentation to yield information and operational feedback during Run 3, until 2025. In the later HL-LHC era, the dump will have to absorb 50% more energy per dump than during Run 3 (up to 750 MJ/dump), presenting one of numerous beam-interception challenges to be faced.
Fixed-target challenges
Beyond the LHC, challenging conditions are also encountered for antiproton production at CERN’s Antiproton Decelerator (AD), which serves several antimatter experiments. In this case, high-density materials are required to make sources as point-like as possible to improve the capture capabilities of the downstream magnetic-horn focusing system. Energy densities up to 7 kJ/cm3 and temperatures up to 2500 °C are reached in refractory materials such as iridium, tantalum and tungsten. Such intense energy densities and the large gradients resulting from the very small transverse beam size generate large thermal stresses and produce damage in the target material, which must be minimised to maintain the reliability of the AD’s physics programme. To this end, a new air-cooled antiproton production target will be installed in the antiproton target area this year. Similar challenges are faced when producing neutrons for the n_TOF facility: in this case a new nitrogen-cooled pure lead spallation target weighing roughly 1.5 tonnes will be commissioned this year, ready to produce neutrons spanning 11 orders of magnitude in energy, from 25 meV to several GeV (see “Neutron production target” image).
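The quoted AD-target energy density and temperature can be related by a simple adiabatic estimate, ΔT ≈ e/(ρ·c_p). The sketch assumes room-temperature handbook values for iridium (in reality the specific heat rises with temperature, so this is only indicative):

```python
# Adiabatic temperature-rise estimate for a quasi-instantaneous
# deposition of ~7 kJ/cm^3 in iridium, using room-temperature
# handbook values (assumptions for illustration).
e_dep = 7e3          # deposited energy density [J/cm^3]
rho_ir = 22.56       # iridium density [g/cm^3]
c_ir = 0.131         # iridium specific heat [J/(g*K)]

dT = e_dep / (rho_ir * c_ir)   # adiabatic temperature rise [K]
print(f"adiabatic temperature rise ~ {dT:.0f} K")
```

The estimate comes out near 2400 K, consistent with the temperatures of up to 2500 °C quoted for the AD target materials.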
Reliability is a key aspect in the construction of beam-intercepting devices, not just because machine operation strongly depends on them, but because replacing devices is not easy due to their residual radioactivation after operation. But how do we know that new devices will fulfil their function successfully once installed in the machine? CERN’s HiRadMat facility, which allows single-pulse testing using a high-intensity proton beam from the SPS, is one solution. Extremely high energy densities can be reached in test materials and in complex systems, allowing experimental teams to investigate – in a controlled manner – the behaviour of materials or complex mechanical systems when impacted by proton (or ion) beams. During the past few years, the facility was heavily employed by both CERN and external teams from laboratories such as STFC, Fermilab, KEK and GSI, testing materials from graphite to copper and iridium across the whole spectrum of densities (see “Material integrity test” image). To correctly predict the behaviour of materials when impacted by protons and other charged particles, a full understanding of thermo-physical and material properties is mandatory. Examples of critical properties include the coefficient of thermal expansion, heat capacity, thermal and electrical conductivity, Young’s modulus and yield strength, as well as their temperature dependence.
Dealing with radiation damage is becoming increasingly important as facilities move to higher beam intensities and energies, presenting potential show-stoppers for some beam-intercepting devices. To better understand and predict the radiation response of materials, the RaDIATE collaboration was founded in 2012, bringing together the high-energy physics, nuclear and related communities. The collaboration’s research includes determining the effect of high-energy proton irradiation on the mechanical properties of potential target and beam-window materials, and developing our understanding via micro-structural studies. The goal is to enable accurate lifetime predictions for materials subjected to beam impact, to design robust components for high-intensity beams, and to develop new materials to extend lifetimes. CERN is partner to this collaboration, as well as Fermilab, STFC/UKRI, Oak Ridge, KEK, Pacific Northwest National Laboratory, and other institutions and laboratories worldwide.
Future projects
High-energy physics laboratories across the world are pursuing new energy and/or intensity frontiers, with either hadron or lepton machines. In all cases, whether collider physics or fixed-target, neutrino or beam-dump experiments, beam-intercepting devices are at the heart of accelerator operations. For the proposed 100 km-circumference Future Circular Collider (FCC), several challenges have already been identified. Owing to the small emittances and high luminosities involved in a first electron–positron FCC phase, the positron source, with its target and capture system, will require dedicated R&D and testing, as will the two lepton dumps. FCC’s proton–proton phase, further in the future, will draw on lessons from HL-LHC operation, but it will also operate at uncharted energy densities for beam-intercepting devices, both for beam-cleaning and shaping collimators and for the beam dumps.
The recently launched muon-collider initiative, meanwhile, will require a target system capable of providing copious amounts of muons, generated by either proton beams or electrons impacting on a target, depending on the scheme under consideration. For the former, beams of several megawatts would impinge on a production target, which will have to be highly efficient at producing muons of the required momenta while remaining sufficiently reliable to operate without failure for long periods. The muon-collider target and front-end systems will also require magnets and shielding located close to the production target, which will have to cope with high radiation loads and heat deposition. These challenges will be tackled extensively in the next few years, from both a physics and an engineering perspective.
Successful beam-intercepting devices require extensive knowledge and skills
As one of the front-runner projects in the Physics Beyond Colliders initiative, the proposed Beam Dump Facility at CERN would require the construction of a general-purpose high-intensity and high-energy fixed-target complex, initially foreseen to be exploited by the Search for Hidden Particles (SHiP) experiment. At the heart of the installation resides a target/dump assembly that can safely absorb the full high-intensity 400 GeV/c SPS beam while maximising the production of charm and beauty mesons; high-Z materials, such as pure tungsten and molybdenum alloy, are used to reduce the muon background for the downstream experiment. The nature of the beam pulse induces very high temperature excursions between pulses (up to 100 °C), leading to considerable thermally induced stresses and long-term fatigue considerations. The high average power deposited on target (305 kW) also creates a challenge for heat removal. A prototype target was built and tested at the end of 2018, at one tenth of the nominal power but able to reach the equivalent energy densities and thermal stresses (see “Beam-dump facility” image).
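The scale of the thermally induced stress can be sketched with the standard thermoelastic estimate for a rapidly heated, constrained region of the target. The material values below are typical handbook figures for tungsten, used here as assumptions for illustration rather than parameters of the Beam Dump Facility design:

```latex
% Quasi-static thermoelastic stress in a constrained, uniformly heated region
\[
  \sigma_{\mathrm{th}} \;\approx\; \frac{E\,\alpha\,\Delta T}{1-\nu}
\]
% E        : Young's modulus
% \alpha   : coefficient of thermal expansion
% \Delta T : temperature excursion per pulse
% \nu      : Poisson ratio
%
% Illustrative (assumed) tungsten-like values:
% E \approx 400\,\mathrm{GPa},\quad
% \alpha \approx 4.5 \times 10^{-6}\,\mathrm{K^{-1}},\quad
% \nu \approx 0.28,\quad \Delta T = 100\,\mathrm{K}
% \Rightarrow \sigma_{\mathrm{th}}
%   \approx \frac{400 \times 10^{9} \times 4.5 \times 10^{-6} \times 100}{0.72}
%   \approx 2.5 \times 10^{8}\,\mathrm{Pa} = 250\,\mathrm{MPa}
```

Stresses of this order, repeated pulse after pulse, are what drive the long-term fatigue considerations mentioned above.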
Human efforts
The development, construction and operation of successful beam-intercepting devices require extensive knowledge and skills, ranging from mechanical and nuclear engineering to physics, vacuum technologies and advanced production techniques. Technicians also constitute the backbone of the design, assembly and installation of such equipment. International exchanges with experts in the field and with laboratories facing similar challenges are essential, as is cross-discipline collaboration, for example with aerospace, nuclear and advanced-materials communities. In addition, universities provide key students and personnel capable of mastering and developing these techniques, both at CERN and in the laboratories and industries of CERN’s member states. This intense multidisciplinary effort is vital to successfully tackle the challenges of current and future high-energy and high-intensity facilities and infrastructures, as well as to develop systems with broader societal impact, for example in X-ray synchrotrons, medical linacs and the production of radioisotopes for nuclear medicine.
Experimental nuclear physicist Haiyan Gao has been appointed associate laboratory director for nuclear and particle physics at Brookhaven National Laboratory (BNL), beginning 1 June. Gao, whose research interests include the structure of the nucleon, searches for exotic QCD states and searches for new physics in electroweak interactions, is currently a professor of physics at Duke University, and has previously held positions at Argonne National Laboratory and the Massachusetts Institute of Technology (MIT). At BNL she replaces Dmitri Denisov, who has held the position on an interim basis since Berndt Mueller’s departure last year.
While at Duke, Gao was the founding vice chancellor for academic affairs at the new Duke Kunshan University, based in Kunshan, China, a Chinese–American academic partnership between Duke University and Wuhan University established in 2013.
I am very excited by the opportunity and the impact I will be able to make in collaboration with many people at the lab
The appointment comes at a vital time for BNL, with preparations taking place for the Electron-Ion Collider, which expects first physics in the next decade. The unique facility will, for the first time, be able to systematically explore and map out the dynamical system that is the ordinary QCD bound state. On the appointment, Gao states: “The nuclear & particle physics directorate is well-known internationally in accelerator science, high-energy physics, and nuclear physics. I am very excited by the opportunity and the impact I will be able to make in collaboration with many people at the Lab.”