The study of elastic hadron scattering is a cornerstone in understanding the non-perturbative properties of strong interactions. A key role is played by experiments at the LHC, where it is possible to precisely measure proton–proton (pp) interactions at a very high centre-of-mass energy. The goal is to detect the process pp → pp, in which the interacting protons remain intact and are scattered at angles of a few microradians with respect to the beamline. The importance of such measurements follows from their relation to the total hadronic pp cross section via the optical theorem, and to properties of proton interactions at asymptotically high energies via dispersion relations.
In ATLAS, elastic scattering is studied using a dedicated experimental setup – the ALFA detectors, which allow measurements of scattered-proton trajectories inside the beam pipe, just a few millimetres from the LHC beam. They are installed inside so-called Roman pots located at distances of 237 and 245 m on either side of the ATLAS interaction point.
Recently, ATLAS reported a measurement of elastic scattering at a centre-of-mass energy of 13 TeV. The data were collected with a special setting of the LHC magnets characterised by a high β* of 2500 m, which results in a large beam-spot size and a very small beam divergence. The latter allows precise measurements of small scattering angles. With these optics, the ALFA system detected events characterised by very small values of the Mandelstam t variable, which is proportional to the scattering angle squared. Measurements of small |t| values give access to the Coulomb–nuclear interference (CNI) kinematic region, where the contributions from the electromagnetic and strong interactions are of similar magnitude.
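As a reminder of the kinematics (the standard small-angle approximation, with p the beam momentum and θ* the scattering angle, rather than a formula quoted from the ATLAS paper):

t ≈ −(p θ*)², so that |t| ≈ 10⁻³ GeV² at p = 6.5 TeV corresponds to θ* ≈ √|t|/p ≈ 0.032 GeV / 6500 GeV ≈ 5 μrad,

consistent with the microradian-scale angles mentioned above.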
The ALFA detectors use scintillating-fibre technology to measure the position of the passing proton. The t value for each event is reconstructed from the measured positions using knowledge of the magnetic fields of the LHC magnets between the interaction point and the detectors. The selection of candidate events is based on the strong correlations between the elastically scattered protons, resulting from energy and momentum conservation. The analysis is heavily based on data-driven techniques, which are used for the alignment of the detectors, background estimation, evaluation of reconstruction efficiency and optics tuning.
Figure 1 presents the measured differential elastic cross section as a function of t. The shape of the distribution is sensitive to important physics parameters, such as the total cross section (σtot) and the ρ parameter, defined as the ratio of the real to the imaginary part of the forward elastic-scattering amplitude. The smallest |t| values, and thus the smallest scattering angles, are dominated by the electromagnetic interaction between the protons. The CNI effects are strongest for |t| around 10⁻³ GeV² and provide sensitivity to the ρ parameter. For larger |t| values, the strong interaction dominates, and the spectrum depends on the value of σtot. The physics parameters are extracted from a fit to the t distribution.
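Schematically, and ignoring the proton form factors and the Coulomb phase (a simplified sketch rather than the exact fit function used in the analysis), the cross section in this region is the squared sum of the Coulomb and nuclear amplitudes:

dσ/dt ≈ π |f_C + f_N|², with f_C ≈ −2α/|t| and f_N ≈ (ρ + i)(σtot/4π) e^(−B|t|/2),

where B is the nuclear slope. The two amplitudes are comparable when |t| ≈ 8πα/σtot, which for σtot ≈ 105 mb (≈ 270 GeV⁻²) gives |t| ≈ 7 × 10⁻⁴ GeV², in line with the quoted CNI region.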
The ρ parameter is related, through dispersion relations, to the energy dependence of σtot, with a certain sensitivity also to energies above those at the LHC. In addition, ρ is sensitive to possible differences between the pp and p̄p scattering amplitudes at asymptotic energies. ATLAS measured ρ = 0.098 ± 0.011, in agreement with a previous TOTEM measurement. The result is in conflict with pre-LHC theoretical expectations (see the COMPETE line in figure 2), which assumed that no pp/p̄p difference is present asymptotically and that σtot increases proportionally to the squared logarithm of the centre-of-mass energy, similarly to the evolution observed at the energies accessible at the time. This suggests that one of the above assumptions is incorrect: either the increase of σtot slows down above LHC energies, or protons and antiprotons interact differently at asymptotic energies. The second possibility is often associated with so-called odderon exchange. Both possibilities affect our understanding of the high-energy behaviour of strong interactions.
ATLAS also measured the total pp hadronic cross section σtot = (104.7 ± 1.1) mb. This is the most precise measurement to date at this energy, due to a dedicated luminosity measurement that contributed less than 1 mb to the total systematic uncertainty. However, the long-standing tension between the ATLAS and TOTEM σtot measurements, with the latter being about 5% higher than ATLAS, persists.
ATLAS has collected more elastic scattering data in LHC Run 2, which are currently being analysed. New data taking is planned during Run 3, where a special run is foreseen at a centre-of-mass energy of 13.6 TeV.
At the LHC, light nuclei and antinuclei are produced both in proton–proton and in heavy-ion collisions. Unstable nuclei known as hypernuclei are also produced. First observed in cosmic rays in 1953, hypernuclei are bound states of protons, neutrons and hyperons (baryons containing one or more strange quarks), and they decay via the weak interaction. Almost 70 years after their discovery, hypernuclei remain a source of fascination for nuclear physicists, since their production is very rare and the measurement of their properties is extremely challenging.
The only hypernucleus observed so far at the LHC is the hypertriton (³ΛH), composed of a Lambda baryon (Λ), a proton and a neutron. While, traditionally, hypernuclei are studied in low-energy nuclear experiments, the hundreds of hypertritons and antihypertritons produced in each lead–lead run at the LHC provide one of the largest data samples for their study. The hypertritons fly for a few centimetres in the experimental apparatus before decaying into a ³He nucleus and a charged pion, which are then identified by the detectors.
The ALICE collaboration recently completed a new analysis of the largest Run 2 data sample, achieving the most precise measurements to date of the hypertriton lifetime and its Λ-separation energy (the energy required to separate the Λ from the rest of the hypertriton). The lifetime, measured from the distribution of reconstructed two-body decay lengths, was found to be 253 ± 11 (stat.) ± 6 (syst.) ps, while the separation energy, obtained from the hypertriton invariant-mass distribution, was measured to be 72 ± 63 (stat.) ± 36 (syst.) keV.
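The principle of the lifetime extraction can be illustrated in a few lines of code. The sketch below uses toy data and a simple binned exponential fit; the actual ALICE analysis relies on the full reconstruction, efficiency corrections and a more sophisticated fit. Only the kinematic relation ct = L·m/p and the approximate ³ΛH mass of 2.991 GeV/c² are standard inputs; everything else is illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# Proper decay length of a candidate from its reconstructed decay length L (cm)
# and momentum p (GeV/c):  c*t_proper = L * m / p  (i.e. L divided by beta*gamma).
def proper_ct(L_cm, p_GeV, m_GeV=2.991):       # ~hypertriton mass in GeV/c^2
    return L_cm * m_GeV / p_GeV                # in cm

# Toy sample standing in for reconstructed hypertriton candidates
rng = np.random.default_rng(seed=1)
true_ctau_cm = 7.6                             # ~253 ps times c (0.03 cm/ps)
ct = rng.exponential(true_ctau_cm, size=1000)  # pretend these came from proper_ct(...)

# Binned exponential fit: N(ct) = N0 * exp(-ct / ctau)
counts, edges = np.histogram(ct, bins=40, range=(0.0, 40.0))
centres = 0.5 * (edges[:-1] + edges[1:])
expo = lambda x, n0, ctau: n0 * np.exp(-x / ctau)
popt, _ = curve_fit(expo, centres, counts, p0=(100.0, 5.0))
ctau_fit = popt[1]

tau_ps = ctau_fit / 0.02998                    # c = 0.02998 cm/ps
print(f"fitted ctau = {ctau_fit:.2f} cm  ->  lifetime ~ {tau_ps:.0f} ps")
```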
These two quantities are fundamental for understanding the structure of this hypernucleus and, in turn, the nature of the strong interaction. While the strong force binding neutrons and protons inside atomic nuclei is well understood, the characteristics of the strong force binding nucleons and hyperons are not precisely known.
The study of this interaction is not only interesting per se, but it is also an input for modelling the dense cores of neutron stars. Indeed, in the inner core of neutron stars the creation of hyperons is energetically favoured over ordinary nucleonic matter. Detailed knowledge of the interactions between nucleons and hyperons is therefore required to understand these compact astrophysical objects.
The new ALICE measurements indicate that the interaction between the hyperon inside the hypertriton and the other two nucleons is extremely feeble (see figure 1). This is also confirmed by the lifetime of the hypertriton, which is compatible with that of the free Λ baryon. Finally, since matter and antimatter are produced in equal amounts at the LHC, the ALICE collaboration could compare the lifetimes of the antihypertriton and the hypertriton. Within the experimental uncertainty, the lifetimes were found to be compatible, as expected from CPT invariance.
During LHC Run 3, ALICE will extend its studies to heavier hypernuclei, placing tighter constraints on models of the interaction between hyperons and nucleons.
This year, the International Particle Physics Outreach Group (IPPOG) celebrates its 25th anniversary. Our group is an international collaboration of active particle physicists, communication experts and educators dedicated to disseminating the goals and accomplishments of fundamental scientific research to the public. Our audiences range from young schoolchildren to college graduates and teachers, from the visiting public to heads of state, and we engage them in classrooms, laboratories, experimental halls, music festivals, art exhibitions, office buildings and government offices across the planet. The activities we use to reach these diverse audiences include public lectures, visits, games, exhibits, books, online apps, and pretty much anything that can be used to demonstrate scientific methodology and instil appreciation for fundamental research.
What drives us to commit so much effort to outreach and public engagement when we are already deeply invested in a field that is both time and labour intensive? First of all, we love doing it. Particle physics is an active and exciting field that lies at the forefront of human understanding of the universe. The analysis methods and tools we employ are innovative, our machines and detectors are jaw-dropping in their size and complexity, and our international collaborations are the largest, most diverse ever assembled. It is a privilege to be part of this community and we love sharing that with those around us.
Secondly, we’ve learned that public engagement improves us as scientists. Learning how to distil complex scientific concepts into understandable descriptions, captivating stories and relatable analogies helps us to better comprehend the topics ourselves. It gives us a clearer picture in our own minds of where our work fits into the larger frame of society. It also significantly improves our communication skills, yielding capabilities that help us down the road as we apply for grants and propose new projects or analyses.
Thirdly, we also understand our moral obligation to share the results of our research with society. The endeavour to improve our understanding of the world around us is rooted in millennia of human evolution and culture. It is how we not only improve our own lives, but also how we provide the tools future generations need to survive. In more practical terms, we realise the importance of engaging those who support our research. That includes funding bodies, as well as the voters who select those bodies and prioritise the deployment of resources.
Finally, and equally important, we realise the value to both our own field and society at large of encouraging our youth to pursue careers in science and technology. The next generation of experiments will include machines, detectors and collaborations that are even larger than the ones we have today. Given their projected lifetimes, the grandchildren of today’s students will be among those analysing the data. And we need to train that workforce today.
Birth and evolution
Dedicating time and resources to outreach efforts is not easy. As researchers, our days (and often nights) are taken up by analysis meetings, detector shifts, conference deadlines and institutional obligations. So, when we do make the effort it needs to be done in an effective manner, reaching as large and diverse an audience as possible, with messages that are clear and coherent.
Former CERN Director-General Chris Llewellyn Smith certainly had this in mind when he first proposed the establishment of the European particle physics outreach group (EPOG) in 1997. The group held its first meeting on 19 September that year, under the chairmanship of Frank Close. Its primary objectives were to exchange ideas and best practices in particle-physics education and outreach, to define common goals and activities, and to develop and share materials supporting the efforts.
The original members of EPOG were delegates from the CERN Member States, one each from the four major LHC experiments, one each from the CERN and DESY laboratories, and a chair and deputy chair assigned by the European Committee for Future Accelerators (ECFA) and the high-energy physics branch of the European Physical Society. Over the following 25 years, EPOG expanded beyond Europe (becoming IPPOG), developed major worldwide programmes, including International Masterclasses in Particle Physics and Global Cosmics, and established itself as an international collaboration, defined and supported by a memorandum of understanding (MoU).
Today, the IPPOG collaboration comprises 39 members (32 countries, six experiments and one international lab) and two associate members (national labs). Each member, by signing the MoU, commits to supporting particle-physics outreach worldwide. Members also provide modest funding, which is used to support IPPOG’s core team, its administration and communication platforms, thus facilitating the development and expansion of its global programmes and activities.
The Masterclasses programme now reaches tens of thousands of students in countries spread around the globe, and is engaging new students and training new physics mentors every year. The Global Cosmics portal, hosted on the IPPOG website, provides access to a wide variety of projects distributing cosmic-ray detectors and/or data into remote classrooms that would not normally have access to particle-physics research. And a modest project budget has helped the IPPOG collaboration to establish a presence at science, music and art festivals around the globe by supporting the efforts of local researchers.
Most recently, IPPOG launched a new website, featuring information about the collaboration, its programmes and activities, and giving access to a growing database of educational resources. The resource database serves teachers and students, as well as our own community, and includes searchable, high-quality materials, project descriptions and references to related resources procured and contributed by our colleagues.
Our projects and activities are reviewed and refined periodically during twice-annual collaboration meetings hosted by member countries and laboratories. These meetings feature hands-on activities and presentations by working groups, members, partners and panels of world-renowned topical experts. We present and publish the progress of our activities each year at major physics and science-education conferences, in parallel sessions, poster sessions and plenary talks, to share developments with the wider community and to offer it opportunities to contribute.
The challenging road ahead
While these accomplishments are noteworthy and lay a strong basis for the development of particle-physics outreach, they are not enough to face the challenges of tomorrow. Or even today. The world has changed dramatically since the days we first advocated for the construction of our current accelerators and detectors. And we are partly to blame. The invention of the web at CERN more than 30 years ago greatly facilitated access to and propagation of scientific facts and publications. Unfortunately, it also became a tool for the development and even faster dissemination of lies and conspiracy theories.
Effective science education is crucial to stem the tide of disinformation. A student in a Masterclass, for example, learns quickly that truth is found in data: only by selecting events and plotting measurements is she/he able to discover what nature has in store. And it might not agree with her/his original hypothesis. It is what it is. This simple lesson teaches students how to extract signal from background, truth from fiction. Other important lessons include the value of international collaboration, the symmetries and beauty of nature and the applications of our technology to society.
How do we, as scientists, make such opportunities available to a broader audience? First and foremost, we need more of us doing outreach. Many physicists do not make the effort because they perceive it as costly to their career. Taking time away from analysis and publication can be detrimental to our advancement, especially for students and junior faculty, unless there is sufficient support and recognition. Our community needs to recognise that outreach has become a key component of scientific research. It is as essential as hardware, computing and analysis. Without it, we won’t have the support we need to build future facilities. That means senior faculty must value experience in outreach on a par with other qualities when selecting new hires, and their institutions need to support outreach activities.
We also need to increase the diversity of our audience. While particle physics can boast of its international reach, our membership is still quite limited in social, economic and cultural scope. We are sorely missing women, people of diverse ethnicities and minorities. Communication strategies and educational methods can be adopted to address this, but they require resources and dedicated personnel.
That’s what IPPOG is striving for. Our expertise and capabilities increase with membership, which is continually on the rise. This past year we have been in discussion with Mexico, Nepal, Canada and the Baltic States, and more discussions are planned for the near future. Some will sign up, others may need more time, but all are committed to maintaining and improving their investment in science education and public engagement. We invite Courier readers to join us in committing to build a brighter future for our field and our world.
The challenge of casting space–time and gravity in the language of quantum mechanics and unravelling their fundamental structure has occupied some of the best minds in physics for almost a century. Not only is it one of the hardest problems out there – requiring mastery of general relativity, quantum field theory, high-level mathematics and deep conceptual issues – but distinct sub-communities of researchers have developed around different and apparently mutually exclusive approaches.
Historically, this reflected to a large extent the existing subdivision of theoretical physics between the particle-physics community and the much smaller gravitational-physics one, with condensed-matter theorists entirely alien, at the time, to the quantum gravity (QG) problem. Until 30 years ago, the QG landscape roughly featured two main camps, often identified simply with string theory and canonical loop quantum gravity, even if a few more hybrid formalisms already existed. Much progress was achieved in this divided landscape, which sustained each camp’s belief that it only had to push forward its own strategies to succeed. At a more sociological level, intertwined with serious scientific disagreements, this even led, in the early 2000s, to what the popular press dubbed the “String Wars”.
Today there is a growing conviction that if we are going to make progress towards this “holy grail” of physics, we need to adopt a more open attitude. We need to pay serious attention to available tools, results and ideas wherever they originated, pursuing unified perspectives when suitable and contrasting them in a constructive manner otherwise. In fact, the past 30 years have seen the development of several QG approaches, the birth of new (hybrid) ones, fresh directions and many results. A new generation has grown up in a diverse, if conflicting, scientific landscape. Today there is much more emphasis on QG phenomenology and physical aspects, thanks to parallel advances in observational cosmology and astrophysics, alongside the recognition that some mathematical developments naturally cut across specific QG formalisms. There is also much more contact with “outside” communities such as particle physics, cosmology, condensed matter and quantum information, which are not interested in internal QG divisions but only in QG deliverables. Furthermore, several scientific overlaps between QG formalisms exist and are often so strong that they make the definition of sharp boundaries between them look artificial.
Introducing the ISQG
The time is ripe to move away from the String Wars towards a “multipolar QG pax”, in which diversity does not mean irreducible conflict and disagreement is turned into a call for better understanding. To this end, last year we created the International Society for Quantum Gravity (ISQG) with a founding committee representing different QG approaches and more than 400 members who do not necessarily agree scientifically, but value intelligent disagreement.
ISQG’s scientific goals are to: promote top-quality research on each QG formalism and each open issue (mathematical, physical and in particular conceptual); stimulate cross-fertilisation across formalisms (e.g. by focusing on shared mathematical ingredients/ideas or on shared physical issues); be prepared for QG observations and tests (develop a common language to interpret experiments with QG implications, and a better understanding of how different approaches would differ in predictions); and push for new ideas and directions. Its sociological goals are equally important. It aims to help recognise that we are a single community with shared interests and goals, overcome barriers and diffidence among sub-communities, support young researchers and promote QG outside the community. A number of initiatives, as well as new funding schemes, are being planned to help achieve these goals.
We envision the main role of the ISQG as sponsoring and supporting the initiatives proposed by its members, in addition to organising its own. This includes a bi-annual conference series to be announced soon, focused workshops and schools, seminar series, career support for young researchers and the preparation of outreach and educational material on quantum gravity.
So far, the ISQG has been well received, with more than 100 participants attending its inaugural workshop in October 2021. Researchers in quantum gravity and related fields are welcome to join the society, contribute to its initiatives and help to create a community that transcends outdated boundaries between different approaches, which only hinder scientific progress. We need all of you!
The start of LHC Run 3 in 2022 marked an important milestone for CERN: the first step into the High-Luminosity LHC (HL-LHC) era. Thanks to a significant upgrade of the LHC injectors, the Run 3 proton beams are more intense than ever. Together with the raised centre-of-mass collision energy from 13 to 13.6 TeV, Run 3 offers a rich physics programme involving the collisions of both proton and heavy-ion beams. This is made possible thanks to several important upgrades involving HL-LHC hardware that were carried out during Long Shutdown 2 (LS2), ahead of the full deployment of the HL-LHC project during LS3, around four years from now.
The HL-LHC aims to operate with 2.3 × 10¹¹ protons per bunch (compared to the goal of 1.8 × 10¹¹ protons per bunch at the end of Run 3), producing a stored beam energy of about 710 MJ (compared to 540 MJ in Run 3). Lead–ion beams, on the other hand, will already reach their HL-LHC target intensity in Run 3. This is thanks to the “slip stacking” technique currently implemented at the Super Proton Synchrotron, which uses complex radio-frequency manipulations to shorten the bunch spacing of LHC beam trains from 75 to 50 ns. These beams equate to a stored energy of up to 20.5 MJ at 6.8 TeV (compared to a maximum of 12.9 MJ achieved in 2018 at 6.37 TeV), so the full HL-LHC upgrade needed to handle such intense ion beams must be available throughout Run 3.
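The proton stored-energy figure follows from simple arithmetic, assuming the nominal HL-LHC filling scheme of roughly 2750 bunches per beam (the bunch count is an assumption here, not a number taken from the text):

E ≈ (2.3 × 10¹¹ protons/bunch) × (≈2750 bunches) × (7 TeV per proton) ≈ 6.3 × 10¹⁴ × 1.1 × 10⁻⁶ J ≈ 7 × 10⁸ J ≈ 710 MJ.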
When the LHC works as a heavy-ion collider, many specific challenges need to be faced. Magnetically, the machine behaves much as it does during proton–proton operation. However, since the lead–ion bunch charge is about 15 times lower than for protons, a number of typical machine challenges – such as beam–beam interactions, impedance, electron-cloud effects, injection and beam-dump protection – are relaxed. Mitigating the nuisance of beam halos, however, is certainly not one of the tasks that gets easier.
These halos are formed by particles that stray from the ideal beam orbit. More than 100 collimators are installed at specific locations in the LHC to ensure that errant particles are cleaned or absorbed, thus protecting sensitive superconducting and other accelerator components. Although the total stored beam energy with ions is more than 30 times lower than it is for protons, the conventional multi-stage collimation system at the LHC (see “Multi-stage collimation” figure) is about two orders of magnitude less efficient for ion beams. Nuclear fragmentation processes occurring when ions interact with conventional collimator materials produce ion fragments with different magnetic rigidities, without producing transverse kicks sufficient to steer these fragments onto the secondary collimators. Instead, they travel nearly unperturbed through the “betatron” collimation system in interaction region 7 (IR7), which is responsible for disposing safely of transverse beam losses. This creates clusters of losses in the high-dispersion regions, where the first superconducting dipole magnets of the cold arcs act as powerful spectrometers, increasing the risk of quenches, whereby the magnets cease to be superconducting.
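The underlying reason is magnetic rigidity, the general relation that fixes how strongly a dipole field bends a particle of given momentum and charge (a textbook relation, not one quoted in the text):

Bρ = p/(Ze), so (Bρ)_fragment / (Bρ)_beam ≈ (A′/Z′) / (A/Z)

for a fragment of mass number A′ and charge Z′ that keeps roughly the beam’s momentum per nucleon. Losing a single neutron from ²⁰⁸Pb⁸²⁺, for example, changes the rigidity by about 0.5 per cent – far too little for the collimators to notice, but enough for the dispersive arc dipoles to separate the fragment from the main beam.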
The ion-collimation limitation is a well-known concern for the LHC. Nevertheless, the standard system has performed quite well so far and provided adequate cleaning efficiency for the nominal LHC ion-beam parameters. But the HL-LHC targets pose additional challenges. In particular, the upgrade does not allow sufficient operational margins without improving the betatron collimation cleaning. Lead–ion beam losses in the cold dipole magnets downstream of IR7 might reach a level three times higher than their quench limits, estimated at their 7 TeV current equivalent.
Various paths have been followed within the HL-LHC project to address this limitation. The baseline solution was to improve the collimation cleaning by adding standard collimators in the dispersion-suppressor regions that would locally dispose of the off-momentum halo particles before they impact the cold magnets. To create the necessary space, two shorter dipoles with a stronger (11 T) field would replace a standard, 15 m-long 8.3 T LHC dipole. This robust upgrade, which works equally well for proton beams, was planned to be used in Run 3. However, due to technical issues with the availability of the new dipoles, which are based on a niobium-tin rather than niobium-titanium conductor, the decision was taken to defer their installation. The HL-LHC project now relies on an alternative solution based on a crystal collimation scheme that was studied in parallel.
Crystals in the LHC
The development of crystal applications with hadron beams at CERN dates back to the activities carried out by the UA9 collaboration at the CERN SPS. Crystal collimation makes use of a phenomenon called planar channelling: charged particles impinging on a pure crystal with well-defined impact conditions can remain trapped in the electromagnetic potential well generated by the regular planes of atoms. If the crystal is bent, particles follow its geometrical shape and experience a net kick that can steer them with high efficiency to a downstream absorber. Crystal collimation was tested at the Tevatron, and in 2018 a prototype system was used for protons at the LHC in a special run at injection energy. The scheme is particularly attractive for ion beams as it was demonstrated that the existing secondary collimators can serve as a halo absorber without risking damage.
At the LHC, a total of four bent crystals are needed for the horizontal and vertical collimation of both beams. During Run 2, a test stand for crystal-collimation tests was installed in the LHC betatron cleaning region of IR7 with the aim of demonstrating the feasibility of this advanced collimation technique at LHC energies. Silicon crystals with a length of just 4 mm were bent to a curvature radius of 80 m to produce a 50 μrad deflection – much larger than the few-μrad kicks typically imparted to protons interacting with the 60 cm-long primary collimators (see “Silicon swerve” image). Indeed, to produce such a kick with a conventional dipole magnet would require a field of around 300 T over the length of the crystal. The crystals were mounted on an assembly (see “On target” image) that is a jewel of accelerator technology and control: the target collimator primary crystal (TCPC). This device allows the crystal to be moved to the desired distance from the circulating beam – typically just a few millimetres at 7 TeV – and its angular orientation to be adjusted to better than 1 μrad. While the former is no more demanding than the control system of other LHC collimators, the angular control demands a customised technology that is the heart of LHC crystal collimation.
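These numbers are consistent with simple geometry and the usual bending relation (a back-of-the-envelope check rather than figures taken from a design report):

θ ≈ L/R = 4 mm / 80 m = 50 μrad, and since θ[rad] ≈ 0.3 B[T] L[m] / p[GeV/c], an equivalent dipole would need B ≈ θ p/(0.3 L) = 50 × 10⁻⁶ × 6800 / (0.3 × 0.004) ≈ 280 T.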
Crystal channelling can only occur for particles impinging on the crystal surface with well-defined impact conditions. For a 6.8 TeV proton beam, particles must enter the crystal face at 90°, with angular deviations of at most ±0.0001° (around ±2 μrad) – which is similar to aiming at a 10 cm-wide snooker pocket from a shooting distance of 25 km! If this tiny angular acceptance is not respected, the transverse momentum is sufficient to send particles out of the potential well produced between the planes of the crystal lattice, thus losing the channelling condition. Both the beam-impact conditions and the accuracy of the crystal’s angle must therefore be kept under excellent control.
The crystal collimators are steered remotely using a technology that is unique to the CERN accelerator complex. It relies on a high-precision interferometer that provides suitable feedback to the advanced controller, and a precise piezo-actuation device that drives the crystal orientation with respect to impinging halo particles with unprecedented precision. During Run 2, the system demonstrated the sub-microradian accuracy required to maintain crystal channelling at high beam energy (see “High precision” figure, top). A recent feature of the newly installed devices is that the interferometer heads (which enable the precise control of the angle) are located outside the vacuum with the laser light coupled to the angular stage by means of viewports. This means that any fibre degradation due to motion or radiation, which was observed on the prototype system, can be corrected during routine maintenance. Using this setup in 2018, an improvement in ion-collimation cleaning by up to a factor of eight was demonstrated experimentally with the best crystal, paving the way for crystal collimation to become the baseline solution for the HL-LHC.
The test devices used during Run 2 served their purpose well, but they do not meet the standards required for regular, high-efficiency operation. An upgrade plan was therefore put in place to replace them with a higher performing new design. This has been developed in a crash programme at CERN that started in November 2020, when the decision to postpone the installation of the 11 T dipoles was taken. Two units were built and installed in the LHC in 2021 (see “On target” image) and another four are nearing completion: two for installation in the LHC at the end of 2022 and the others serving as operational spares. The first two installed units replaced the two prototype vertical crystals that showed the lowest performance. The horizontal prototype devices remain in place for 2022, since they performed well and were tested with a pilot beam in October 2021.
The start of Run 3 in April this year provided a unique opportunity to test the new devices with proton beams, ahead of the next operational ion run. One of the first challenges is to establish the optimal alignment of the crystals, to make sure stray particles are channelled as required. While channelled, the impinging particles interact with the crystal with the lowest nuclear-interaction rate: halo particles travel preferentially in the “empty” channel relatively far from the lattice nuclei. Optimum channelling is therefore revealed by the orientation that has the lowest losses, as measured by beam-loss monitors located immediately downstream of the crystal (see “High precision” figure, bottom). Considering that this favourable orientation must be found within a window of little more than 50 μrad out of a full angular range of 20 mrad, establishing the optimum condition is a bit like finding a needle in a haystack. However, following a successful campaign of dedicated beam tests in August 2022, channelling was efficiently established for both the new and old crystals, allowing the commissioning phase to continue.
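In essence, the alignment is a one-dimensional scan: step the crystal angle, record the local beam-loss-monitor (BLM) signal, and take the orientation with the deepest loss minimum. The sketch below illustrates that logic only; every device name, interface and number in it is hypothetical and bears no relation to the actual LHC control-system software.

```python
import numpy as np

def angular_scan(set_angle, read_blm, half_range_urad, step_urad, dwell_readings=10):
    """Scan the crystal orientation and return the angle with the lowest average losses.
    `set_angle` and `read_blm` stand in for the goniometer and BLM interfaces."""
    angles = np.arange(-half_range_urad, half_range_urad + step_urad, step_urad)
    mean_losses = []
    for a in angles:
        set_angle(a)                                  # rotate the (hypothetical) goniometer
        mean_losses.append(np.mean([read_blm() for _ in range(dwell_readings)]))
    mean_losses = np.array(mean_losses)
    return angles[np.argmin(mean_losses)], angles, mean_losses   # channelling = deepest dip

# Toy stand-ins: a flat "amorphous" loss level with a narrow channelling dip at 120 urad
rng = np.random.default_rng(0)
true_channelling_urad = 120.0
state = {"angle": 0.0}
def fake_set_angle(a):
    state["angle"] = a
def fake_read_blm():
    dip = 0.8 * np.exp(-0.5 * ((state["angle"] - true_channelling_urad) / 3.0) ** 2)
    return rng.normal(1.0 - dip, 0.02)                # arbitrary units

best, _, _ = angular_scan(fake_set_angle, fake_read_blm, half_range_urad=500, step_urad=5)
print(f"loss minimum found at {best:.0f} urad (true channelling at {true_channelling_urad} urad)")
```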
Looking forward
The LHC collimation system is the most complex beam-cleaning system built to date for particle accelerators. However, it must be further improved to successfully face the upcoming challenges from the HL-LHC upgrade which, for heavy-ion beams, begins during Run 3. Crystal collimation is a crucial upgrade that is now being put into operation to improve the betatron cleaning in preparation for the upgraded ion-beam parameters, mitigating the risk of machine downtime from ion-beam losses. The collimation cleaning performance will be established experimentally as soon as Run 3 ion operation begins. Initial beam tests with protons indicate that the newly installed bent crystals perform well: the first measurements demonstrated that the crystals can be put into operation as expected and exhibit the specified channelling properties. We are therefore confident that this advanced technology can be used successfully for the heavy-ion challenges of the HL-LHC programme.
What first drew you to physics, and to accelerators in particular?
In school I liked and did well in science and math. I liked the feeling of certainty of math. There is an objective truth in math. And I was fascinated by the fact that I could use math to describe physical phenomena, to capture the complexity of the world in elegant mathematical equations. I also had an excellent, rigorous high-school physics teacher, whom I admired.
Accelerators offered the possibility of addressing fundamental challenges in (accelerator) physics and technology, and getting verifiable results in a reasonable amount of time to have a material impact. In addition, particle accelerators enable research and discovery in a vast range of scientific fields (such as particle and nuclear physics, X-ray and neutron science) and societal applications such as cancer treatment and radioisotope production.
What was your thesis topic?
My PhD thesis tackled experimentally, theoretically and via computer simulations the nonlinear dynamics of transverse particle oscillations in the former Tevatron collider at Fermilab, motivated by planning for the Superconducting Super Collider. Nonlinearities were introduced in the Tevatron by special sextupole magnets. In a series of experiments, we obtained accurate measurements of various phase–space features with sextupoles switched on. One of the features was the experimental demonstration of “nonlinear resonance islands” – protons captured on fixed points in phase space.
What have been the most rewarding and challenging aspects of your career so far?
There are many rewarding aspects of what I do. Seeing an audience, especially young people, who light up when I explain a fascinating concept. Pointing to something tangible that I contributed towards that will enable scientists to make discoveries in accelerator, particle or nuclear physics. Having conceived, worked on and advocated certain types of accelerators and seeing them realised. Predicting a behaviour of the particle beam, and verifying it in experiments. Also, troubleshooting a serious problem, and after days and nights of toil, finding the origin of or solution to the problem.
In terms of challenges, at Fermilab right now we are working on very complex and challenging projects like LBNF/DUNE. It involves more than 1400 international collaborators preparing and building a technically complex facility almost one mile underground. The mere scale of the operation is enormous, but the pay-off is completing something unprecedented and enabling groundbreaking discoveries.
What are your goals as Fermilab director?
First and foremost, the completion of LBNF/DUNE to advance neutrino physics. Also, the completion of the remainder of the 2014 “P5” programmes, including the HL-LHC upgrades of the accelerator and CMS detector, and a new experiment at Fermilab called Mu2e. Looking to the future, when the next P5 report is completed, we will launch the next series of projects. Quantum technology is also a growing focus. Fermilab hosts one of five national quantum centres in addition to being a partner in a second one. We utilise our world-leading expertise in superconducting radiofrequency technology and instrumentation/control to advance quantum information technologies, as well as conducting unique dark-matter searches using this expertise.
Is being director different to what you imagined?
It takes a lot of hard work to build an excellent team – more than I had initially projected. But equally, our staff’s commitment, goodwill and dedication have also exceeded my expectations.
Which collider should follow the LHC, and what is the role of the US/Fermilab in realising such a project?
No matter which collider is chosen, there is still a lot of R&D required for any path concerning magnets, radiofrequency cavities and detectors. This R&D is crucial to multiple applications. I would advocate the development of these capabilities to push the state of the art for accelerators and detectors in the near future. Future colliders are an important component of the current Snowmass/P5 community planning exercise. Here, Fermilab is aligned with the previous P5 and is committed to following the next P5 recommendations.
How would you describe high-energy physics today compared to when you entered the field?
In the 1980s, the major building blocks of the Standard Model were largely in place, and the focus of the field was to experimentally verify many of its predictions. Today, the Standard Model is much more thoroughly tested, but there is evidence that it does not completely describe the whole picture.
A lot of present-day research is about physics beyond the Standard Model, including dark matter, dark energy and the question of matter–antimatter asymmetry in the universe. In parallel, technologies have advanced tremendously since the 1980s, enabling unprecedented precision, parameter reach and new discoveries. This applies to accelerators, telescopes, detectors and computing. The upcoming century promises a fascinating array of groundbreaking discoveries, all of which will fundamentally further our understanding of the universe.
What can be done to ensure that there are more female laboratory directors worldwide?
I think it is important to increase the pipeline, starting with efforts to attract young people in elementary and high school. We need to look at changing the cultural perspective that women can’t do STEM and get to the point where the entire culture is open-minded. We also need to change the make-up of the committees.
It is important to encourage females to take on leadership positions and then support and empower them with enlightened mentors. Once they develop their careers, we will have a much bigger pool for future lab directors. We must inspire and empower young girls and women to follow their dreams, and help them stay focused to succeed.
The proposed electron–positron Higgs and electroweak factory FCC-ee, a major pillar of the Future Circular Collider (FCC) study, is a leading contender for a flagship project at CERN to follow the LHC. Envisaged to be housed in a 91 km-long tunnel in the Geneva region, and to be followed by a high-energy hadron collider utilising the same infrastructure, it is currently the subject of a technical and financial feasibility study, as recommended by the 2020 update of the European strategy for particle physics.
Maximising energy efficiency is a major factor in the FCC design. Two new projects backed by the Swiss Accelerator Research and Technology collaboration (CHART) seek to reduce the environmental impact of FCC-ee by exploring the use of high-temperature superconductors in core accelerator technologies.
As in its predecessor, LEP, and indeed every lepton collider to date, the main magnet systems in the FCC-ee design are based on normal-conducting technology. While perfectly adequate from a magnetic-field point of view, normal-conducting magnets consume electricity through Ohmic heating. The FCC-ee focusing and defocusing elements, comprising about 3000 quadrupole magnets and 6000 sextupole magnets, are estimated to consume in excess of 50 MW when operating at the highest energies. This consumption could be greatly reduced if the magnet systems were made superconducting, particularly if high-temperature superconductors (HTS) were used. Whereas conventional superconductors such as the niobium-titanium used in the LHC must be cooled to extremely low temperatures (1.9–4.5 K), state-of-the-art HTS materials can operate at up to 90 K, significantly reducing the cryogenic power needed to keep them superconducting. The question remains whether high-performing HTS accelerator magnets, with all their advantages on paper, can be built in practice.
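The scale of the cryogenic saving can be gauged from the ideal (Carnot) limit for a refrigerator rejecting heat at room temperature – a textbook estimate only, since real cryoplants are several times less efficient than this ideal:

W/Q = (T_room − T_cold)/T_cold ≈ (300 − 4.5)/4.5 ≈ 66 W of input power per watt of heat removed at 4.5 K, compared with (300 − 90)/90 ≈ 2.3 W per watt at 90 K.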
In April 2022, the CHART executive board gave the green light to two projects investigating the feasibility of superconducting technology for the main magnet systems of FCC-ee. CHART was founded in 2016 as an umbrella collaboration for R&D activities in Switzerland, with CERN, PSI, EPFL, ETH-Zurich and the University of Geneva as present partners. The larger HTS4 project, involving CERN and PSI, will focus on superconducting magnets, while CPES (Cryogenic Power Electronic Supply) will focus on cryogenic power supplies, with partners ETHZ and PSI.
The use of HTS-based magnets could dramatically reduce the power drawn by the main quadrupole and sextupole systems for FCC-ee when operating at the highest centre-of-mass energies, explains HTS4 principal investigator Michael Koratzinos of PSI. Furthermore, he says, since HTS magnets do not need iron to shape the magnetic field, they can be made much lighter and can be nested inside one another to increase performance and flexibility in the optics design. “Turning FCC-ee superconducting not only helps with the reduction in operational costs and the environmental credentials of the accelerator, but it also helps society develop this new and exciting HTS technology with potential applications in everyday life.”
High demand
HTS conductors are currently in high demand, mainly from a multitude of privately-funded fusion projects, such as the SPARC project at MIT. Their main disadvantage is their high cost, but this is expected to come down as demand picks up. SPARC needs about 10,000 km of HTS conductor during the next few years, compared to an estimated 20,000 km for FCC-ee, although on a later time scale.
The ultimate aim of HTS4 is the production of a full-size prototype of one of the FCC-ee short-straight sections based on HTS technology. Four work packages will address: integration with the rest of the FCC-ee accelerator systems; enabling technologies on peripheral issues such as impregnation; the conceptual and technical design of a short demonstrator and a prototype; and the design, construction and testing of the full prototype module.
“Any future project at CERN and elsewhere relies on innovative R&D to minimise its electricity consumption,” says project leader of the FCC study Michael Benedikt of CERN. “We are doing our utmost at FCC to increase our energy efficiency.”
High-energy physics labs like CERN rely on the products and services of countless hi-tech companies, many of whom were represented at this year’s Big Science Business Forum, held in Granada, Spain, from 4–7 October 2022.
In this video, you can hear from Kacper Matuszyński, sales manager for Teledyne SP Devices, which makes high-performance digitisers for data acquisition. Based in Sweden, the company has been part of the multi-billion-dollar Teledyne Technologies since 2017.
“We specialise in high-speed systems, focusing on niches such as mass spectrometry, lidar or medical imaging,” says Matuszyński, speaking at the meeting in Granada. “We are a highly R&D-focused company, developing new products and offering our customised services.”
At a ceremony in the CERN Globe on 27 September, the winners of “Mining the Future” – a competition co-organised by CERN and the University of Leoben to identify the best way to handle excavated materials from the proposed Future Circular Collider (FCC) project – were announced. Launched in June 2021 within the framework of the European Union co-funded FCC Innovation Study, Mining the Future invited experts beyond the physics community to seek sustainable ways of reusing the heterogeneous sedimentary rock that would need to be excavated for the FCC infrastructure, which is centred on a 91 km-circumference tunnel in the Geneva basin. Twelve proposals, submitted by consortia of universities, major companies and start-ups, were reviewed based on their technological readiness, innovative potential and socioeconomic impact.
Following final pitches in the Globe by the four shortlisted entrants, a consortium led by Swiss firm BG Ingenieurs Conseils was awarded first prize – including support to the value of €40,000 to bring the technology to maturity – for their proposal “Molasse is the New Ore”. Using a near real-time flow analysis that has been demonstrated in cement plants, the proposal would see excavated materials immediately identified and separated for further processing on-site, treating them not as waste to be managed but as a resource, thereby serving environmental objectives and efficiency targets.
The runners-up were proposals led by: Amberg (to sort, characterise and redistribute the molasse into fractions of known compositions and recycle each material on a large scale locally); Briques Technique Concept (to produce bricks from the excavated material for the construction of nearby buildings); and Edaphos (to process the molasse into topsoil-like material in a process known as soil conditioning). Although only one winner was chosen, it emerged during the ceremony that an integrated approach combining all four shortlisted proposals would be a valid scenario for managing the estimated 7–8 million m³ of molasse that would be excavated for the FCC construction project.
“This is a key ingredient for the FCC feasibility study while also creating business opportunities for applying these technologies in different markets,” said competition creator Johannes Gutleber of CERN. “The proposals submitted in the course of the contest show that designing a new research infrastructure acts as an amplifier of ideas for society at large.”
Physics-based industries are as important to the Swiss economy as production or trade, concludes a new report by the Swiss Physical Society (SPS). Seeking to determine the impact of physics on Swiss society, and motivated by a similar Europe-wide study completed in 2019 by the European Physical Society, the SPS team, with support from the Swiss Academy of Natural Sciences and Swiss service-provider IMSD, carried out a statistical analysis revealing key indicators of the national value of physics.
Currently, states the report, the turnover of physics-based industries (PBI) in Switzerland is estimated to exceed CHF 274 billion, and is expected to grow further. PBI, defined as those industries that are strongly reliant on modern technologies developed by physicists, were divided into 11 categories ranging from pharmaceuticals and medical instruments to electricity supply and general manufacturing. The share of PBI in Switzerland’s gross value added (GVA) was found to be CHF 91.5 billion, or 13% of the total for 2019, while the number of full-time-equivalent jobs was 417,000 (9.8%). Furthermore, the specific GVA for PBI increased by 6.3% from 2015 to 2019 – almost three times the average increase among all economic-activity sectors during the same period.
Not included in these figures are the contributions of physicists who are employed in other industries, nor additional economic impact due to downstream effects such as household spending associated with economic activity in PBI. Estimating the GVA multiplier associated with the impact of PBI to be between 2.31 and 2.49, the report concludes that every CHF 1.00 of direct physics-related output contributes CHF 2.31 to 2.49 to the economy-wide output. Beyond economic impact, the report also evaluated the contribution of education and innovation to Swiss society, and highlighted ways in which to address the shortage of skilled workers and the gender gap.
“The impact physics has on society has been studied multiple times in a variety of countries and all arrive at the same conclusion: economic success in a modern, technology-driven society is the fruit of long-term support for physics in education and research,” says former SPS president Hans Peter Beck. “Innovative ideas that come out of fundamental, curiosity-driven research are at the source of what leads to success in society.”