Topics

Yves Baconnier 1934–2024

Yves Baconnier, who made important technical and managerial contributions to a surprising number of CERN accelerators, passed away on 21 January 2024.

Born in 1934 in the Ardèche in the South of France, Yves completed his studies at the Institut Polytechnique de Grenoble. He joined CERN in 1963 as engineer-in-charge of the Proton Synchrotron (PS) and quickly took a strong interest in analysing and improving the slow-ejection procedure. He became leader of the machine study team before moving to the Super Proton Synchrotron (SPS) project in the early 1970s, where his experience with beam extraction was very welcome.

Once the SPS extraction was operational at the end of the 1970s, Yves moved on to the Large Electron Positron (LEP) collider and, in particular, its injection system, which at the beginning was imagined as a new system without a link to the existing accelerators. His decisive idea to use the dead time between the proton cycles of the SPS – dictated by limited cooling power – to insert low-dissipation e+e– acceleration cycles from 3.5 to 20 GeV was the key element in accepting the existing PS and SPS accelerator chain, with all its infrastructure, as the LEP injector. This cut short all discussions on other possible LEP sites in Europe.

After this memorable success with LEP, Yves moved back to his first love, the PS, and took responsibility for the PS ring proper to define and oversee an upgrade programme enabling the elderly machine to accelerate electrons and positrons from 0.6 to 3.5 GeV. The complete vacuum system had to be modified to withstand the synchrotron radiation emitted from the lepton beams, and the campaign reached its climax during a very long shutdown in 1987, during which a stainless-steel vacuum chamber was installed around the ring. Since the PS magnets had combined-function focusing, i.e. a quadrupole magnetic field on top of the dipole field, the synchrotron radiation would not allow for stable operation. To counter this, two Robinson-type wiggler magnets had to be inserted. Yves and his team designed this unique magnet, tested the prototype in the PS and at the DCI ring at LAL in Orsay and, finally, introduced it successfully in the PS.

In the early 1990s, Yves went on to join the teams designing a beauty factory to be housed in the tunnel of the former Intersecting Storage Rings, and took the lead in the design of a tau–charm factory to be built on a green-field site in Spain. When these projects did not materialise, he continued his work at the CLIC test facility, into which the linear accelerator of LEP had to be converted. Already in 1984, in parallel with his other widespread activities, Yves had taken an active interest in the LHC design study, which led to the project’s official approval in 1994. He moved into project management in the mid-1990s and was entrusted with chairing the influential LHC parameter committee until his retirement in 1999.

Yves will be remembered for his thorough and well-thought-out approach to his work, always seeking to understand ab initio, and for his meticulous insistence on checking hardware through prototyping and extended testing. Unassuming but sharp and exacting, he was a well-respected colleague, an appreciated lecturer and a leader with wide-ranging interests.

Stefano Catani 1958–2024

Stefano Catani, a theoretical particle physicist in the Florence section of Istituto Nazionale di Fisica Nucleare (INFN), passed away on 16 January 2024. Stefano was one of the world’s leading experts in quantum chromodynamics (QCD) and its phenomenological application to high-energy collider physics, leaving an irreplaceable void among his colleagues, friends and family.

Stefano studied physics at the University of Florence and obtained his PhD in 1987 under the supervision of Marcello Ciafaloni, who passed away in September 2023. He was a postdoctoral fellow at the University of Cambridge from 1989 to 1991, and a member of CERN’s theory division from 1991 to 1993. After 1993 he developed his scientific career at INFN Florence, with a period as a CERN staff member between 1997 and 2002.

Discussing physics with Stefano was a fantastic experience. His depth and vision were simply unique. He was one of the great pioneers in the development of QCD as a precision science, thanks to his extraordinary ability to embrace the entire field seamlessly, from the physics of “soft” gluons and their resummation to the hard perturbative regime. His research achievements are internationally recognised as being fundamental to the success of the high-energy collider physics programme, in particular for precision studies of the Higgs boson and the top quark.

His work is recognised as fundamental to the success of the high-energy collider physics programme

Among his most important contributions are the formulation of jet clustering algorithms at lepton and hadron colliders (a key component of most experimental analyses), a general expression for the determination of the infrared singularities of scattering amplitudes (the so-called Catani formula), the design of general algorithms for the perturbative calculation of cross sections and differential observables, which have become a standard in the community (the well-known Catani–Seymour dipole subtraction and the qT subtraction schemes), and the innovative Catani–Krauss–Kuhn–Webber algorithm for Monte Carlo simulations of many-jet processes.

Stefano’s work was especially motivated by the application of QCD to collider data. He was convinced that our understanding of QCD singularities could be formulated in such a way that any user, not just dedicated experts, could perform a next-to-leading-order calculation of any suitable observable. He also studied factorisation properties and coherence effects in the high-energy limit (the Catani–Ciafaloni–Fiorani–Marchesini equation) and proposed a generalisation of collinear factorisation that accounts for potential factorisation breaking effects at very high perturbative orders. The countless messages received from collaborators and colleagues all over the world, affected by the premature loss of a dear friend and extraordinary colleague, highlight Stefano’s great qualities of generosity, human warmth and scientific rigour that will be sorely missed by all.

Jacques Haïssinski 1935–2024

Jacques Haïssinski, who played an important role in major particle-physics experiments, passed away on 25 March 2024 at the age of 89. His father Moïse worked with Marie Curie and had been a long-time collaborator of her daughter Irène Joliot-Curie.

Jacques entered the Ecole Normale Supérieure in 1954 and later went to Stanford, where he worked under Burton Richter on the pioneering colliding-beam machine, which used two storage rings to collide electrons in flight. After his military service, Jacques joined the Laboratoire de l’Accélérateur Linéaire in Orsay to undertake a doctorate on the AdA (Anello di Accumulazione) ring. Built in Frascati from an idea of Bruno Touschek to collide in-flight electrons and positrons stored in the same vacuum chamber, AdA had been brought to Orsay by Pierre Marin to take advantage of the high intensity of the linac beams. Jacques mastered all aspects of the ground-breaking experiment and succeeded in detecting the very first in-flight electron–positron collisions in 1963.

In accelerator physics, following a discovery on the ACO ring at Orsay, Jacques published, in 1967, a basic paper on the longitudinal equilibrium of particles in a storage ring that contained the now widely used “Haïssinski equation”. He also collaborated with Stanford on the commissioning of SPEAR and later SLC, the very first and so-far only linear collider. In phenomenology, following Touschek, he led a programme on radiative corrections and later gave lectures on this subject in preparation for LEP at Ecole de Gif in 1989.

But the main scientific activity of Jacques Haïssinski was experimental particle physics. He took part in many experiments, directed theses in Orsay on ACO, and was spokesperson of the CELLO experiment at DESY. During the construction of LEP, Jacques served as chairperson of the LEP committee at CERN.

After LEP, Jacques turned his interests to astroparticle physics and cosmology, notably giving courses on the subject and collaborating on the EROS experiment and the Planck mission. During that time, he also took responsibilities in the management of Paris-Sud University (at Orsay), and later as a leader in IN2P3 and in the Saclay Laboratory DAPNIA (now IRFU). His leadership was greatly appreciated by the French high-energy physics community.

An outstanding teacher, Jacques also campaigned for the dissemination of knowledge to the public. He was a great humanist who was deeply concerned with social injustice and criminal wars. He presented his views publicly and believed that other physicists should do so. Generous with his precious time, he was always available to pass on his knowledge and vast scientific culture. He marked and inspired several generations of particle and accelerator physicists.

Mats Lindroos 1961–2024


Mats Lindroos, who made major contributions to accelerator technology, passed away on 2 May 2024 aged just 62.

Mats received his PhD in subatomic physics from Chalmers University of Technology in Gothenburg, Sweden in 1993 under the supervision of Björn Jonson. As a PhD student he studied decay properties and hyperfine interactions from oriented nuclei, making use of the low-temperature nuclear orientation facilities at ISOLDE, Daresbury and Studsvik. He joined CERN as a research fellow in 1993 and became a staff member in 1995.

While at CERN, Mats filled a number of diverse roles including being responsible for PS Booster operation and the technical coordination of the ISOLDE facility. He was one of the driving forces behind the HIE-ISOLDE project that commenced construction in 2009 and is now one of the major accelerated radioactive beam facilities worldwide. While at CERN he also played leading roles in several European Union-supported design studies for future conceptual accelerator facilities: the nuclear-physics radioactive beam facility EURISOL and the beta-beam neutrino factory. 

In 2009, when Sweden and Denmark were selected to be the host countries for the European Spallation Source (ESS), Mats returned to his roots in Sweden on secondment from CERN, formally joining the ESS in 2015. As one of the earliest members of the ESS organisation, he was responsible for establishing the nascent accelerator organisation as well as the accelerator collaboration between major European accelerator laboratories across 10 countries, set up in a CERN-like fashion, to undertake the technical design of this important part of the facility. Mats led the technical design for the 5 MW proton linac of the ESS, and from 2013 as head of the 100-strong accelerator division he led the linac project that is now in the late stages of construction and installation. Even after stepping down from his leadership roles because of illness, he enthusiastically accepted a new one to advise the ESS management. He was fully involved in the process, and undoubtedly would have been instrumental in guiding the future evolution of the facility.

He set up a CERN-like collaboration between major European accelerator laboratories across 10 countries

As a globally recognised expert on accelerator technology, Mats served on many committees in an advisory role, such as the IJCLab strategic advisory board (France), IN2P3 scientific committee (France), J-PARC technical advisory committee (Japan), PIP-II Fermilab technical advisory committee (US) and CERN’s scientific policy committee. As an adjunct professor at Lund University he enjoyed teaching and supervising students in addition to his numerous research, management and committee roles. Despite all these work activities, Mats found time to oversee, together with his partner Anette, the construction of a house on the south Swedish coast, where they enjoyed walking, gardening and being active in the local community.

Mats has touched all our lives with his energy and passion for research, his creativity for new ideas, his worldly knowledge, his sense of humour, and most importantly, his humanity and kindness. He will be greatly missed by all of us who had the privilege to count him as a friend and colleague.

Shy charm mesons confound predictions

ALICE figure 1

In the past two decades, it has become clear that three-quark baryons and quark–antiquark mesons cannot describe the full spectrum of hadrons. Dozens of exotic states have been observed in the charm sector alone. These states are interpreted either as compact objects with four or five valence quarks or as hadron molecules; however, their inner structures remain uncertain due to the complexity of calculations in quantum chromodynamics (QCD) and the lack of direct experimental measurements of the residual strong interaction between charm and light hadrons. New femtoscopy measurements by the ALICE collaboration challenge theoretical expectations and the current understanding of QCD.

Femtoscopy is a well-established method for studying the strong interaction between hadrons, based on measuring particle pairs with small relative momentum. In high-energy collisions of protons at the LHC, the distance between such hadrons at the time of production is about one femtometre, which is within the range of the strong nuclear force. From the momentum correlations of emitted particle pairs, one extracts the scattering length, a0, which quantifies the final-state strong interaction between the two hadrons – giving access even to short-lived hadrons such as D mesons.
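As a rough illustration of the method (not the ALICE analysis itself), the correlation function C(k*) can be built as the ratio of the relative-momentum distribution of same-event pairs to that of uncorrelated mixed-event reference pairs. The toy spectra and the assumed low-k* enhancement below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Mixed-event (uncorrelated) reference pairs: relative momentum k* in GeV/c.
mixed = rng.rayleigh(scale=0.3, size=500_000)

# Same-event pairs: reweight the reference spectrum by an assumed correlation
# C(k*) = 1 + 0.5 exp(-k*/0.05), mimicking an attractive final-state
# interaction at low k* (purely illustrative numbers), via rejection sampling.
proposal = rng.rayleigh(scale=0.3, size=500_000)
weight = 1.0 + 0.5 * np.exp(-proposal / 0.05)
same = proposal[rng.random(proposal.size) * weight.max() < weight]

# Measured correlation function: ratio of the normalised same-event to
# mixed-event k* distributions.
bins = np.linspace(0.0, 1.0, 51)
s, _ = np.histogram(same, bins=bins)
m, _ = np.histogram(mixed, bins=bins)
c = (s / s.sum()) / np.where(m > 0, m / m.sum(), np.nan)

# c is enhanced at low k* and approaches 1 where the pairs are uncorrelated.
print(c[0], c[-5])
```

In a real measurement, C(k*) would then be fitted with an interaction model to extract the scattering length a0.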

The scattering lengths are significantly smaller than the theoretical predictions

The ALICE collaboration has now, for the first time, measured the interaction of open-charm mesons (D+ and D*+) with charged pions and kaons for all the charge combinations. The momentum correlation functions of each system were measured in proton–proton collisions at the LHC at a centre-of-mass energy of 13 TeV. As predicted by heavy-quark spin symmetry, the scattering lengths of Dπ and D*π agree with each other, but they are found to be significantly smaller than the theoretical predictions (figure 1). This implies that the interaction between these mesons can be fully explained by the Coulomb force, the contribution from strong interactions being negligible within experimental precision. The small measured values of the scattering length challenge our understanding of the residual strong force of heavy-flavour hadrons in the non-perturbative limit of QCD.

These results also have an important impact on the study of the quark–gluon plasma (QGP) – a deconfined state of matter created in ultra-relativistic heavy-ion collisions. The rescattering of D mesons with the other hadrons (mostly pions and kaons) created in such collisions was thought to modify the D-meson spectra, in addition to the modification expected from the QGP formation. The present ALICE measurement demonstrates, however, that the effect of rescattering is expected to be very small.

More precise and systematic studies of charm–hadron interactions will be carried out with the upgraded ALICE detector in the upcoming years.

LHCb targets rare radiative decay

LHCb figure 1

Rare radiative b-hadron decays are powerful probes of the Standard Model (SM), sensitive to small deviations caused by potential new physics in virtual loops. One such process is the decay B0s→ μ+μ–γ. The dimuon decay of the B0s meson is known to be extremely rare and has been measured with unprecedented precision by LHCb and CMS. While performing this measurement, LHCb also studied the B0s→ μ+μ–γ decay, partially reconstructed due to the missing photon, as a background component of the B0s→ μ+μ– process, and set the first upper limit on its branching fraction of 2.0 × 10–9 at 95% CL (red arrow in figure 1). However, this search was limited to the high-dimuon-mass region, whereas several theoretical extensions of the SM could manifest themselves in lower regions of the dimuon-mass spectrum. Reconstructing the photon is therefore essential to explore the spectrum thoroughly and probe a wide range of physics scenarios.

The LHCb collaboration now reports the first search for the B0s→ μ+μ–γ decay with a reconstructed photon, exploring the full dimuon mass spectrum. Photon reconstruction poses additional experimental challenges, such as degrading the mass resolution of the B0s candidate and introducing additional background contributions. To meet these challenges, machine-learning algorithms and new variables have been specifically designed to discriminate the signal from background processes with similar signatures. The analysis is performed separately in three dimuon mass ranges to exploit any differences along the spectrum, such as the ϕ(1020) meson contribution in the low-invariant-mass region. The μ+μ–γ invariant mass distributions of the selected candidates are fitted, including all background contributions and the B0s→ μ+μ–γ signal component. Figure 2 shows the fit for the lowest dimuon mass region.

LHCb figure 2

No significant signal of B0s→ μ+μ–γ is found in any of the three dimuon mass regions, consistent with the background-only hypothesis. Upper bounds on the branching fraction are set and are shown as the black arrows in figure 1. The mass fit is also performed for the combined candidates of the three dimuon mass regions, yielding a combined upper limit on the branching fraction of 2.8 × 10–8 at 95% CL.

The SM theoretical predictions of b decays become particularly difficult to calculate when a photon is involved, and they have large uncertainties due to the B0s→ γ local form factors. The B0s→ μ+μ–γ decay provides a unique opportunity to validate the different theoretical approaches, which do not agree with each other, as shown by the coloured bands in figure 1. Theoretical calculations of the branching fractions are currently below the experimental limits. The upgraded LHCb detector and the increased luminosity of the LHC’s Run 3 are providing conditions for studying rare radiative b-hadron decays with greater precision and, eventually, for finding evidence for the B0s→ μ+μ–γ decay.

A logical freight train

Steven Weinberg was a logical freight train – for many, the greatest theorist of the second half of the 20th century. It is timely to reflect on his legacy, the scientific component of which is laid out in a new collection of his publications selected by theoretical physicist Michael Duff (Imperial College).

Six chapters cover Weinberg’s most consequential contributions to effective field theory, the Standard Model, symmetries, gravity, cosmology and short-form popular science writing. I can’t identify any notable omissions and I doubt many others would, though some may raise an eyebrow at the exclusion of his paper deriving the Lee–Weinberg bound. Duff brings each chapter to life with first-hand anecdotes and details that will delight those of us most distant from the historical events. I am relatively young, and had only one meaningful interaction with Steven Weinberg. Though my contemporaries and I inhabit a scientific world whose core concepts are interwoven with, if not formed by, Steven Weinberg’s scientific legacy, unlike Michael Duff we are poorly qualified to comment historically on the ecosystem in which this legacy grew, or on aspects of his personality. This makes Duff’s commentary particularly valuable to younger readers.

I can envisage three distinct audiences for this new collection. The first is lay theorists – those who are widely enough read to recognise the depth of Weinberg’s impact in theoretical physics and would like to know more. Such readers will find Duff’s introductions insightful and entertaining – helpful preparation for the more technical aspects of the papers, though expertise is required to fully grapple with many of them. There are also a few hand-picked non-technical articles one would otherwise not encounter without some serious investigative effort, including accessible articles on quantum field theory, effective field theory and life in the multiverse, in addition to the dedicated section on popular articles. These will delight any theory aficionado.

The second audience is practising theorists. If you’re going to invest in a printed collection of publications, then Weinberg is an obvious protagonist. Particle theorists consult his articles so often that they may as well have them close at hand. This collection contains those most often revisited and ought to be useful in this respect. Duff’s introductions also expose technical interconnections between the articles that might otherwise be missed.

Steven Weinberg: Selected Papers

The third audience I have in mind are beginning graduate students in particle theory, cosmology and beyond. It would not be a mistake to put this collection on recommended reading lists. In due course, most students should read many of these papers multiple times, so why not get on with it from the get-go? The section on effective field theories (EFTs) contains many valuable key ideas and perspectives. Plenty of those core concepts are still commonly encountered more by osmosis than with any rigour, and this can lead to confused notions around the general approach of EFT. Perhaps an incomplete introduction to EFT could be avoided for graduate students by cutting straight to the fundamentals contained here? The cosmology section also reveals many important modern concepts alongside lucid and fearless wrestling with big questions. The papers on gravity detail techniques that are frequently encountered in any first foray into modern amplitudology, as well as strategies to infer general lessons in quantum field theory from symmetries and self-consistency alone.

In my view, however, the most important section for beginning graduate students is that on the construction of the Standard Model (SM). It may be said that a collective amnesia has emerged regarding the scientific spirit that drove its development. The SM was built by model builders. I don’t say this facetiously. They made educated guesses about the structure of the “ultraviolet” (microscopic) world based on the “infrared” (long-distance) breadcrumbs embedded within low-energy experimental observations. Decades after this swashbuckling era came to an end, there is a growing tendency to view the SM as something rigid, providentially bestowed and permanent. The academic bravery and risk-taking that was required to take the necessary leaps forward then, and which may be required now, is no better demonstrated than in “A Model of Leptons”. All young theorists should read this multiple times. “A Model of Leptons” shows that Steven Weinberg was not only an unstoppable force of logic but also a plucky risk-taker. It’s inspirational that its final paragraph, which laid out the structure of nature at the electroweak scale, ends with doubt and speculation: “And if this model is renormalisable, then what happens when we extend it to include the couplings of A and B to the hadrons?” By working their way through this collection, graduate students may be inspired to similar levels of ambition and jeopardy.

Amongst the greatest scientists of the last century

In the weeks that followed the passing of Steven Weinberg, I sensed amongst a number of colleagues of all generations some moods that I could have anticipated: the loss not only of a bona fide truth-seeker, but also of a leader – frequently the leader. I also perceived a feeling that transcended the scientific realm alone, of someone whose creative genius ought to be recognised amongst the greatest scientists, musicians, artists and humanists of the last century. How can we productively reflect on that? I imagine we would all do well not only to learn of Weinberg’s important individual scientific insights, but also to attempt to absorb his overall methodology in identifying interesting questions, in breaking new trails in fundamental physics, and in pursuing logic and clarity wherever they may take you. This collection is not a bad place to start.

ATLAS turbocharges event simulation

ATLAS figure 1

As the harvest of data from the LHC experiments continues to increase, so does the required number of simulated collisions. This is a resource-intensive task as hundreds of particles must be tracked through complex detector geometries for each simulated physics collision – and Monte Carlo statistics must typically exceed experimental statistics by a factor of 10 or more, to minimise uncertainties when measured distributions are compared with theoretical predictions. To support data taking in Run 3 (2022–2025), the ATLAS collaboration therefore developed, evaluated and deployed a wide array of detailed optimisations to its detector-simulation software.
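The factor-of-ten rule of thumb follows from simple error propagation: in any bin of a data/MC comparison, the finite simulated sample adds its own Poisson uncertainty in quadrature with that of the data. A minimal sketch, with an invented bin count for illustration:

```python
import math

def ratio_uncertainty(n_data, mc_factor):
    """Relative statistical uncertainty on a data/MC ratio in one bin,
    with mc_factor times more simulated events than recorded events."""
    n_mc = mc_factor * n_data
    return math.sqrt(1.0 / n_data + 1.0 / n_mc)

n = 10_000  # events in some measurement bin (illustrative)
for f in (1, 10, 100):
    # Inflation of the uncertainty relative to the data-only limit 1/sqrt(n).
    inflation = ratio_uncertainty(n, f) / math.sqrt(1.0 / n)
    print(f"MC = {f:>3} x data: uncertainty inflated by {inflation:.3f}")
```

With ten times the data statistics, the simulation inflates the total statistical uncertainty by only sqrt(1.1) ≈ 5%, which is why the factor of 10 is a common target.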

The production of simulated data begins with the generation of particles produced within the LHC’s proton–proton or heavy-ion collisions, followed by the simulation of their propagation through the detector and the modelling of the electronics signals from the active detection layers. Considerable computing resources are consumed when hadrons, photons and electrons enter the electromagnetic calorimeters and produce showers with many secondary particles, whose trajectories and interactions with the detector material must be computed. The complex accordion geometry of the ATLAS electromagnetic calorimeter makes the Geant4 simulation of shower development in the calorimeter system particularly compute-intensive, accounting for about 80% of the total simulation time for a typical collision event.

Since computing costs money and consumes electrical power, it is highly desirable to speed up the simulation of collision events without compromising accuracy. For example, considerable CPU resources were previously spent in the transportation of photons and neutrons; this has been mitigated by randomly removing 90% of the photons (neutrons) with energy below 0.5 (2) MeV and scaling up the energy deposited from the remaining 10% of low-energy particles. The simulation of photons in the finely segmented electromagnetic calorimeter took considerable time because the probabilities for each possible interaction process were calculated every time photons crossed a material boundary. That calculation time has been greatly reduced by using a uniform geometry with no photon transport boundaries and by determining the position of simulated interactions using the ratio of the cross sections in the various material layers. The combined effect of the optimisations brings an average speed gain of almost a factor of two.
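The low-energy culling trick (often called “Russian roulette” in shower simulation) is easy to sketch. The toy energy spectrum below is invented, and only the photon case is shown; the thresholds mirror the numbers quoted in the text, while the real implementation lives inside ATLAS’s Geant4 setup:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy secondary photons from a shower: energies in MeV (illustrative spectrum).
energies = rng.exponential(scale=1.0, size=1_000_000)

THRESHOLD = 0.5   # MeV, below which photons are candidates for culling
SURVIVAL = 0.1    # keep 1 in 10 low-energy photons

low = energies < THRESHOLD
survive = rng.random(energies.size) < SURVIVAL
keep = ~low | survive

# Surviving low-energy photons carry a weight of 1/SURVIVAL, so the energy
# they deposit is scaled up and the total is preserved on average.
weights = np.where(low & keep, 1.0 / SURVIVAL, 1.0)

tracked = keep.sum()                      # ~90% of low-energy photons dropped
total = (energies * weights)[keep].sum()  # unbiased estimate of the full sum
print(tracked / energies.size, total / energies.sum())
```

The saving comes from no longer transporting the culled photons at all, at the cost of a small extra statistical fluctuation in the deposited energy.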

ATLAS has also successfully used fast-simulation algorithms to leverage the available computational resources. Fast simulation aims at avoiding the compute-expensive Geant4 simulation of calorimeter showers by using parameterised models that are significantly faster and retain most of the physics performance of the more detailed simulation. However, one of the major limitations of the fast simulation employed by ATLAS during Run 2 was the insufficiently accurate modelling of physics observables such as the detailed description of the substructure of jets reconstructed with large-radius clustering algorithms.

AtlFast3 offers fast, high-precision physics simulations

For Run 3, ATLAS has developed a completely redesigned fast simulation toolkit, known as AtlFast3, which performs the simulation of the entire ATLAS detector. While the tracking systems continue to be simulated using Geant4, the energy response in the calorimeters is simulated using a hybrid approach that combines two new tools: FastCaloSim and FastCaloGAN.

FastCaloSim parametrises the longitudinal and lateral development of electromagnetic and hadronic showers, while the simulated energy response from FastCaloGAN is based on generative adversarial neural networks that are trained on pre-simulated Geant4 showers. AtlFast3 effectively combines the strengths of both approaches by selecting the most appropriate algorithm depending on the properties of the shower-initiating particles, tuned to optimise the performance of reconstructed observables, including those exploiting jet substructure. As an example, figure 1 shows that the hybrid AtlFast3 approach very accurately reproduces the number of constituents of reconstructed jets as simulated with Geant4.

With its significantly improved physics performance and a speedup between a factor of 3 (for Z → ee events) and 15 (for high-pT di-jet events), AtlFast3 will play a crucial role in delivering high-precision physics simulations of ATLAS for Run 3 and beyond, while meeting the collaboration’s budgetary compute constraints.

The inventive pursuit of UHF gravitational waves

Since their first direct detection in 2015, gravitational waves (GWs) have become pivotal in our quest to understand the universe. The ultra-high-frequency (UHF) band offers a window to discover new physics beyond the Standard Model (CERN Courier March/April 2022 p22). Unleashing this potential requires theoretical work to investigate possible GW sources and experiments with far greater sensitivities than those achieved today.

A workshop at CERN from 4 to 8 December 2023 leveraged impressive experimental progress in a range of fields. Attended by nearly 100 international scientists – a noteworthy increase from the 40 experts who attended the first workshop at ICTP Trieste in 2019 – the workshop showcased the field’s expanded research interest and collaborative efforts. Concretely, about 10 novel detector concepts have been developed since the first workshop.

One can look for GWs in a few different ways: observing changes in the space between detector components, exciting vibrations in detectors, and converting GWs into electromagnetic radiation in strong magnetic fields. Substantial progress has been made in all three experimental directions.

Levitating concepts

The leading concepts for the first approach involve optically levitated sensors, such as high-aspect-ratio sodium–yttrium–fluoride prisms, and semi-levitated sensors, such as thin silicon or silicon–nitride nanomembranes in long optical resonators. These technologies are currently under study by various groups in the Levitated Sensor Detectors collaboration and at DESY.

For the second approach, the main focus is on millimetre-scale quartz cavities similar to those used in precision clocks. A network of such detectors, known as GOLDEN, is being planned, involving collaborations among UC Davis, University College London and Northwestern University. Superconducting radio-frequency cavities also present a promising technology. A joint effort between Fermilab and DESY is leveraging the existing MAGO prototype to gain insights and design further optimised cavities.

Regarding the third approach, a prominent example is optical high-precision interferometry combined with a series of accelerator dipole magnets similar to those used in the light-shining-through-a-wall axion-search experiment ALPS II (Any Light Particle Search II), or in the axion helioscope CAST and its planned successor IAXO. In fact, ALPS II is anticipated to commence a dedicated GW search in 2028. Additionally, other notable concepts inspired by axion dark-matter searches involve toroidal magnets, exemplified by experiments like ABRACADABRA, or solenoidal magnets such as those of BASE or MADMAX.

All three approaches stand to benefit from burgeoning advances in quantum sensing, which promise to enhance sensitivities by orders of magnitude. In this landscape, axion dark-matter searches and UHF GW detection are poised to work in close collaboration, leveraging quantum sensing to achieve unprecedented results. Concepts that demonstrate synergies with axion-physics searches are crucial at this stage, and can be facilitated by incremental investments. Such collaboration builds awareness within the scientific community and presents UHF searches as an additional, compelling science case for their construction.

The workshop showcased the field's expanded research interest and collaborative efforts.

Cross-disciplinary research is also crucial to understand cosmological sources of, and constraints on, UHF GWs. For the former, our understanding of primordial black holes has significantly matured, transitioning from preliminary estimates to a robust framework. Additional sources, such as hyperbolic encounters and exotic compact objects, are also gaining clarity. For the latter, the workshop highlighted how strong magnetic fields in the universe, such as those in extragalactic voids and planetary magnetospheres, can help set limits on the conversion between electromagnetic and gravitational waves.

Despite much progress, the sensitivity needed to detect UHF GWs remains a visionary goal, requiring the constant pursuit of inventive new ideas. To aid this, the community is taking steps to be more inclusive. The living review produced after the first workshop (arXiv:2011.12414) will be revised to be more accessible for people outside our community, breaking down detector concepts into fundamental building blocks for easier understanding. Plans are also underway to establish a comprehensive research repository and standardise data formats. These initiatives are crucial for fostering a culture of open innovation and expanding the potential for future breakthroughs in UHF GW research. Finally, a new, fully customisable and flexible GW plotter including the UHF frequency range is being developed to benefit the entire GW community.

The journey towards detecting UHF GWs is just beginning. While current sensitivities are not yet sufficient, the community’s commitment to developing innovative ideas is unwavering. With the collective efforts of a dedicated scientific community, the next leap in gravitational-wave research is on the horizon. Limits exist to be surpassed!

New Challenges and Opportunities in Physics Education

New Challenges and Opportunities in Physics Education presents itself as a guidebook for high-school physics educators who are navigating modern challenges in physics education. But whether you're teaching the next generation of physicists, exploring the particles of the universe, or are simply interested in the evolution of physics education, this book promises valuable insights. It doesn't aim to cater to all equally, but rather to offer a spark of inspiration to a broad spectrum of readers.

The book is structured in two distinctive sections on modern physics topics and the latest information and communication technologies (ICTs) for classrooms. The editors bring together a diverse blend of expertise in modern physics, physics education and interdisciplinary approaches. Marilena Streit-Bianchi and Walter Bonivento are well-known names in high-energy physics, with long and successful careers at CERN. In parallel, Marisa Michelini and Matteo Tuveri are pushing the limits of physics education with modern educational approaches and contemporary topics. All four are committed to making physics education engaging and relevant to today's students.

The first part presents the core concepts of contemporary physics through a variety of narrative techniques, from historical recounting to imaginary dialogues, providing educators with a toolbox of resources to engage students in various learning scenarios. Does the teacher want to "flip the classroom" and assign some reading? They can turn to Salvatore Esposito's account of the scientific contributions of Enrico Fermi. Does the teacher want to encourage discussions? Mariano Cadoni and Mauro Dorato have them covered with a unique piece, "Gravity between Physics and Philosophy", which can support interdisciplinary classroom discussions.

The second half of the book starts with an overview of ICT resources and classical-physics examples of how to use them in a classroom setting. The authors then explore the skills that teachers and students need to use ICTs effectively. The transition to ICT feels a bit too long, and the book struggles to weave the two sections into a cohesive narrative, but the second half nevertheless captures the title of the book perfectly – ICTs are the epitome of new opportunities in physics education. While much has been said about them in other works, this book offers a cherry-picked but well-rounded collection of ideas for enhancing educational experiences.

The authors not only emphasise modern physics and technology, but also another very important characteristic of modern science: collaboration. This is an important message that we need to convey to students, as historical examples drawn solely from classical physics can present an elitist view of physics. The book often explicitly reframes lone-genius narratives into a collaborative understanding of breakthroughs.

The book would not be complete without input from actual teachers. One notable contribution is by Michael Gregory, a particle-physics educator who shares his experiences with distance learning together with Steve Goldfarb, the former IPPOG co-chair. During the pandemic, he used online tools to convey physics concepts not only to his own students, but to students and teachers around the world. His successful virtual science camps and online particle-physics courses reached frequently overlooked audiences in remote locations.

Overall, New Challenges and Opportunities in Physics Education emerges as a valuable resource for a diverse audience. It is a guidebook for educators searching for innovative strategies to spice up their physics teaching or to better weave modern science into their lessons. Although it might fall short of flawlessly joining the modern-physics content of the first half with the educational elements of the second, its value is undeniable. The first part, in particular, serves as a treasure trove not only for educators but also for science communicators and even particle physicists seeking to engage with the public, using the common ground of high-school physics knowledge.
