

Standardising sustainability: step one

For a global challenge like environmental sustainability, there is no substitute for international cooperation. In September, the Sustainability Working Group, part of the Laboratory Directors Group (LDG), took a step forward by publishing a report on standardising the evaluation of the carbon impact of accelerator projects. The report challenges the community to align on a common methodology for assessing sustainability and to define a small number of figures of merit that future accelerator facilities must report.

“There’s never been this type of report before,” says Maxim Titov (CEA Saclay), who co-chairs the LDG Sustainability Working Group. “The LDG Working Group consisted of representatives with technical expertise in sustainability evaluation from large institutions including CERN, DESY, IRFU, INFN, NIKHEF and STFC, as well as experts from future collider projects who signed off on the numbers.”

The report argues that carbon assessment cannot be left to the end of a project. Instead, facilities must evaluate their lifecycle footprint starting from the early design phase, all the way through construction, operation and decommissioning. Studies already conducted on civil-engineering footprints of large accelerator projects outline a reduction potential of up to 50%, says Titov.

In terms of accelerator technology, the report highlights cooling, ventilation, cryogenics, the RF cavities that accelerate charged particles and the klystrons that power them as the largest sources of inefficiency. The report places particular emphasis on klystrons, identifying three high-efficiency designs currently under development that could boost their efficiency from 60 to 90% (CERN Courier May/June 2025 p30).

Carbon assessment cannot be left to the end of a project

The report also addresses the growing footprint of computing and AI. Training algorithms on more efficient hardware and adapting trigger systems to reduce unnecessary computation are identified as ways to cut energy use without compromising scientific output.

“You need to perform a life-cycle assessment at every stage of the project in order to understand your footprint, not just to produce numbers, but to optimise the design and improve it in discussions with policymakers,” emphasises Titov. “Conducting sustainability assessments is a complex process, as the criteria have to be tailored to the maturity of each project and developed separately for scientists, policymakers and society.”

Established by the CERN Council, the LDG is an international coordination body that brings together directors and senior representatives of the world’s major accelerator laboratories. Since 2021, the LDG has been composed of five expert panels: high-field magnets, RF structures, plasma and laser acceleration, muon colliders and energy-recovery linacs. The Sustainability Working Group was added in January 2024.

NuFact prepares for a precision era

The 26th edition of the International Workshop on Neutrinos from Accelerators (NuFact) attracted more than 200 physicists to Liverpool from 1 to 6 September. There was no shortage of topics to discuss. Delegates debated oscillations, scattering, accelerators, muon physics, beyond-PMNS physics, detectors, and inclusion, diversity, equity, education and outreach (IDEEO).

Neutrino physics has come a long way since the discovery of neutrino oscillations in 1998. Experiments now measure oscillation parameters with a precision of a few per cent. At NuFact 2025, the IceCube collaboration reported new oscillation measurements using atmospheric neutrinos from 11 years of observations at the South Pole. The measurements achieve world-leading sensitivity on neutrino mixing angles, alongside new constraints on the unitarity of the neutrino mixing matrix. Meanwhile, the JUNO experiment in China celebrated the start of data-taking with its liquid-scintillator detector (see “JUNO takes aim at neutrino-mass hierarchy”). JUNO will determine the neutrino mass ordering by observing the fine oscillation patterns of antineutrinos produced in nuclear reactors.

Neutrino scattering

Beyond oscillations, a major theme of the conference was neutrino scattering. Although neutrinos are the most abundant massive particles in the universe, their interactions with matter remain poorly understood. Measuring and modelling these processes is essential: they probe nuclear structure and hadronic physics in a novel way, while also providing the foundation for oscillation analyses in current and next-generation experiments. Exciting advances were reported across the field. The SBND experiment at Fermilab announced the collection of around three million neutrino interactions using the Booster Neutrino Beam. ICARUS presented its first neutrino–argon cross-section measurement. MicroBooNE, MINERvA and T2K showcased new results on neutrino–nucleus interactions and compared them with theoretical models. The e4ν collaboration highlighted electron beams as potential sources of data to refine neutrino-scattering models, supporting efforts to achieve the detailed interaction picture needed for the coming precision era of oscillation physics. At higher energies, FASER and SND@LHC showcased their LHC neutrino observations with both emulsion and electronic detectors.

Neutrino physics is one of the most vibrant and global areas of particle physics today

CERN’s role in neutrino physics was on display throughout the conference. Beyond the results from ICARUS, FASER and SND@LHC, other contributions included the first observation of neutrinos in the ProtoDUNE detectors, the status of the MUonE experiment – aimed at measuring the hadronic contribution to the muon anomalous magnetic moment – and the latest results from NA61. The role of CERN’s Neutrino Platform was also highlighted in contributions about the T2K ND280 near-detector upgrade and the WAGASCI–BabyMIND detector, both of which were largely assembled and tested at CERN. Discussions featured the results of the Water Cherenkov Test Experiment, which operated in the T9 beamline to prototype technology for Hyper-Kamiokande, and other novel CERN-based ideas, such as nuSCOPE – a proposal for a short-baseline experiment that would “tag” individual neutrinos at production, formed from the merging of ENUBET and NuTag. Building on a proof-of-principle result from NA62, which identified a neutrino candidate via its parent kaon decay, this technique could represent a paradigm shift in neutrino beam characterisation.

NuFact 2025 reinforced the importance of diversity and inclusion in science. The IDEEO working group led discussions on how varied perspectives and equitable participation strengthen collaboration, improve problem solving and attract the next generation of researchers. Dedicated sessions on education and outreach also highlighted innovative efforts to engage wider communities and ensure that the future of neutrino physics is both scientifically robust and socially inclusive. From precision oscillation measurements to ambitious new proposals, NuFact 2025 demonstrated that neutrino physics is one of the most vibrant and global areas of particle physics today.

Mainz muses on future of kaon physics

The 13th KAONS conference convened almost 100 physicists in Mainz from 8 to 12 September. Since the first edition took place in Vancouver in 1988, the conference series has returned roughly every three years to bring together the global kaon-physics community. This edition was particularly significant, being the first since the decision not to continue CERN’s kaon programme with the proposed HIKE experiment (CERN Courier May/June 2024 p7).

CERN’s current NA62 effort was nevertheless present in force. Eight presentations spanned its wide-ranging programme, from precision studies of rare kaon decays to searches for lepton-flavour and lepton-number violation, and explorations beyond the Standard Model (SM). Complementary perspectives came from Japan’s KOTO experiment at J-PARC, from multipurpose facilities such as KLOE-2, Belle II and CERN’s LHCb experiment, as well as from a large and engaged theoretical community. Together, these contributions underscored the vitality of kaon physics: a field that continues to test the SM at the highest levels of precision, with a strong potential to uncover new physics.

NA62 reported a major success on the so-called “golden mode” ultra-rare decay K+ → π+νν̄, a process that is highly sensitive to new physics (CERN Courier July/August 2024 p30). NA62 has already delivered remarkable progress in this domain: by analysing data up to 2022, the collaboration more than doubled its sample from 20 to 51 candidate events, achieving the first 5σ observation of the decay (CERN Courier November/December 2024 p11). This is the smallest branching fraction ever measured and, intriguingly, it shows a mild 1.7σ tension with the Standard Model prediction, which itself is known with a 2% theoretical uncertainty. With the experiment continuing to collect data until CERN’s next long shutdown (LS3), NA62’s final dataset is expected to triple the current statistics, sharpening what is already one of the most stringent tests of the SM.

Another major theme was the study of rare B-meson decays in which kaons often appear in the final state, for example B → K*(→ Kπ)ℓ+ℓ−. Such processes are central to the long-debated “B anomalies,” in which certain branching fractions of rare semileptonic B decays show persistent tensions between experimental results and SM predictions (CERN Courier January/February 2025 p14). On the experimental front, CERN’s LHCb experiment continues to lead the field, delivering branching-fraction measurements with unprecedented precision. Progress is also being made on the theoretical side, though significant challenges remain in matching this precision. The conference highlighted new approaches to reducing uncertainties and biases, based both on phenomenological techniques and on lattice QCD.

Kaon physics is in a particularly dynamic phase. Theoretical predictions are reaching unprecedented precision, and two dedicated experiments are pushing the frontiers of rare kaon decays. At CERN, NA62 continues to deliver impactful results, even though plans for a next-stage European successor did not advance this year. Momentum is building in Japan, where the proposed KOTO-II upgrade, if approved, would secure the long-term future of the programme. Just after the conference, the KOTO-II collaboration held its first in-person meeting, bringing together members from both KOTO and NA62 – a promising sign for continued cross-fertilisation. Looking ahead, sustaining two complementary experimental efforts remains highly desirable: the independent cross-checks and diversified systematics they provide will be essential to fully exploit the discovery potential of rare kaon decays.

ICFA meets in Madison

Once a year, the International Committee for Future Accelerators (ICFA) assembles for an in-person meeting, typically attached to a major summer conference. The 99th edition took place on 24 August at the Wisconsin IceCube Particle Astrophysics Center in downtown Madison, one day before Lepton–Photon 2025.

While the ICFA is neither a decision-making body nor a representation of funding agencies, its mandate assigns to the committee the important task of promoting international collaboration and coordination in all phases of the construction and exploitation of very-high-energy accelerators. This role is especially relevant in today’s context of strategic planning and upcoming decisions – with the ongoing European Strategy update, the Chinese decision process on CEPC in full swing, and the new perspectives emerging on the US side with the recent National Academy of Sciences report (CERN Courier September/October 2025 p10).

Consequently, the ICFA heard presentations on these important topics and discussed priorities and timelines. In addition, the theme of “physics beyond colliders” – and with it, the question of maintaining scientific diversity in an era of potentially vast and costly flagship projects – featured prominently. In this context, the importance of national laboratories capable of carrying out mid-sized particle-physics experiments was underlined. This also featured in the usual ICFA regional reports.

An important part of the work of the committee is carried out by the ICFA panels – groups of experts in specific fields of high relevance. The ICFA heard reports from the various panel chairs at the Wisconsin meeting, with a focus on the Instrumentation, Innovation and Development panel, where Stefan Söldner-Rembold (Imperial College London) recently took over as chair, succeeding the late Ian Shipsey. Among other things, the panel organises several schools and training events, such as the EDIT schools, as well as prizes that increase recognition for senior and early-career researchers working in the field of instrumentation.

Maintaining scientific diversity in an era of potentially vast and costly flagship projects featured prominently

Another focus was the recent work of the Data Lifecycle panel chaired by Kati Lassila-Perini (University of Helsinki). This panel, together with numerous expert stakeholders in the field, recently published recommendations for best practices for data preservation and open science in HEP, advocating the application of the FAIR principles of findability, accessibility, interoperability and reusability at all levels of particle-physics research. The document provides guidance for researchers, experimental collaborations and organisations on implementing best-practice routines. It will now be distributed as broadly as possible and will hopefully contribute to the establishment of open and FAIR science practices.

Formally, the ICFA is a working group of the International Union for Pure and Applied Physics (IUPAP) and is linked to Commission C11, Particles and Fields. IUPAP has recently begun a “rejuvenation” effort that also involves rethinking the role of its working groups. Reflecting the continuity and importance of the ICFA’s work, Marcelo Gameiro Munhoz, chair of C11, presented a proposal to transform the ICFA into a standing committee under C11 – a new type of entity within IUPAP. This would allow ICFA to overcome its transient nature as a working group.

Finally, there were discussions on plans for a new set of ICFA seminars – triennial events in different world regions that assemble up to 250 leaders in the field. Following the 13th ICFA Seminar on Future Perspectives in High-Energy Physics, hosted by DESY in Hamburg in late 2023, the baton has now passed to Japan, which is finalising the location and date for the next edition, scheduled for late 2026.

Invisibles, in sight

Around 150 researchers gathered at CERN from 1 to 5 September to discuss the origin of the observed matter–antimatter asymmetry in the universe, the source of its accelerated expansion, the nature of dark matter and the mechanism behind neutrino masses. The vibrant atmosphere of the annual meeting of the Invisibles research network encouraged lively discussions, particularly among early-career researchers.

Marzia Bordone (University of Zurich) highlighted central questions in flavour physics, such as the tensions in the determinations of quark flavour-mixing parameters and the anomalies in leptonic and semileptonic B-meson decays (CERN Courier January/February 2025 p14). She showed that new bosons beyond the Standard Model that primarily interact with the heaviest quarks are theoretically well motivated and could be responsible for these flavour anomalies. Bordone emphasised that collaboration between experiment and theory, as well as data from future colliders like FCC-ee, will be essential to understand whether these effects are genuine signs of new physics.

Lina Necib (MIT) shared impressive new results on the distribution of galactic dark matter. Though invisible, dark matter interacts gravitationally and is present in all galaxies across the universe. Her team used exquisite data from the ESA Gaia satellite to track stellar trajectories in the Milky Way and determine the local dark-matter distribution to within 20–30% precision – which means about 300,000 dark-matter particles per cubic metre assuming they have mass similar to that of the proton. This is a huge improvement over what could be done just one decade ago, and will aid experiments in their direct search for dark matter in laboratories worldwide.
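The quoted number density follows from a simple back-of-the-envelope conversion, assuming a proton-like particle mass (as stated above) and a typical local dark-matter density of about 0.3 GeV per cubic centimetre (a commonly used value, not quoted in the talk):

```latex
n \;\approx\; \frac{\rho_{\mathrm{DM}}}{m_p}
\;\approx\; \frac{0.3\,\mathrm{GeV\,cm^{-3}}}{0.94\,\mathrm{GeV}}
\;\approx\; 0.3\,\mathrm{cm^{-3}}
\;\approx\; 3\times 10^{5}\,\mathrm{m^{-3}}.
```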

The most quoted dark-matter candidates at Invisibles25 were probably axions: particles once postulated to explain why the strong interactions that bind protons and neutrons behave in the same way for particles and antiparticles. Nicole Righi (King’s College London) discussed how these particles are ubiquitous in string theory. According to Righi, their detection may imply a hot Big Bang, with a rather late thermal stage, or hint at some special feature of the geometry of ultracompact dimensions related to quantum gravity.

The most intriguing talk was perhaps the CERN colloquium given by the 2011 Nobel laureate Adam Riess (Johns Hopkins University). By setting up an impressive system of distance measurements to extragalactic systems, Riess and his team have measured the expansion rate of the universe – the Hubble constant – with per cent accuracy. Their results indicate a value about 10% higher than that inferred from the cosmic microwave background within the standard ΛCDM model, a discrepancy known as the “Hubble tension”. After more than a decade of scrutiny, no single systematic error appears sufficient to account for it, and theoretical explanations remain tightly constrained (CERN Courier March/April 2025 p28). In this regard, Julien Lesgourgues (RWTH Aachen University) pointed out that, despite the thousands of papers written on the Hubble tension, there is no compelling extension of ΛCDM that could truly accommodate it.

While 95% of the universe’s energy density is invisible, the community studying it is very real. Invisibles now has a long history and is based on three innovative training networks funded by the European Union, as well as two Marie Curie exchange networks. The network includes more than 100 researchers and 50 PhD students spread across key beneficiaries in Europe, as well as America, Asia and Africa – CERN being one of their long-term partners. The energy and enthusiasm of the participants at this conference were palpable, as nature continues to offer deep mysteries that the Invisibles community strives to unravel.

Higgs hunters revel in Run 3 data

The 15th Higgs Hunting workshop took place from 15 to 17 July at IJCLab in Orsay and LPNHE in Paris, offering about 100 participants an opportunity to step back and review the most recent LHC Run 2 and 3 Higgs-boson results, together with some of the latest theoretical developments.

One of the highlights concerned the Higgs boson’s coupling to the charm quark, with the CMS collaboration presenting a new search using Higgs production in association with a top–antitop pair. The analysis, targeting Higgs decays into charm–quark pairs, reached a sensitivity comparable to the best existing direct constraints on this elusive interaction. New ATLAS analyses showcased the impact of the large Run 3 dataset, hinting at great potential for Higgs physics in the years to come – for example, Run 3 data has reduced the uncertainties on the coupling of the Higgs boson to muons and Zγ by 30% and 38%, respectively. On the di-Higgs front, the expected upper limit on the signal-strength modifier, measured in the bbγγ final state only, now surpasses in sensitivity the combination of all Run 2 HH channels (see “A step towards the Higgs self-coupling”). The sensitivity to di-Higgs production is expected to improve significantly during Run 3, raising hopes of seeing a signal before the next long shutdown, from mid-2026 to the end of 2029.

Juan Rojo (Vrije Universiteit Amsterdam) discussed parton distribution functions for Higgs processes at the LHC, while Thomas Gehrmann (University of Zurich) reviewed recent developments in general Higgs theory. Mathieu Pellen (University of Freiburg) provided a review of vector-boson fusion, Jose Santiago Perez (University of Granada) summarised the effective field theory framework and Oleksii Matsedonskyi (University of Cambridge) reviewed progress on electroweak phase transitions. In his “vision” talk, Alfredo Urbano (INFN Rome) discussed the interplay between Higgs physics and early-universe cosmology. Finally, Benjamin Fuks (LPTHE, Sorbonne University) presented a toponium model, bringing the elusive romance of top–quark pairs back into the spotlight (CERN Courier September/October 2025 p9).

After a cruise on the Seine in the light of the Olympic Cauldron, participants were propelled toward the future during the European Strategy for Particle Physics session. The ESPPU secretary Karl Jakobs (University of Freiburg) and various session speakers set the stage for spirited and vigorous discussions of the options before the community – in particular, the scenarios to pursue should the FCC programme, the clear plan A, not be realised. The next Higgs Hunting workshop will be held in Orsay and Paris from 16 to 18 September 2026.

All aboard the scalar adventure

Since the discovery of the Higgs boson in 2012, the ATLAS and CMS collaborations have made significant progress in scrutinising its properties and interactions. So far, measurements are compatible with an elementary Higgs boson, originating from the minimal scalar sector required by the Standard Model. However, current experimental precision leaves ample room for this picture to change. In particular, the full potential of the LHC and its high-luminosity upgrade to search for a richer scalar sector beyond the Standard Model (BSM) is only beginning to be tapped.

The first Workshop on the Impact of Higgs Studies on New Theories of Fundamental Interactions, which took place on the Island of Capri, Italy, from 6 to 10 October 2025, gathered around 40 experimentalists and theorists to explore the pivotal role of the Higgs boson in exploring BSM physics. Participants discussed the implications of extended scalar sectors and the latest ATLAS and CMS searches, including current potential anomalies in LHC data.

“The Higgs boson has moved from the realm of being just a new particle to becoming a tool for searches for BSM particles,” said Greg Landsberg (Brown University) in an opening talk.

An extended scalar sector can address several mysteries in the SM. For example, it could serve as a mediator to a hidden sector that includes dark-matter particles, or play a role in generating the observed matter–antimatter asymmetry during an electroweak phase transition. Modified or extended Higgs sectors also arise in supersymmetric and other BSM models that address why the 125 GeV Higgs boson is so light compared to the Planck mass – despite quantum corrections that should drive it to much higher scales – and might shed light on the perplexing pattern of fermion masses and flavours.

One way to look for new physics in the scalar sector is to search for modifications of the decay rates, coupling strengths and CP properties of the Higgs boson. Another is to look for signs of additional neutral or charged scalar bosons, such as those predicted in longstanding two-Higgs-doublet or Higgs-triplet models. The workshop saw ATLAS and CMS researchers present their latest limits on extended Higgs sectors, which are based on an increasing number of model-independent or signature-based searches. While the data so far are consistent with the SM, a few mild excesses have attracted the attention of some theorists.

In diphoton final states, a slight excess of events persists in CMS data at a mass of 95 GeV. Hints of a small excess at a mass of 152 GeV are also present in ATLAS data, while a previously reported excess at 650 GeV has faded after full examination of Run 2 data. Workshop participants also heard suggestions that the Brout–Englert–Higgs potential could allow for a second resonance at 690 GeV.

The High-Luminosity LHC will enable us to explore the scalar sector in detail

“We haven’t seen concrete evidence for extended Higgs sectors, but intriguing features appear in various mass scales,” said CMS collaborator Sezen Sekmen (Kyungpook National University). “Run 3 ATLAS and CMS searches are in full swing, with improved triggering, object reconstruction and analysis techniques.”

Di-Higgs production, the rate of which depends on the strength of the Higgs boson’s self-coupling, offers a direct probe of the shape of the Brout–Englert–Higgs potential and is a key target of the LHC Higgs programme. Multiple SM extensions predict measurable effects on the di-Higgs production rate. In addition to non-resonant searches in di-Higgs production, ATLAS and CMS are pursuing a number of searches for BSM resonances decaying into a pair of Higgs bosons, which were shown during the workshop.

Rich exchanges between experimentalists and theorists in an informal setting gave rise to several new lines of attack for physicists to explore further. Moreover, the critical role of the High-Luminosity LHC to probe the scalar sector of the SM at the TeV scale was made clear.

“Much discussed during this workshop was the concern that people in the field are becoming demotivated by the lack of discoveries at the LHC since the Higgs, and that we have to wait for a future collider to make the next advance,” says organiser Andreas Crivellin (University of Zurich). “Nothing could be further from the truth: the scalar sector is not only the least explored of the SM and the one with the greatest potential to conceal new phenomena, but one that the High-Luminosity LHC will enable us to explore in detail.”

Subtleties of quantum fields

Quantum field theory unites quantum physics with special relativity. It is the framework of the Standard Model (SM), which describes the electromagnetic, weak and strong interactions as gauge forces, mediated by photons, gluons and W and Z bosons, plus additional interactions mediated by the Higgs field. The success of the SM has exceeded all expectations, and its mathematical structure has led to a number of impressive predictions. These include the existence of the charm quark, discovered in 1974, and the existence of the Higgs boson, discovered in 2012.

Uncovering Quantum Field Theory and the Standard Model by Wolfgang Bietenholz of the National Autonomous University of Mexico and Uwe-Jens Wiese of the University of Bern explains the foundations of quantum field theory in great depth, from classical field theory and canonical quantisation to regularisation and renormalisation, via path integrals and the renormalisation group. What really makes the book special are its frequently drawn connections to statistical mechanics and condensed-matter physics.

Riding a wave

The section on particles and “wavicles” is highly original. In quantum field theory, quantised excitations of fields cannot be interpreted as point-like particles. Unlike massive particles in non-relativistic quantum mechanics, these excitations have non-trivial localisation properties, which apply to photons and electrons alike. To emphasise the difference between non-relativistic particles and wave excitations in a relativistic theory, one may refer to them as “wavicles”, following Frank Wilczek. As discussed in chapter 3, an intuitive understanding of wavicles can be gained by the analogy to phonons in a crystal. Another remarkable feature of charged fields is the infinite extension of their excitations due to their Coulomb field. This means that any charged state necessarily includes an infrared cloud of soft gauge bosons. As a result, they cannot be described by ordinary one-particle states and are referred to as “infraparticles”. Their properties, along with the related “superselection sectors,” are explained in the section on scalar quantum electrodynamics.

Uncovering Quantum Field Theory and the Standard Model

The SM can be characterised as a non-abelian chiral gauge theory. Bietenholz and Wiese explain the various aspects of chirality in great detail. Anomalies in global and local symmetries are carefully discussed in the continuum as well as on a space–time lattice, based on the Ginsparg–Wilson relation and Lüscher’s lattice chiral symmetry. Confinement of quarks and gluons, the hadron spectrum, the parton model and hard processes, chiral perturbation theory and deconfinement at high temperatures uncover perturbative and non-perturbative aspects of quantum chromodynamics (QCD), the theory of strong interactions. Numerical simulations of strongly coupled lattice Yang–Mills theories are very demanding. During the past four decades, much progress has been made in turning lattice QCD into a quantitatively reliable tool by controlling statistical and systematic uncertainties, which is clearly explained to the critical reader. The treatment of QCD is supplemented by an introduction to the electroweak theory covering the Higgs mechanism, electroweak symmetry breaking and flavour physics of quarks and leptons.

The number of quark colours, which is three in nature, plays a prominent role in this book. At the quantum level, gauge symmetries can fail due to anomalies, rendering a theory inconsistent. The SM is free of anomalies, but this only works because of a delicate interplay between quark and lepton charges and the number of colours. An important example of this interplay is the decay of the neutral pion into two photons. The subtleties of this process are explained in chapter 24.

The number of quark colours, which is three in nature, plays a prominent role in this book

Most remarkably, the SM predicts baryon-number-violating processes. These arise from the vacuum structure of the weak SU(2) gauge fields, which involves topologically distinct field configurations. Quantum tunnelling between them, together with the anomaly in the baryon-number current, leads to baryon-number-violating transitions, as discussed in chapter 26. Similarly, in QCD a non-trivial topology of the gluon field leads to an explicit breaking of the flavour-singlet axial symmetry and, subsequently, to the mass of the η′ meson. Moreover, the gauge-field topology gives rise to an additional parameter in QCD, the vacuum angle θ. Because this parameter induces an electric dipole moment of the neutron, which is subject to a stringent experimental upper bound, we are confronted with the strong-CP problem: what constrains θ to be so tiny that the experimental upper bound on the neutron dipole moment is satisfied? A solution may be provided by the Peccei–Quinn symmetry and axions, as discussed in a dedicated chapter.

By analogy with the QCD vacuum angle, one can introduce a CP-violating electromagnetic parameter θ into the SM – even though it has no physical effect in pure QED. This brings us to a gem of the book: its discussion of the Witten effect. In the presence of such a θ, the electric charge of a magnetic monopole becomes θ/2π plus an integer. This leads to the remarkable conclusion that for non-zero θ, all monopoles become dyons, carrying both electric and magnetic charge.
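The Witten effect described above can be written compactly; the following is a standard textbook form (not spelled out in the book review itself), with n an integer labelling the monopole's charge state:

```latex
q \;=\; e\left(n + \frac{\theta}{2\pi}\right), \qquad n \in \mathbb{Z},
```

so that for any θ not equal to a multiple of 2π, a monopole necessarily carries a non-zero electric charge: it is a dyon.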

The SM is an effective low-energy theory and we do not know at what energy scale elements of a more fundamental theory will become visible. Its gauge structure and quark and lepton content hint at a possible unification of the interactions into a larger gauge group, which is discussed in the final chapter. Once gravity is included, one is confronted with a hierarchy problem: the question of why the electroweak scale is so small compared to the Planck mass, at which the Compton wavelength of a particle and its Schwarzschild radius coincide. Hence, at Planck energies quantum gravitational effects cannot be ignored. Perhaps solving the electroweak hierarchy puzzle requires working with supersymmetric theories. For all students and scientists struggling with the SM and exploring possible extensions, the nine appendices will be a very valuable source of information for their research.

Einstein’s entanglement

Quantum entanglement is the quantum phenomenon par excellence. Our world is a quantum world: the matter that we see and touch is the most obvious consequence of quantum physics, and it would not exist as it does in a purely classical world. In modern parlance, however, when we speak of quantum sensors or quantum computing, what makes these devices “quantum” is their use of entanglement. Entanglement was first discussed by Einstein and Schrödinger, and became famous with the celebrated Einstein–Podolsky–Rosen (EPR) paper of 1935.

The magic of entanglement

In an entangled system, some properties must be assigned to the system as a whole rather than to the individual particles. When a neutral pion decays into two photons, for example, conservation of angular momentum requires their total spin to be zero. Since the photons travel in opposite directions in the pion’s rest frame, for their spins to cancel they must share the same helicity – the spin projection along the direction of motion, which can be either left- or right-handed. If one photon is measured to be left-handed, the other must be left-handed as well. The entangled photons must be thought of as a single quantum object: the individual particles do not have predefined spins, nor does the measurement performed on one cause the other to pick a spin orientation. Experiments on more complicated systems have ruled these possibilities out, at least in their simplest incarnations, and this is exactly where the magic of entanglement begins.
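Schematically (my notation, not the book’s), the two-photon state is a superposition of the two same-helicity possibilities, with the relative sign fixed by the pion’s negative parity:

```latex
% Entangled helicity state of the two photons from pi0 -> gamma gamma
|\psi\rangle \;=\; \frac{1}{\sqrt{2}}\Big( |L\rangle_1 |L\rangle_2 \;-\; |R\rangle_1 |R\rangle_2 \Big).
```

Neither photon has a definite helicity on its own; only the correlation between the two measurement outcomes is fixed.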

Quantum entanglement is the main topic of Einstein’s Entanglement by William Stuckey, Michael Silberstein and Timothy McDevitt, all currently teaching at Elizabethtown College, Pennsylvania. The trio have complementary expertise in physics, philosophy and maths, and this is not their first book on the foundations of physics. They aim to explain why entanglement is so puzzling to physicists, and the various approaches that have been employed over the years to explain (or even explain away) the phenomenon. They also want to introduce readers to their own idea for solving the riddle and to argue for its merits.

Why is entanglement so puzzling to physicists, and what has been employed to explain the phenomenon?

General readers may struggle in places. The book does have accessible chapters – for example one at the start built around a quantum-gloves thought experiment, a nice way to introduce the reader to the problem, as well as a chapter on special relativity. Much of the discussion of quantum mechanics, however, uses concepts such as Hilbert space and the Bloch sphere, which belong in an undergraduate course on quantum mechanics. Philosophical terminology, such as “wave-function realism”, is also used copiously. The explanations are of good quality, and a reader interested in the interpretations of quantum mechanics with some background in physics has much to gain. The authors quote liberally from a superb list of references and include many interesting historical facts that make the book very entertaining to read.

In general, the book criticises constructive approaches to interpreting quantum mechanics that explicitly postulate physical phenomena. In the example of neutral-pion decay that I gave previously, the case in which the measurement of one photon causes the other photon to pick a spin would require a constructive explanation. These can be contrasted with principle explanations, which may involve, for example, invoking an overarching symmetry. To quote an example that is used many times in the book, the relativity principle can be used to explain Lorentz length contraction without the need for a physical mechanism to contract the bodies, which would require a constructive explanation.

The authors claim that the conceptual issues with entanglement can be resolved by sticking to principle explanations and, in particular, by demanding that Planck’s constant be measured to be the same in all inertial reference frames. Whether this simple suggestion is adequate to explain the mysteries of quantum mechanics, I leave to the reader. Seneca wrote in his Natural Questions that “our descendants will be astonished at our ignorance of what to them is obvious”. If the authors are correct, entanglement may prove to be a case in point.

John Peoples 1933–2025


John Peoples, the third director of Fermilab, who guided the lab through one of the most critical periods in its history, passed away on 25 June 2025. Born in New York City on 22 January 1933, John received his bachelor’s degree in electrical engineering from the Carnegie Institute of Technology (now Carnegie Mellon University) in 1955. After several years at the Glen L. Martin Company, John entered Columbia University, where he received his PhD in physics in 1966 for the measurement of the Michel parameter in muon decay under the direction of Allan Sachs. This was followed by a teaching and research position at Cornell University and a move to Fermilab, initially on sabbatical, in 1971.

John officially joined the Fermilab staff in 1975 as head of the Research Division. His tenure included the discovery of the upsilon particle, a bound state of a bottom quark and its antiquark, by Leon Lederman’s team in 1977. He was also responsible for upgrading the experimental areas to accept beams of up to 1 TeV in anticipation of the completion of the Tevatron.

In 1981, following Lederman’s decision to utilise the Tevatron as a proton–antiproton collider, John was appointed head of the TeV-I Project, with responsibility for the construction of the Antiproton Source and the collision hall for the CDF detector. Under John’s leadership, a novel design was developed, building on CERN’s earlier pioneering work on antiproton accumulation based on stochastic cooling, and proton–antiproton collisions were achieved in the Tevatron four years later, in 1985.

Tireless commitment

John succeeded Lederman as Fermilab’s third director in July 1989, shortly after the decision to locate the Superconducting Super Collider (SSC) in Waxahachie, Texas, which created immense challenges for Fermilab’s future. John guided the US community to a plan for a new accelerator, the Main Injector (and ultimately the Recycler), that could support a high-luminosity collider programme during the decade of SSC construction while simultaneously providing high-intensity extracted beams for a future neutrino programme that could sustain Fermilab well beyond the SSC’s startup. The cancellation of the SSC in 1993 was a seismic event for US and global high-energy physics, and it ensured the Tevatron’s role as the world’s highest-energy collider for almost two decades. John was asked to lead the termination phase of the SSC laboratory. In 1994–1995, as director of both Fermilab and the SSC, he carried out this painful task with a special emphasis on helping the many suddenly unemployed people find new career paths.

During John’s tenure as director, Fermilab produced many important physics results. In 1995 the Tevatron collider experiments, CDF and DØ, announced the discovery of the top quark, the final quark predicted by the Standard Model of particle physics, with a mass more than 175 times that of the proton. To ensure that the experiments could analyse their data quickly and efficiently, John supported replacing costly mainframe computers with “clusters” of inexpensive microprocessors developed in industry for personal computers and, later, laptops and phones. The final fixed-target run with 800 GeV extracted beam in 1997 and 1998 helped resolve an important and long-standing problem in CP violation in kaon decays and led to the discovery of the tau neutrino.

His leadership both enhanced international collaboration and retained a prominent role for Fermilab in collider physics

From 1993 to 1997, John served as chair of the International Committee for Future Accelerators (ICFA). He stepped down after two terms as Fermilab director in 1999. In 2010 he received the Robert R. Wilson Prize for Achievement in the Physics of Particle Acceleration from the American Physical Society.

Under John’s influence, there were frequent personnel exchanges between Fermilab and CERN throughout the 1980s, as Fermilab staff benefited from CERN’s experience with antiproton production and CERN benefited from Fermilab’s experience with the operations of a superconducting accelerator. These exchanges extended into the 1990s, and following the termination of the SSC, John was instrumental in securing support for US participation in the LHC accelerator and detector projects. His leadership both enhanced international collaboration and retained a prominent role for Fermilab in collider physics after the Tevatron completed operations in 2011.

During the 1980s, astrophysics became an important contributor to our knowledge of particle physics and required more ambitious experiments with strong synergies with the latest round of HEP experiments. In 1991, John formed the Experimental Astrophysics Group at Fermilab. This led to its strong participation in the Sloan Digital Sky Survey (SDSS), the Pierre Auger Cosmic Ray Observatory, the Cryogenic Dark Matter Search (CDMS) and the Dark Energy Survey (DES), of which John became director in 2003. John’s vision of a vibrant community of particle physicists, astrophysicists and cosmologists exploring the inner space-outer space connection is now reality.

Those of us who had the privilege of knowing and working with John were challenged by his intense work ethic and by the equally intense flood of new ideas for running and improving our programmes. He was a gifted and dedicated experimental physicist, skilled in accelerator science, an expert in superconducting magnet design and technology, a superb manager, and a great recruiter and mentor of young engineers and scientists, including the authors of this article. We will miss him!
