Slovenia, Ireland and Chile tighten ties with CERN

Slovenia became CERN’s 25th Member State on 21 June, formalising a relationship of over 30 years. Full membership confers voting rights in the CERN Council and opportunities for Slovenian enterprises and citizens.

“Slovenia’s full membership in CERN is an exceptional recognition of our science and researchers,” said Igor Papič, Slovenia’s Minister of Higher Education, Science and Innovation. “Furthermore, it reaffirms and strengthens Slovenia’s reputation as a nation building its future on knowledge and science. Indeed, apart from its beautiful natural landscapes, knowledge is the only true natural wealth of our country. For this reason, we have allocated record financial resources to science, research and innovation. Moreover, we have enshrined the obligation to increase these funds annually in the Scientific Research and Innovation Activities Act.”

“On behalf of the CERN Council, I warmly welcome Slovenia as the newest Member State of CERN,” said Costas Fountas, president of the CERN Council. “Slovenia has a longstanding relationship with CERN, with continuous involvement of the Slovenian science community over many decades in the ATLAS experiment in particular.”

On 8 and 16 May, respectively, Ireland and Chile signed agreements to become Associate Member States of CERN, pending the completion of national ratification processes. They join Türkiye, Pakistan, Cyprus, Ukraine, India, Lithuania, Croatia, Latvia and Brazil as Associate Members – a status introduced by the CERN Council in 2010. In this period, the Organization has also concluded international cooperation agreements with Qatar, Sri Lanka, Nepal, Kazakhstan, the Philippines, Thailand, Paraguay, Bosnia and Herzegovina, Honduras, Bahrain and Uruguay.

Advances in very-high-energy astrophysics

Advances in Very High Energy Astrophysics: The Science Program of the Third Generation IACTs for Exploring Cosmic Gamma Rays

Imaging atmospheric Cherenkov telescopes (IACTs) are designed to detect very-high-energy gamma rays, enabling the study of a range of both galactic and extragalactic gamma-ray sources. By capturing Cherenkov light from gamma-ray-induced air showers, IACTs help trace the origins of cosmic rays and probe fundamental physics, including questions surrounding dark matter and Lorentz invariance. Since the first gamma-ray source detection by the Whipple telescope in 1989, the field has rapidly advanced through instruments like HESS, MAGIC and VERITAS. Building on these successes, the Cherenkov Telescope Array Observatory (CTAO) represents the next generation of IACTs, with greatly improved sensitivity and energy coverage. The northern CTAO site on La Palma is already collecting data, and major infrastructure development is now underway at the southern site in Chile, where telescope construction is set to begin soon.

Considering the imminent start of CTAO telescope construction, Advances in Very High Energy Astrophysics, edited by Reshmi Mukherjee of Barnard College and Roberta Zanin of the University of Barcelona, is very timely. World-leading experts tackle the almost impossible task of summarising the progress made by the third-generation IACTs: HESS, MAGIC and VERITAS.

The range of topics covered is vast, spanning the last 20 years of progress in IACT instrumentation, data-analysis techniques, all aspects of high-energy astrophysics, cosmic-ray astrophysics and gamma-ray cosmology. The authors are necessarily selective, so the depth in each area is limited, but I believe that the essential concepts are properly introduced and the most important highlights captured. The primary focus of the book lies in gamma-ray astronomy and high-energy physics, cosmic rays and ongoing research into dark matter.

It appears, however, that the individual chapters were all written independently of each other by different authors, leading to some duplication. Source classes and high-energy radiation mechanisms are introduced multiple times, sometimes with different terminology and notation in different chapters, which could confuse novices in the field. Though internal coordination could have been improved, a positive aspect of this independence is that each chapter is self-contained and can be read on its own. I recommend the book to emerging researchers looking for a broad overview of this rapidly evolving field.

Hadrons in Porto Alegre

The 16th International Workshop on Hadron Physics (Hadrons 2025) welcomed 135 physicists to the Federal University of Rio Grande do Sul (UFRGS) in Porto Alegre, Brazil. Delayed by four months by a tragic flood that devastated the city, the triennial conference took place from 10 to 14 March, maintaining despite adversity its long tradition as a forum for collaboration among Brazilian and international researchers at different stages of their careers.

The workshop’s scientific programme included field theoretical approaches to QCD, the behaviour of hadronic and quark matter in astrophysical contexts, hadronic structure and decays, lattice QCD calculations, recent experimental developments in relativistic heavy-ion collisions, and the interplay of strong and electroweak forces within the Standard Model.

Fernanda Steffens (University of Bonn) explained how deep-inelastic-scattering experiments and theoretical developments are revealing the internal structure of the proton. Kenji Fukushima (University of Tokyo) addressed the theoretical framework and phase structure of strongly interacting matter, with particular emphasis on the QCD phase diagram and its relevance to heavy-ion collisions and neutron stars. Chun Shen (Wayne State University) presented a comprehensive overview of the state-of-the-art techniques used to extract the transport properties of quark–gluon plasma from heavy-ion collision data, emphasising the role of Bayesian inference and machine learning in constraining theoretical models. Li-Sheng Geng (Beihang University) explored exotic hadrons through the lens of hadronic molecules, highlighting symmetry multiplets such as pentaquarks, the formation of multi-hadron states and the role of femtoscopy in studying unstable particle interactions.

This edition of Hadrons was dedicated to the memory of two individuals who left a profound mark on the Brazilian hadronic-physics community: Yogiro Hama, a distinguished senior researcher and educator whose decades-long contributions were foundational to the development of the field in Brazil, and Kau Marquez, an early-career physicist whose passion for science remained steadfast despite her courageous battle with spinal muscular atrophy. Both were remembered with deep admiration and respect, not only for their scientific dedication but also for their personal strength and impact on the community.

Its mission is to cultivate a vibrant and inclusive scientific environment

Since its creation in 1988, the Hadrons workshop has played a central role in developing Brazil’s scientific capacity in particle and nuclear physics. Its structure facilitates close interaction between master’s and doctoral students and senior researchers, enhancing both technical training and academic exchange. This model continues to strengthen the foundations of research and collaboration throughout the Brazilian scientific community.

This is the main event for the Brazilian particle- and nuclear-physics communities, reflecting a commitment to advancing research in this highly interactive field. By circulating the venue across multiple regions of Brazil, each edition renews its mission to cultivate a vibrant and inclusive scientific environment. This edition was closed by a public lecture on QCD by Tereza Mendes (University of São Paulo), who engaged local students with the foundational questions of strong-interaction physics.

The next edition of the Hadrons series will take place in Bahia in 2028.

Muons under the microscope in Cincinnati

The 23rd edition of Flavor Physics and CP Violation (FPCP) attracted 100 physicists to Cincinnati, USA, from 2 to 6 June 2025. The conference reviews recent experimental and theoretical developments in CP violation, rare decays, Cabibbo–Kobayashi–Maskawa matrix elements, heavy-quark decays, flavour phenomena in charged leptons and neutrinos, and the interplay between flavour physics and high-pT physics at the LHC.

The highlight of the conference was new results on the muon magnetic anomaly. The Muon g-2 experiment at Fermilab released its final measurement of aμ = (g−2)/2 on 3 June, while the conference was in progress, reaching a precision of 127 ppb. This uncertainty is more than four times smaller than that reported by the previous experiment. One week earlier, on 27 May, the Muon g-2 Theory Initiative published its second calculation of the same quantity, following that published in summer 2020. A major difference between the two calculations is that the earlier one used experimental data and a dispersion integral to evaluate the hadronic contribution to aμ, whereas the update uses a purely theoretical approach based on lattice QCD. The earlier calculation’s strong tension with experiment is no longer present: the new calculation is compatible with the experimental result. Thus, no new-physics discovery can be claimed, though the reason for the difference between the two approaches must be understood (see “Fermilab’s final word on muon g-2”).

The MEG II collaboration presented an important update to their limit on the branching fraction for the lepton-flavour-violating decay μ → eγ. Their new upper bound of 1.5 × 10⁻¹³ is determined from data collected in 2021 and 2022. The experiment recorded additional data from 2023 to 2024 and expects to continue data taking for two more years. These data will be sensitive to a branching fraction four to five times smaller than the current limit.

LHCb, Belle II, BESIII and NA62 all discussed recent results in quark flavour physics. Highlights include the first measurement of CP violation in a baryon decay by LHCb and improved limits on CP violation in D-meson decay to two pions by Belle II. With more data, the latter measurements could potentially show that the observed CP violation in charm is from a non-Standard-Model source. 

The Belle II collaboration now plans to collect a sample of 5 to 10 ab⁻¹ by the early 2030s before undergoing an upgrade to collect a 30 to 50 ab⁻¹ sample by the early 2040s. LHCb plans to run to the end of the High-Luminosity LHC and collect 300 fb⁻¹. LHCb recorded almost 10 fb⁻¹ of data last year – more than in all their previous running, and now with a fully software-based trigger with much higher efficiency than the previous hardware-based first-level trigger. Future results from Belle II and the LHCb upgrade are eagerly anticipated.

The 24th FPCP conference will be held from 18 to 22 May 2026 in Bad Honnef, Germany. 

A new phase for the FCC

FCC Week 2025 gathered more than 600 participants from 34 countries in Vienna from 19 to 23 May. The meeting was the first following the submission of the FCC feasibility study to the European Strategy for Particle Physics (CERN Courier May/June 2025 p9). Comprising three volumes – covering physics and detectors, accelerators and infrastructure, and civil engineering and sustainability – the study represents the most comprehensive blueprint to date for a next-generation collider facility. The next phase will focus on preparing a robust implementation strategy, via technical design, cost assessment, environmental planning and global engagement.

CERN Director-General Fabiola Gianotti said the integral FCC programme offers unparalleled opportunities to explore physics at the shortest distances, and noted growing support and enthusiasm for the programme within the community. That enthusiasm is reflected in the growing collaboration: the FCC collaboration now includes 162 institutes from 38 countries, with 28 new Memoranda of Understanding signed in the past year. These include new partnerships in Latin America, Asia and Ukraine, as well as Statements of Intent from the US and Canada. The FCC vision has also gained visibility in high-level policy dialogues, including the Draghi report on European competitiveness. Scientific plenaries and parallel sessions highlighted updates on simulation tools, rare-process searches and strategies to probe beyond the Standard Model. Detector R&D has progressed significantly, with prototyping, software development and AI-driven simulations advancing rapidly.

In accelerator design, developments included updated lattice and optics concepts involving global “head-on” compensation (using opposing beam interactions) and local chromaticity corrections (to the dependence of beam optics on particle energy). Refinements were also presented to injection schemes, beam collimation and the mitigation of collective effects. A central tool in these efforts is the Xsuite simulation platform, whose capabilities now include spin tracking and modelling based on real collider environments such as SuperKEKB.

Technical innovations also came to the fore. The superconducting RF system for FCC-ee includes 400 MHz Nb/Cu cavities for low-energy operation and 800 MHz Nb cavities for higher-energy modes. The introduction of reverse-phase operation and new RF source concepts – such as the tristron, with energy efficiencies above 90% (CERN Courier May/June 2025 p30) – represent major design advances.

Design developments

Vacuum technologies based on ultrathin NEG coating and discrete photon stops, as well as industrialisation strategies for cost control, are under active development. For FCC-hh, high-field magnet R&D continues on both Nb3Sn prototypes and high-temperature superconductors.

Sessions on technical infrastructure explored everything from grid design, cryogenics and RF power to heat recovery, robotics and safety systems. Sustainability concepts, including renewable energy integration and hydrogen storage, showcased the project’s interdisciplinary scope and long-term environmental planning.

FCC Week 2025 extended well beyond the conference venue, turning Vienna into a vibrant hub for public science outreach

The Early Career Researchers forum drew nearly 100 participants for discussions on sustainability, governance and societal impact. The session culminated in a commitment to inclusive collaboration, echoed by the quote from Austrian-born artist, architect and environmentalist Friedensreich Hundertwasser (1928–2000): “Those who do not honour the past lose the future. Those who destroy their roots cannot grow.”

This spirit of openness and public connection also defined the week’s city-wide engagement. FCC Week 2025 extended well beyond the conference venue, turning Vienna into a vibrant hub for public science outreach. In particular, the “Big Science, Big Impact” session – co-organised with the Austrian Federal Economic Chamber (WKO) – highlighted CERN’s broader role in economic development. Daniel Pawel Zawarczynski (WKO) shared examples of small and medium enterprise growth and technology transfer, noting that CERN participation can open new markets, from tunnelling to aerospace. Economist Gabriel Felbermayr referred to a recent WIFO analysis indicating a benefit-to-cost ratio for the FCC greater than 1.2 under conservative assumptions. The FCC is not only a tool for discovery, observed Johannes Gutleber (CERN), but also a platform enabling technology development, open software innovation and workforce training.

The FCC awards celebrate the creativity, rigour and passion that early-career researchers bring to the programme. This year, Tsz Hong Kwok (University of Zürich) and Audrey Piccini (CERN) won poster prizes, Sara Aumiller (TU München) and Elaf Musa (DESY) received innovation awards, and Ivan Karpov (CERN) and Nicolas Vallis (PSI) were honoured with paper prizes sponsored by Physical Review Accelerators and Beams. As CERN Council President Costas Fountas reminded participants, the FCC is not only about pushing the frontiers of knowledge, but also about enabling a new generation of ideas, collaborations and societal progress.

Mary K Gaillard 1939–2025

Mary K Gaillard, a key figure in the development of the Standard Model of particle physics, passed away on 23 May 2025. She was born in 1939 to a family of academics who encouraged her inquisitiveness and independence. She graduated in 1960 from Hollins College, a small college in Virginia, where her physics professor recognised her talent, helping her get jobs in the Ringuet laboratory at l’École Polytechnique during a junior year abroad and for two summers at the Brookhaven National Laboratory. In 1961 she obtained a master’s degree from Columbia University and in 1968 a doctorate in theoretical physics from the University of Paris at Orsay. Mary K was a research scientist with the French CNRS and a visiting scientist at CERN for most of the 1970s. From 1981 until she retired in 2009, she was a senior scientist at the Lawrence Berkeley National Laboratory and a professor of physics at the University of California at Berkeley, where she was the first woman in the department.

Mary K was a theoretical physicist of great power, gifted both with a deep physical intuition and a very high level of technical mastery. She used her gifts to great effect and made many important contributions to the development of the Standard Model of elementary particle physics that was established precisely during the course of her career. She pursued her love of physics with powerful determination, in the face of overt discrimination that went well beyond what may still exist today. She fought these battles and produced beautiful, important physics, all while raising three children as a devoted mother.

Undeniable impact

After obtaining her master’s degree at Columbia, Mary K accompanied her first husband, Jean-Marc Gaillard, to Paris, where she was rebuffed in many attempts to obtain a position in an experimental group. She next tried and failed, multiple times, to find an advisor in theoretical physics, which she actually preferred to experimental physics but had not pursued because it was regarded as an even more unlikely career for a woman. Eventually, and fortunately for the development of elementary particle physics, Bernard d’Espagnat agreed to supervise her doctoral research at the University of Paris. While she quickly succeeded in producing significant results in her research, respect and recognition were still slow to come. She suffered many slights from a culture that could not understand or countenance the possibility of a woman theoretical physicist and put many obstacles in her way. Respect and recognition did finally come in appropriate measure, however, by virtue of the undeniable impact of her work.

Her contributions to the field are numerous. During an intensely productive period in the mid-1970s, she completed a series of projects that established the framework for the decades to follow that would culminate in the Standard Model. Famously, during a one-year visit to Fermilab in 1973, using the known properties of the “strange” K mesons, she successfully predicted the mass scale of the fourth “charm” quark a few months prior to its discovery. Back at CERN a few years later, she also predicted, in the framework of grand unified theories, the mass of the fifth “bottom” quark – a successful though still speculative prediction. Other impactful work, extracting the experimental consequences of theoretical constructs, laid down the paths that were followed to experimentally validate the charm-quark discovery and to search for the Higgs boson required to complete the Standard Model. Another key contribution showed how “jets”, streams of particles created in high-energy accelerators, could be identified as manifestations of the “gluon” carriers of the strong force of the Standard Model.

In the 1980s in Berkeley, when the Superconducting Super Collider and the Large Hadron Collider were under discussion, she showed that they could successfully uncover the mechanism of electroweak symmetry breaking required to understand the Standard Model weak force, even if it was “dynamical” – an experimentally much more challenging possibility than breaking by a Higgs boson. For the remainder of her career, she focused principally on work to address issues that are still unresolved by the Standard Model. Much of this research involved “supersymmetry” and its extension to encompass the gravitational force, theoretical constructs that originated in the work of her second husband, the late Bruno Zumino, who also moved from CERN to Berkeley.

Mary K’s accomplishments were recognised by numerous honorary societies and awards, including the National Academy of Sciences, the American Academy of Arts and Sciences, and the J. J. Sakurai Prize for Theoretical Particle Physics of the American Physical Society. She served on numerous governmental and academic advisory panels, including six years on the National Science Board. She tells her own story in a memoir, A Singularly Unfeminine Profession, published in 2015. Mary K Gaillard will surely be remembered when the final history of elementary particle physics is written.

Fritz Caspers 1950–2025

Friedhelm “Fritz” Caspers, a master of beam cooling, passed away on 12 March 2025.

Born in Bonn, Germany in 1950, Fritz studied electrical engineering at RWTH Aachen. He joined CERN in 1981, first as a fellow and then as a staff member. During the 1980s Fritz contributed to stochastic cooling in CERN’s antiproton programme. In the team of Georges Carron and Lars Thorndahl, he helped devise ultra-fast microwave stochastic cooling systems for the then new antiproton cooler ring. He also initiated the development of power field-effect transistors that are still operational today in CERN’s Antiproton Decelerator ring. Fritz conceived novel geometries for pickups and kickers, such as slits cut into ground plates, as now used for the GSI FAIR project, and meander-type electrodes. From 1988 to 1995, Fritz was responsible for all 26 stochastic-cooling systems at CERN. In 1990 he became a senior member of the Institute of Electrical and Electronics Engineers (IEEE), before being distinguished as an IEEE Life Fellow later in his career.

Pioneering diagnostics

In the mid-2000s, Fritz proposed enamel-based clearing electrodes and initiated pertinent collaborations with several German companies. At about the same time, he carried out ultrasound diagnostics on soldered junctions on LHC interconnects. Among the roughly 1000 junctions measured, he and his team found a single non-conforming junction. In 2008 Fritz suggested non-elliptical superconducting crab cavities for the HL-LHC. He also proposed and performed pioneering electron-cloud diagnostics and mitigation using microwaves. For the LHC, he predicted a “magnetron effect”, where coherently radiating cloud electrons might quench the LHC magnets at specific values of their magnetic field. His advice was highly sought after on laboratory impedance measurements and electromagnetic interference.

Throughout the past three decades, Fritz was active and held in high esteem not only at CERN but all around the world. For example, he helped develop the stochastic cooling systems for GSI in Darmstadt, Germany, where his main contact was Fritz Nolden. He contributed to the construction and commissioning of stochastic cooling for GSI’s Experimental Storage Ring, including the successful demonstration of the stochastic cooling of heavy ions in 1997. Fritz also helped develop the stochastic cooling of rare isotopes for the RI Beam Factory project at RIKEN, Japan.

He helped develop the power field-effect transistors still operational today in CERN’s AD ring

Fritz was a long-term collaborator of IMP Lanzhou at the Chinese Academy of Sciences (CAS). In 2015, stochastic cooling was commissioned at the Cooling Storage Ring with his support. Always kind and willing to help anyone who needed him, Fritz also provided valuable suggestions and hands-on experience with impedance measurements for IMP’s HIAF project, especially the titanium-alloy-loaded thin-wall vacuum chamber and magnetic-alloy-loaded RF cavities. In 2021, Fritz was elected as a Distinguished Scientist of the CAS President’s International Fellowship Initiative and awarded the Dieter Möhl Award by the International Committee for Future Accelerators for his contributions to beam cooling.

In 2013, the axion dark-matter research centre IBS-CAPP was established at KAIST, Korea. For this new institute, Fritz proved to be just the right lecturer. Every spring, he visited Korea for a week of intensive lectures on RF techniques, noise measurements and much more. His lessons, which were open to scientists from all over Korea, transformed Korean researchers from RF amateurs into professionals, and his contributions helped propel IBS-CAPP to the forefront of research.

Fritz was far more than just a brilliant scientist. He was a generous mentor, a trusted colleague and a dear friend who lit up a room when he entered, and his absence will be deeply felt by all of us who had the privilege of knowing him. Always on the hunt for novel ideas, Fritz was a polymath and a fully open-minded scientist. His library at home was a visit into the unknown, containing “dark matter”, as we often joked. We will remember Fritz as a gentleman who was full of inspiration for the young and the not-so-young alike. His death is a loss to the whole accelerator world.

Sandy Donnachie 1936–2025

Sandy Donnachie, a particle theorist and scientific leader, passed away on 7 April 2025.

Born in 1936 and raised in Kilmarnock, Scotland, Sandy received his BSc and PhD degrees from the University of Glasgow before taking up a lectureship at University College London in 1963. He was a CERN research associate from 1965 to 1967, and then senior lecturer at the University of Glasgow until 1969, when he took up a chair at the University of Manchester and played a leading role in developing the scientific programme at NINA, the electron synchrotron at the nearby Daresbury National Laboratory. Sandy then served as head of the Department of Physics and Astronomy at the University from 1989 to 1994, and as dean of the Faculty of Science and Engineering from 1994 to 1997. He had a formidable reputation – if a staff member or student asked to see him, he would invite them to come at 8 a.m., to test whether what they wanted to discuss was truly important.

Sandy played a leading role in the international scientific community, maintaining strong connections with CERN throughout his career as scientific delegate to the CERN Council from 1989 to 1994, chair of the SPS Committee from 1988 to 1992 and member of the CERN Scientific Policy Committee from 1988 to 1993. In the UK, he chaired the Nuclear Physics Board from 1989 to 1993 and served as a member of the Science and Engineering Research Council from 1989 to 1994. He also served as an associate editor for Physical Review Letters from 2010 to 2016. In recognition of his leadership and scientific contributions, he was awarded the UK Institute of Physics’ Glazebrook Medal in 1997.

The “Donnachie–Landshoff pomeron” is known to all those working in the field

Sandy is perhaps best known for his body of work with Peter Landshoff on elastic and diffractive scattering: the “Donnachie–Landshoff pomeron” is known to all those working in the field. The collaboration began half a century ago, and when email became available they were among its earliest and most enthusiastic users. Sandy only knew Fortran and Peter only knew C, but somehow they managed to collaborate and together wrote more than 50 publications, including a book, Pomeron Physics and QCD, with Günter Dosch and Otto Nachtmann, published in 2004. The collaboration lasted until, sadly, Sandy was struck by Parkinson’s disease and was no longer able to use email. Earlier in his career, Sandy had made significant contributions to the field of low-energy hadron scattering, in particular through a collaboration with Claud Lovelace, which revealed many hitherto unknown baryon states in pion–nucleon scattering, and through a series of papers on meson photoproduction, initially with Graham Shaw and then with Frits Berends and other co-workers.

Throughout his career, Sandy was notable for his close collaborations with experimental physics groups, including a long association with the Omega Photon Collaboration at CERN, with whom he co-authored 27 published papers. He and Shaw also produced three books, culminating in Electromagnetic Interactions and Hadronic Structure with Frank Close, which was published in 2007.

In his leisure time, Sandy was a great lover of classical music and a keen sailor, golfer and country walker.

Fritz A Ferger 1933–2025

Fritz Ferger, a multi-talented engineer who had a significant impact on the technical development and management of CERN, passed away on 22 March 2025.

Born in Reutlingen, Germany, on 5 April 1933, Fritz obtained his electrical engineering degree in Stuttgart and a doctorate at the University of Grenoble. With a contract from General Electric in his pocket, he visited CERN, curious about the 25 GeV Proton Synchrotron, whose construction was receiving its finishing touches in the late 1950s. He met senior CERN staff and was offered a contract that he, impressed by the visit, accepted in early 1959.

Fritz’s first assignment was the development of a radio-frequency (RF) accelerating cavity for a planned fixed-field alternating-gradient (FFAG) accelerator. This was abandoned in early 1960 in favour of the study of a 2 × 25 GeV proton–proton collider, the Intersecting Storage Rings (ISR). As a first step, the CERN Electron Storage and Accumulation Ring (CESAR) was constructed to test high-vacuum technology and RF accumulation schemes; Fritz designed and constructed the RF system. With CESAR in operation, he moved on to the construction and tests of the high-power RF system of the ISR, a project that was approved in 1965.

After the smooth running-in of the ISR, and having for a while been responsible for the General Engineering Group, he became division leader of the ISR in 1974, a position he held until 1982. Under his leadership the ISR unfolded its full potential, with proton beam currents up to 50 A and a luminosity 35 times the design value, leading CERN to acquire the confidence that colliders were the way to go. Thanks to his foresight, the development of new technologies for the accelerator was encouraged, including superconducting quadrupoles and pumping by cryo- and getter surfaces. Both were applied on a grand scale in LEP and are still essential for the LHC today.

Under his ISR leadership CERN acquired the confidence that colliders were the way to go

When the resources of the ISR Division were refocussed on LEP in 1983, Fritz became the leader of the Technical Inspection and Safety Commission. This absorbed the activities of the previous health and safety groups, but its main task was to scrutinise the LEP project from all technical and safety aspects. Fritz’s responsibility widened considerably when he became leader of the Technical Support Division in 1986. All of CERN’s civil engineering – the tunnelling for the 27 km-circumference LEP ring and its auxiliary tunnels, the concreting of the enormous caverns for the experiments and the construction of a dozen surface buildings – was in full swing and was brought to a successful conclusion in the following years. New buildings on the Meyrin site were added, including the attractive Building 40 for the large experimental groups, in which he took particular pride. At the same time, and under pressure to reduce expenditure, he had to manage several difficult outsourcing contracts.

When he retired in 1997, he could look back on almost 40 years dedicated to CERN, his scientific and technical competence paired with exceptional organisational and administrative talent. We shall always remember him as an exacting colleague with a wide range of interests, and as a friend, appreciated for his open and helpful attitude.

We grieve his loss and offer our sincere condolences to his widow Catherine and their daughters Sophie and Karina.

The minimalism of many worlds

Physicists have long been suspicious of the “quantum measurement problem”: the supposed puzzle of how to make sense of quantum mechanics. Everyone agrees (don’t they?) on the formalism of quantum mechanics (QM); any additional discussion of the interpretation of that formalism can seem like empty words. And Hugh Everett III’s infamous “many-worlds interpretation” looks more dubious than most: not just unneeded words but unneeded worlds. Don’t waste your time on words or worlds; shut up and calculate.

But the measurement problem has driven more than philosophy. Questions of how to understand QM have always been entangled, so to speak, with questions of how to apply and use it, and even how to formulate it; the continued controversies about the measurement problem are also continuing controversies in how to apply, teach and mathematically describe QM. The Everett interpretation emerges as the natural reading of one strategy for doing QM, which I call the “decoherent view” and which has largely supplanted the rival “lab view”, and so – I will argue – the Everett interpretation can and should be understood not as a useless adjunct to modern QM but as part of the development in our understanding of QM over the past century.

The view from the lab

The lab view has its origins in the work of Bohr and Heisenberg, and it takes the word “observable” that appears in every QM textbook seriously. In the lab view, QM is not a theory like Newton’s or Einstein’s that aims at an objective description of an external world subject to its own dynamics; rather, it is essentially, irreducibly, a theory of observation and measurement. Quantum states, in the lab view, do not represent objective features of a system in the way that (say) points in classical phase space do: they represent the experimentalist’s partial knowledge of that system. The process of measurement is not something to describe within QM: ultimately it is external to QM. And the so-called “collapse” of quantum states upon measurement represents not a mysterious stochastic process but simply the updating of our knowledge upon gaining more information.

Valued measurements

The lab view has led to important physics. In particular, the “positive operator valued measure” idea, central to many aspects of quantum information, emerges most naturally from the lab view. So do the many extensions, total and partial, to QM of concepts initially from the classical theory of probability and information. Indeed, in quantum information more generally it is arguably the dominant approach. Yet outside that context, it faces severe difficulties. Most notably: if quantum mechanics describes not physical systems in themselves but some calculus of measurement results, if a quantum system can be described only relative to an experimental context, what theory describes those measurement results and experimental contexts themselves?

Dynamical probes

One popular answer – at least in quantum information – is that measurement is primitive: no dynamical theory is required to account for what measurement is, and the idea that we should describe measurement in dynamical terms is just another Newtonian prejudice. (The “QBist” approach to QM fairly unapologetically takes this line.)

One can criticise this answer on philosophical grounds, but more pressingly: that just isn’t how measurement is actually done in the lab. Experimental kit isn’t found scattered across the desert (each device perhaps stamped by the gods with the self-adjoint operator it measures); it is built using physical principles (see “Dynamical probes” figure). The fact that the LHC measures the momentum and particle spectra of various decay processes, for instance, is something established through vast amounts of scientific analysis, not something simply posited. We need an account of experimental practice that allows us to explain how measurement devices work and how to build them.

Perhaps this was viable in the 1930s, but today measurement devices rely on quantum principles

Bohr had such an account: quantum measurements are to be described through classical mechanics. The classical is ineliminable from QM precisely because it is to classical mechanics we turn when we want to describe the experimental context of a quantum system. To Bohr, the quantum–classical transition is a conceptual and philosophical matter as much as a technical one, and classical ideas are unavoidably required to make sense of any quantum description.

Perhaps this was viable in the 1930s. But today it is not only the measured systems but the measurement devices themselves that essentially rely on quantum principles, beyond anything that classical mechanics can describe. And so, whatever the philosophical strengths and weaknesses of this approach – or of the lab view in general – we need something more to make sense of modern QM, something that lets us apply QM itself to the measurement process.

Practice makes perfect

We can look to physics practice to see how. As von Neumann glimpsed, and Everett first showed clearly, nothing prevents us from modelling a measurement device itself inside unitary quantum mechanics. When we do so, we find that the measured system becomes entangled with the device, so that (for instance) if a measured atom is in a weighted superposition of spins with respect to some axis, after measurement then the device is in a similarly-weighted superposition of readout values.
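To make this concrete, here is a minimal numerical sketch (my own illustration, not from the article) of a two-level “atom” measured by a two-level “pointer”, with the measurement interaction modelled as a CNOT-style unitary; the amplitudes 0.6 and 0.8 are arbitrary choices:

```python
# Toy unitary measurement: atom in a superposition, pointer starts "ready".
# Assumed/illustrative values: amplitudes 0.6 and 0.8, CNOT as the interaction.
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

atom = 0.6 * ket0 + 0.8 * ket1        # weighted superposition of spins/levels
pointer = ket0                         # measurement device in its "ready" state
initial = np.kron(atom, pointer)       # joint state, basis |atom, pointer>

# CNOT: flips the pointer if and only if the atom is in |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

final = CNOT @ initial                 # 0.6|0,0> + 0.8|1,1>: atom and pointer entangled

print(final)                           # [0.6 0.  0.  0.8]
print(final[0]**2, final[3]**2)        # ~0.36 and ~0.64: mod-squared readout weights
```

The final state is a similarly weighted superposition of the two readout values, with mod-squared amplitudes 0.36 and 0.64 inherited from the atom, just as the unitary treatment of measurement predicts.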

Origins

In principle, this courts infinite regress: how is that new superposition to be interpreted, save by a still-larger measurement device? In practice, we simply treat the mod-squared amplitudes of the various readout values as probabilities, and compare them with observed frequencies. This sounds a bit like the lab view, but there is a subtle difference: these probabilities are understood not with respect to some hypothetical measurement, but as the actual probabilities of the system being in a given state.

Of course, if we could always understand mod-squared amplitudes that way, there would be no measurement problem! But interference precludes this. Set up, say, a Mach–Zehnder interferometer, with a particle beam split in two and then re-interfered, and two detectors after the re-interference (see “Superpositions are not probabilities” figure). We know that if either of the two paths is blocked, so that any particle detected must have gone along the other path, then each of the two outcomes is equally likely: for each particle sent through, detector A fires with 50% probability and detector B with 50% probability. So whichever path the particle went down, we get A with 50% probability and B with 50% probability. And yet we know that if the interferometer is properly tuned and both paths are open, we can get A with 100% probability or 0% probability or anything in between. Whatever microscopic superpositions are, they are not straightforwardly probabilities of classical goings-on.
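A short sketch of this interferometer arithmetic (again my own illustration, modelling the two beamsplitters as Hadamard-like 2×2 matrices with an adjustable relative phase phi between the arms) shows both behaviours:

```python
# Mach-Zehnder toy model. With both arms open the detector probabilities depend
# on the relative phase and can lie anywhere between 0 and 1; blocking either
# arm forces a 50/50 split among the detected particles. Function name and
# parameters are illustrative assumptions.
import numpy as np

BS = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # 50/50 beamsplitter

def detector_probs(phi, blocked_arm=None):
    state = BS @ np.array([1.0, 0.0])               # split the input beam
    if blocked_arm is not None:
        state[blocked_arm] = 0.0                    # absorb one arm (lost norm = absorbed fraction)
    state = np.diag([1.0, np.exp(1j * phi)]) @ state  # relative phase between the arms
    state = BS @ state                              # recombine at the second beamsplitter
    return np.abs(state) ** 2                       # probabilities at detectors A and B

print(detector_probs(0.0))                  # ~[1, 0]: every particle reaches detector A
print(detector_probs(np.pi))                # ~[0, 1]: every particle reaches detector B
print(detector_probs(0.0, blocked_arm=0))   # ~[0.25, 0.25]: a 50/50 split among surviving particles
```

The which-path and both-paths cases are incompatible with reading the arm amplitudes as classical probabilities, which is exactly the point of the interference argument.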

Unfeasible interference

But macroscopic superpositions are another matter. There, interference is unfeasible (good luck reinterfering the two states of Schrödinger’s cat); nothing formally prevents us from treating mod-squared amplitudes like probabilities.

And decoherence theory has given us a clear understanding of just why interference is invisible in large systems, and more generally when we can and cannot get away with treating mod-squared amplitudes as probabilities. As the work of Zeh, Zurek, Gell-Mann, Hartle and many others (drawing inspiration from Everett and from work on the quantum/classical transition as far back as Mott) has shown, decoherence – that is, the suppression of interference – is simply an aspect of non-equilibrium statistical mechanics. The large-scale, collective degrees of freedom of a quantum system, be it the needle on a measurement device or the centre-of-mass of a dust mote, are constantly interacting with a much larger number of small-scale degrees of freedom: the short-wavelength phonons inside the object itself; the ambient light; the microwave background radiation. We can still find autonomous dynamics for the collective degrees of freedom, but because of the constant transfer of information to the small scale, the coherence of any macroscopic superposition rapidly bleeds into microscopic degrees of freedom, where it is dynamically inert and in practice unmeasurable.
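The following toy calculation (my own, loosely in the spirit of standard spin-environment decoherence models, with arbitrary small coupling angles) illustrates how the interference term of a superposed system shrinks as more environmental degrees of freedom record which branch it is in:

```python
# Toy decoherence model: a system qubit in (|0> + |1>)/sqrt(2) nudges each
# environment qubit by a small branch-dependent rotation (+theta or -theta).
# The system's off-diagonal density-matrix element is 0.5 times the product of
# the environment-branch overlaps cos(2*theta_k); the diagonal weights 0.5, 0.5
# are untouched. Angles and the 0.3 cutoff are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def off_diagonal(n_env):
    thetas = rng.uniform(0.0, 0.3, size=n_env)   # small, random coupling angles
    overlaps = np.cos(2.0 * thetas)              # overlap of the two branch states per qubit
    return 0.5 * np.prod(overlaps)               # rho_01 of the system qubit

for n in (1, 10, 100, 1000):
    print(n, off_diagonal(n))
# The magnitude of rho_01 falls rapidly towards zero as n grows: interference
# becomes invisible, and the mod-squared amplitudes behave as probabilities.
```

No information is deleted in this model; the coherence has simply been spread over many environmental degrees of freedom, where it is dynamically inert and in practice unmeasurable.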

Emergence and scale

Decoherence can be understood in the familiar language of emergence and scale separation. Quantum states are not fundamentally probabilistic, but they are emergently probabilistic. That emergence occurs because for macroscopic systems, the timescale by which energy is transferred from macroscopic to residual degrees of freedom is very long compared to the timescale of the macroscopic system’s own dynamics, which in turn is very long compared to the timescale by which information is transferred. (To take an extreme example, information about the location of the planet Jupiter is recorded very rapidly in the particles of the solar wind, or even the photons of the cosmic background radiation, but Jupiter loses only an infinitesimal fraction of its energy to either.) So the system decoheres very rapidly, but having done so it can still be treated as autonomous.

On this decoherent view of QM, there is ultimately only the unitary dynamics of closed systems; everything else is a limiting or special case. Probability and classicality emerge through dynamical processes that can be understood through known techniques of physics: understanding that emergence may be technically challenging but poses no problem of principle. And this means that the decoherent view can address the lab view’s deficiencies: it can analyse the measurement process quantum mechanically; it can apply quantum mechanics even in cosmological contexts where the “measurement” paradigm breaks down; it can even recover the lab view within itself as a limited special case. And so it is the decoherent view, not the lab view, that – I claim – underlies the way quantum theory is for the most part used in the 21st century, including in its applications in particle physics and cosmology (see “Two views of quantum mechanics” table).

Two views of quantum mechanics

Quantum phenomenon | Lab view | Decoherent view
Dynamics | Unitary (i.e. governed by the Schrödinger equation) only between measurements | Always unitary
Quantum/classical transition | Conceptual jump between fundamentally different systems | Purely dynamical: classical physics is a limiting case of quantum physics
Measurements | Cannot be treated internal to the formalism | Just one more dynamical interaction
Role of the observer | Conceptually central | Just one more physical system

But if the decoherent view is correct, then at the fundamental level there is neither probability nor wavefunction collapse; nor is there a fundamental difference between a microscopic superposition like those in interference experiments and a macroscopic superposition like Schrödinger’s cat. The differences are differences of degree and scale: at the microscopic level, interference is manifest; as we move to larger and more complex systems it hides away more and more effectively; in practice it is invisible for macroscopic systems. But even if we cannot detect the coherence of the superposition of a live and dead cat, it does not thereby vanish. And so according to the decoherent view, the cat is simultaneously alive and dead in the same way that the superposed atom is simultaneously in two places. We don’t need a change in the dynamics of the theory, or even a reinterpretation of the theory, to explain why we don’t see the cat as alive and dead at once: decoherence has already explained it. There is a “live cat” branch of the quantum state, entangled with its surroundings to an ever-increasing degree; there is likewise a “dead cat” branch; the interference between them is rendered negligible by all that entanglement.

Many worlds

At last we come to the “many worlds” interpretation: for when we observe the cat ourselves, we too enter a superposition of seeing a live and a dead cat. But these “worlds” are not added to QM as exotic new ontology: they are discovered, as emergent features of collective degrees of freedom, simply by working out how to use QM in contexts beyond the lab view and then thinking clearly about its content. The Everett interpretation – the many-worlds theory – is just the decoherent view taken fully seriously. Interference explains why superpositions cannot be understood simply as parameterising our ignorance; unitarity explains how we end up in superpositions ourselves; decoherence explains why we have no awareness of it.

Superpositions are not probabilities

(Forty-five years ago, David Deutsch suggested testing the Everett interpretation by simulating an observer inside a quantum computer, so that we could recohere them after they made a measurement. Then, it was science fiction; in this era of rapid progress on AI and quantum computation, perhaps less so!)

Could we retain the decoherent view and yet avoid any commitment to “worlds”? Yes, but only in the same sense that we could retain general relativity and yet refuse to commit to what lies behind the cosmological event horizon: the theory gives a perfectly good account of the other Everett worlds, and the matter beyond the horizon, but perhaps epistemic caution might lead us not to overcommit. But even so, the content of QM includes the other worlds, just as the content of general relativity includes beyond-horizon physics, and we will only confuse ourselves if we avoid even talking about that content. (Thus Hawking, who famously observed that when he heard about Schrödinger’s cat he reached for his gun, was nonetheless happy to talk about Everettian branches when doing quantum cosmology.)

Alternative views

Could there be a different way to make sense of the decoherent view? Never say never; but the many-worlds perspective results almost automatically from simply taking that view as a literal description of quantum systems and how they evolve, so any alternative would have to be philosophically subtle, taking a different and less literal reading of QM. (Perhaps relationalism, discussed in this issue by Carlo Rovelli, see “Four ways to interpret quantum mechanics“, offers a way to do it, though in many ways it seems more a version of the lab view. The physical collapse and hidden variables interpretations modify the formalism, and so fall outside either category.)

The Everett interpretation is just the decoherent view taken fully seriously

Does the apparent absurdity, or the ontological extravagance, of the Everett interpretation force us, as good scientists, to abandon many-worlds, or if necessary the decoherent view itself? Only if we accept some scientific principle that throws out theories that are too strange or that postulate too large a universe. But physics accepts no such principle, as modern cosmology makes clear.

Are there philosophical problems for the Everett interpretation? Certainly: how are we to think of the emergent ontology of worlds and branches; how are we to understand probability when all outcomes occur? But problems of this kind arise across all physical theories. Probability is philosophically contested even apart from Everett, for instance: is it frequency, rational credence, symmetry or something else? In any case, these problems pose no barrier to the use of Everettian ideas in physics.

The case for the Everett interpretation is that it is the conservative, literal reading of the version of quantum mechanics we actually use in modern physics, and there is no scientific pressure for us to abandon that reading. We could, of course, look for alternatives. Who knows what we might find? Or we could shut up and calculate – within the Everett interpretation.
