Quantum field theory unites quantum physics with special relativity. It is the framework of the Standard Model (SM), which describes the electromagnetic, weak and strong interactions as gauge forces, mediated by photons, gluons and W and Z bosons, plus additional interactions mediated by the Higgs field. The success of the SM has exceeded all expectations, and its mathematical structure has led to a number of impressive predictions. These include the existence of the charm quark, discovered in 1974, and the existence of the Higgs boson, discovered in 2012.
Uncovering Quantum Field Theory and the Standard Model, by Wolfgang Bietenholz of the National Autonomous University of Mexico and Uwe-Jens Wiese of the University of Bern, explains the foundations of quantum field theory in great depth, from classical field theory and canonical quantisation to regularisation and renormalisation, via path integrals and the renormalisation group. What really makes the book special are its frequent discussions of the relations to statistical mechanics and condensed-matter physics.
Riding a wave
The section on particles and “wavicles” is highly original. In quantum field theory, quantised excitations of fields cannot be interpreted as point-like particles. Unlike massive particles in non-relativistic quantum mechanics, these excitations have non-trivial localisation properties, which apply to photons and electrons alike. To emphasise the difference between non-relativistic particles and wave excitations in a relativistic theory, one may refer to them as “wavicles”, following Frank Wilczek. As discussed in chapter 3, an intuitive understanding of wavicles can be gained by the analogy to phonons in a crystal. Another remarkable feature of charged fields is the infinite extension of their excitations due to their Coulomb field. This means that any charged state necessarily includes an infrared cloud of soft gauge bosons. As a result, they cannot be described by ordinary one-particle states and are referred to as “infraparticles”. Their properties, along with the related “superselection sectors,” are explained in the section on scalar quantum electrodynamics.
The SM can be characterised as a non-abelian chiral gauge theory. Bietenholz and Wiese explain the various aspects of chirality in great detail. Anomalies in global and local symmetries are carefully discussed in the continuum as well as on a space–time lattice, based on the Ginsparg–Wilson relation and Lüscher’s lattice chiral symmetry. Confinement of quarks and gluons, the hadron spectrum, the parton model and hard processes, chiral perturbation theory and deconfinement at high temperatures uncover perturbative and non-perturbative aspects of quantum chromodynamics (QCD), the theory of strong interactions. Numerical simulations of strongly coupled lattice Yang–Mills theories are very demanding. During the past four decades, much progress has been made in turning lattice QCD into a quantitatively reliable tool by controlling statistical and systematic uncertainties, which is clearly explained to the critical reader. The treatment of QCD is supplemented by an introduction to the electroweak theory covering the Higgs mechanism, electroweak symmetry breaking and the flavour physics of quarks and leptons.
The number of quark colours, which is three in nature, plays a prominent role in this book. At the quantum level, gauge symmetries can fail due to anomalies, rendering a theory inconsistent. The SM is free of anomalies, but this only works because of a delicate interplay between quark and lepton charges and the number of colours. An important example of this interplay is the decay of the neutral pion into two photons. The subtleties of this process are explained in chapter 24.
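To see how the counting works (a standard back-of-the-envelope argument, not drawn from the book itself), one of the anomaly-cancellation conditions requires the hypercharges of the left-handed doublets of a single generation to sum to zero, which ties the number of colours Nc to the quark and lepton charges:

\[
\sum_{\text{doublets}} Y \;=\; N_c\,Y_Q + Y_L \;=\; N_c \cdot \tfrac{1}{6} \;-\; \tfrac{1}{2} \;=\; 0
\quad\Longrightarrow\quad N_c = 3,
\]

in the convention where the left-handed quark doublet carries hypercharge Y_Q = 1/6 and the lepton doublet Y_L = −1/2.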
Most remarkably, the SM predicts baryon-number-violating processes. These arise from the vacuum structure of the weak SU(2) gauge fields, which involves topologically distinct field configurations. Quantum tunnelling between them, together with the anomaly in the baryon-number current, leads to baryon-number-violating transitions, as discussed in chapter 26. Similarly, in QCD a non-trivial topology of the gluon field leads to an explicit breaking of the flavour-singlet axial symmetry and, consequently, to the mass of the η′ meson. Moreover, the gauge-field topology gives rise to an additional parameter in QCD, the vacuum angle θ. This parameter induces an electric dipole moment of the neutron, which is subject to a stringent experimental upper bound, confronting us with the strong-CP problem: what constrains θ to be so tiny that this bound is satisfied? A solution may be provided by the Peccei–Quinn symmetry and axions, as discussed in a dedicated chapter.
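To give a sense of the numbers (order-of-magnitude estimates quoted from the standard literature rather than from the book), QCD predicts a neutron electric dipole moment of roughly

\[
d_n \;\sim\; \theta \times 10^{-16}\; e\,\mathrm{cm},
\]

so the experimental bound of order 10⁻²⁶ e cm forces θ ≲ 10⁻¹⁰, even though any value between 0 and 2π would a priori be allowed.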
By analogy with the QCD vacuum angle, one can introduce a CP-violating electromagnetic parameter θ into the SM – even though it has no physical effect in pure QED. This brings us to a gem of the book: its discussion of the Witten effect. In the presence of such a θ, the electric charge of a magnetic monopole becomes θ/2π plus an integer, in units of the elementary charge. This leads to the remarkable conclusion that for non-zero θ, all monopoles become dyons, carrying both electric and magnetic charge.
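Written as a formula (a sketch of the standard result, with sign conventions differing between texts), a monopole of minimal magnetic charge carries electric charge

\[
q \;=\; e\left(n + \frac{\theta}{2\pi}\right), \qquad n \in \mathbb{Z},
\]

so for any θ that is not a multiple of 2π the electric charge cannot vanish and every monopole is necessarily a dyon.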
The SM is an effective low-energy theory and we do not know at what energy scale elements of a more fundamental theory will become visible. Its gauge structure and quark and lepton content hint at a possible unification of the interactions into a larger gauge group, which is discussed in the final chapter. Once gravity is included, one is confronted with a hierarchy problem: the question of why the electroweak scale is so small compared to the Planck mass, at which the Compton wavelength of a particle and its Schwarzschild radius coincide and quantum gravitational effects can no longer be ignored. Perhaps solving the electroweak hierarchy puzzle requires working with supersymmetric theories. For all students and scientists struggling with the SM and exploring possible extensions, the nine appendices will be a very valuable source of information for their research.
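The Planck mass invoked here can be estimated in one line (a back-of-the-envelope sketch, not taken from the book): equating the reduced Compton wavelength ħ/(mc) with the Schwarzschild radius 2Gm/c² gives

\[
\frac{\hbar}{mc} \;\sim\; \frac{2Gm}{c^2}
\quad\Longrightarrow\quad
m \;\sim\; \sqrt{\frac{\hbar c}{G}} \;\approx\; 10^{19}\;\mathrm{GeV}/c^2,
\]

up to factors of order one – some 17 orders of magnitude above the electroweak scale, which is the hierarchy in question.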
Quantum entanglement is the quantum phenomenon par excellence. Our world is a quantum world: the matter that we see and touch is the most obvious consequence of quantum physics, and it wouldn’t really exist the way it does in a purely classical world. In modern parlance, however, when we talk about quantum sensors or quantum computing, what makes these things “quantum” is the exploitation of entanglement. Entanglement was first discussed by Einstein and Schrödinger, and became famous with the celebrated EPR (Einstein–Podolsky–Rosen) paper of 1935.
The magic of entanglement
In an entangled particle system, some properties have to be assigned to the system itself and not to individual particles. When a neutral pion decays into two photons, for example, conservation of angular momentum requires their total spin to be zero. Since the photons travel in opposite directions in the pion’s rest frame, in order for their spins to cancel they must share the same “helicity”. Helicity is the spin projection along the direction of motion, and only two states are possible: left- or right-handed. If one photon is measured to be left-handed, the other must be left-handed as well. The entangled photons must be thought of as a single quantum object: neither do the individual particles have predefined spins nor does the measurement performed on one cause the other to pick a spin orientation. Experiments in more complicated systems have ruled these possibilities out, at least in their simplest incarnations, and this is exactly where the magic of entanglement begins.
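For readers who like to see the state written down (a sketch in standard notation, not taken from the book), the two photons emerge in the helicity-entangled superposition

\[
|\psi\rangle \;=\; \frac{1}{\sqrt{2}}\Big(\,|L\rangle_1 |L\rangle_2 \;-\; |R\rangle_1 |R\rangle_2\,\Big),
\]

where the relative sign is fixed by the pion’s negative parity. Neither photon has a definite helicity on its own, yet measuring one immediately determines what will be found for the other.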
Quantum entanglement is the main topic of Einstein’s Entanglement by William Stuckey, Michael Silberstein and Timothy McDevitt, all currently teaching at Elizabethtown College, Pennsylvania. The trio have complementary expertise in physics, philosophy and maths, and this is not their first book on the foundations of physics. They aim to explain why entanglement is so puzzling to physicists and the various ways that have been employed over the years to explain (or even explain away) the phenomenon. They also want to introduce readers to their own idea of how to solve the riddle and argue for its merits.
General readers may struggle in places. The book does have accessible chapters, for example one at the start with a quantum-gloves experiment – a nice way to introduce the reader to the problem – as well as a chapter on special relativity. Much of the discussion about quantum mechanics, however, uses advanced concepts such as Hilbert space and the Bloch sphere, which belong to an undergraduate course in quantum mechanics. Philosophical terminology, such as “wave-function realism”, is also used copiously. The explanations and the discussion provided are of good quality, and a reader interested in the interpretations of quantum mechanics, with some background in physics, has a lot to gain. The authors quote extensively from a superb list of references and include many interesting historical facts that make reading the book very entertaining.
In general, the book criticises constructive approaches to interpreting quantum mechanics that explicitly postulate physical phenomena. In the example of neutral-pion decay that I gave previously, the case in which the measurement of one photon causes the other photon to pick a spin would require a constructive explanation. These can be contrasted with principle explanations, which may involve, for example, invoking an overarching symmetry. To quote an example that is used many times in the book, the relativity principle can be used to explain Lorentz length contraction without the need for a physical mechanism to contract the bodies, which would require a constructive explanation.
The authors make the claim that the conceptual issues with entanglement can be solved by sticking to principle explanations and, in particular, with the demand that Planck’s constant is measured to be the same in all inertial reference frames. Whether this simple suggestion is adequate to explain the mysteries of quantum mechanics, I will leave to the reader. Seneca wrote in his Natural Questions that “our descendants will be astonished at our ignorance of what to them is obvious”. If the authors are correct, entanglement may prove to be a case in point.
John Peoples, the third director of Fermilab, who guided the lab through one of the most critical periods in its history, passed away on 25 June 2025. Born in New York City on 22 January 1933, John received his bachelor’s degree in electrical engineering from the Carnegie Institute of Technology (now Carnegie Mellon University) in 1955. After several years at the Glen L. Martin Company, John entered Columbia University where he received his PhD in physics in 1966 for the measurement of the Michel parameter in muon decay under the direction of Allan Sachs. This was followed by a teaching and research position at Cornell University and relocation to Fermilab, initially on sabbatical, in 1971.
John officially joined the Fermilab staff in 1975 as head of the Research Division. His tenure included the discovery of the upsilon particle (a b-quark bound state) by Leon Lederman’s team in 1977. He was also responsible for upgrading the experimental areas to accept beams of up to 1 TeV in anticipation of the completion of the Fermilab Tevatron.
In 1981, following Lederman’s decision to utilise the Tevatron as a proton–antiproton collider, John was appointed head of the TeV-I Project, with responsibility for the construction of the Antiproton Source and the collision hall for the CDF detector. Under John’s leadership, a novel design was developed, building on the earlier pioneering work done at CERN for antiproton accumulation based on stochastic cooling, and proton–antiproton collisions were achieved in the Tevatron four years later, in 1985.
Tireless commitment
John succeeded Lederman to become Fermilab’s third director in July 1989, shortly after the decision to locate the Superconducting Super Collider (SSC) in Waxahachie, Texas, which created immense challenges for Fermilab’s future. John guided the US community to a plan for a new accelerator, the Main Injector (and ultimately the Recycler), that could support a high-luminosity collider programme during the decade of SSC construction while simultaneously providing high-intensity extracted beams for a future neutrino programme that could sustain Fermilab well beyond the SSC’s startup. The cancellation of the SSC in 1993 was a seismic event for US and global high-energy physics, and ensured the Tevatron’s role as the highest-energy collider in the world for almost two decades. John was asked to lead the termination phase of the SSC lab. In 1994/1995, as director of both Fermilab and the SSC, he worked on this painful task with a special emphasis on helping the many suddenly unemployed people find new career paths.
During John’s tenure as director, Fermilab produced many important physics results. In 1995 the Tevatron Collider experiments, CDF and DØ, announced the discovery of the top quark, the final quark predicted in the Standard Model of particle physics, with a mass more than 175 times that of the proton. To ensure that the experiments could analyse their data quickly and efficiently, John supported replacing costly mainframe computers with “clusters” of inexpensive microprocessors developed in industry for personal computers, and later laptops and phones. The final fixed-target run with 800 GeV extracted beam in 1997 and 1998 helped resolve an important and long-standing problem in CP violation in kaon decays and led to the discovery of the tau neutrino.
From 1993 to 1997, John served as chair of the International Committee for Future Accelerators (ICFA). He stepped down after two terms as Fermilab director in 1999. In 2010 he received the Robert R. Wilson Prize for Achievement in the Physics of Particle Acceleration from the American Physical Society.
Under John’s influence, there were frequent personnel exchanges between Fermilab and CERN throughout the 1980s, as Fermilab staff benefited from CERN’s experience with antiproton production and CERN benefited from Fermilab’s experience with the operations of a superconducting accelerator. These exchanges extended into the 1990s, and following the termination of the SSC, John was instrumental in securing support for US participation in the LHC accelerator and detector projects. His leadership both enhanced international collaboration and retained a prominent role for Fermilab in collider physics after the Tevatron completed operations in 2011.
During the 1980s, astrophysics became an important contributor to our knowledge of particle physics, requiring more ambitious experiments with strong synergies with the latest round of HEP experiments. In 1991, John formed the Experimental Astrophysics Group at Fermilab. This led to its strong participation in the Sloan Digital Sky Survey (SDSS), the Pierre Auger Cosmic Ray Observatory, the Cryogenic Dark Matter Search (CDMS) and the Dark Energy Survey (DES), of which John became director in 2003. John’s vision of a vibrant community of particle physicists, astrophysicists and cosmologists exploring the inner space–outer space connection is now a reality.
Those of us who had the privilege of knowing and working with John were challenged by his intense work ethic and by the equally intense flood of new ideas for running and improving our programmes. He was a gifted and dedicated experimental physicist, skilled in accelerator science, an expert in superconducting magnet design and technology, a superb manager, and a great recruiter and mentor of young engineers and scientists, including the authors of this article. We will miss him!
Ole Hansen, a leading Danish nuclear-reaction physicist, passed away on 11 May 2025, three days short of his 91st birthday. His studies of nucleon transfer between a projectile nucleus and a target nucleus made it possible to determine the bound states in either or both nuclei and to confront them with the unified theoretical framework developed by the Danish Nobel laureates Aage Bohr and Ben Mottelson. He conducted experiments at Los Alamos in the US and Aldermaston in the UK, among other places, and developed a deep intuitive relationship with Clebsch–Gordan coefficients.
Together with Ove Nathan, Ole oversaw a proposal to build a large tandem accelerator at the Niels Bohr Institute department located at Risø, near Roskilde. The government and research authorities had supported the costly project, but it was scrapped on an afternoon in August 1978 as a last-minute saving to help establish a coalition between the two parties across the centre of Danish politics. Ole’s disappointment was enormous: he decided to take up an offer at Brookhaven National Laboratory (BNL) to continue his nuclear work there, while Nathan threw himself into university politics and later became rector of the University of Copenhagen.
Deep exploration
Ole sent his resignation as a professor at the University of Copenhagen to the Queen – a civil servant had to do so at the time – but was almost immediately confronted with demands for cutbacks at BNL, which would have stopped the research programme with the tandem accelerator there. Ole did not withdraw his resignation, but together with US colleagues proposed a research programme at very high energies, injecting ions from the tandem into the existing particle accelerator, the AGS, thereby achieving energies in the nucleon–nucleon centre-of-mass system of up to 5 GeV. This was the start of the exploration of the deeper structure of nuclear matter, which is revealed as a system of quarks and gluons at temperatures of billions of degrees. It later led to the construction of the first dedicated collider of atomic nuclei, the Relativistic Heavy Ion Collider (RHIC) in the US. Ole himself participated in the E802 and E866 experiments at BNL/AGS, and in the BRAHMS experiment at RHIC.
Ole will also be remembered as the first director, called back from the US, of the unified Niels Bohr Institute, which was established in 1993 as a fusion of the physics, astronomy and geophysics departments surrounding the Fælledparken commons in Copenhagen, after an international panel chaired by him had recommended the merger. Ole realised the necessity of merging the departments in order to create the financial room for manoeuvre needed to be able to hire new and younger researchers again. He left his mark on the new institute, which initially had to reconcile the very different cultures of the Blegdamsvej, Ørsted and Geophysics institutes. He approached the task efficiently, but with a good understanding of, and respect for, the scientific perspectives and the individual researchers.
Back in Denmark, Ole played a significant role in building the competitive research system we know today, including the establishment of the Danish National Research Foundation (DNRF), of which he was vice-chair in its first years, as well as in streamlining the institute’s research and opening up several new areas.
Strong interests
Despite the scale of all his administrative tasks, Ole maintained a lively interest in research and actively supported the establishment of the Centre for CERN Research (now the NICE National Instrument Center) together with the author of this obituary. He was also a member of the CERN Council during the exciting period when the LHC took shape.
Ole will be remembered as an open-minded, energetic and visionary man with an irreverent sense of humour that some feared but others greatly appreciated. Despite his modest manner, he influenced his colleagues with his strong interest in new physics and his sharp scepticism. If consulted, he would probably turn his nose up at the word “loyal”, but he was ever a good and loyal friend. He is survived by his wife, Ruth, and four children.
Michele Arneodo, professor of physics at the University of Piemonte Orientale and chairperson elect of the CMS Collaboration Board, passed away on 12 August 2025. He was 65.
Born in Turin in 1959, Michele graduated in physics from the University of Torino in 1982. He was awarded a Fulbright Fellowship to pursue graduate studies at Princeton University, where he received his MA in 1985 and his PhD in 1992. He began his career as a staff researcher at INFN Torino, before moving to academia as an associate professor at the University of Calabria and then, from 1995, at the University of Piemonte Orientale in Novara, where he became full professor in 2002.
Michele’s research career began with the European Muon Collaboration (NA2 and NA9) and the New Muon Collaboration (NA37) at CERN, investigating the structure of nucleons through the deep inelastic scattering of muons. He went on to play a leading role in the ZEUS experiment at DESY’s HERA collider, focusing on the diffractive physics programme, coordinating groups in Torino and Novara, and overseeing the operation of the Leading Proton Spectrometer. Awarded an Alexander von Humboldt fellowship, he worked at DESY between 1996 and 1999.
With the start of the LHC era, Michele devoted his efforts to CMS, becoming a central figure in diffractive physics and the relentless force behind the construction of the CMS Precision Proton Spectrometer (PPS) and the subsequent merging of the TOTEM and CMS collaborations. He was convener of the diffractive physics group, served on the CMS Publication and Style committees, and from 2014 chaired the Institution Board of the CMS PPS, where he was also resource manager and INFN national coordinator. He had been appointed as chairperson of the CMS Collaboration Board, a role that he was due to begin this year.
Teaching was central to Michele’s vocation. At the University of Piemonte Orientale, he developed courses on radiation physics for medical students and radiology specialists, building bridges between particle physics and medical applications. He was also widely recognised as a dedicated mentor, always attentive to the careers of younger collaborators.
We will remember Michele as a very talented physicist and a genuinely kind person, who had the style and generosity of a bygone era. Always approachable, he could be found with a smile, a sincere interest in others’ well-being, and a delicate sense of humour that brought lightness to professional exchanges. His students and collaborators valued his constant encouragement and his passion for transmitting enthusiasm for physics and science.
While leaving a lasting mark on physics and on the institutions he served, Michele also cultivated enduring friendships and dedicated himself fully to his family, to whom the thoughts of the CMS and wider CERN communities go at this difficult time.
Miro Andrea Preger, a distinguished accelerator physicist in the Accelerator Division of the Frascati National Laboratories (LNF), passed away on 1 September 2025.
Originally an employee of the Italian National Committee for Nuclear Energy (CNEN), Miro had a long career as a key figure in the INFN institutions.
He made his mark at the pioneering ADONE collider in the 1970s, optimising its performance, developing an innovative luminosity monitor, and improving the machine optics and injection system. Later he served as the director of ADONE, participating in all second-generation experiments, colliding beams for particle physics and producing synchrotron radiation and gamma rays for nuclear physics.
Beyond LNF, Miro played an important role in the design of the Italian synchrotron radiation source ELETTRA in Trieste, and the ESRF in Grenoble; he also collaborated on many other accelerator projects, including CTF3 and CLIC at CERN.
Miro held many institutional roles, and as head of the Accelerator Physics Service, he taught the art and science of accelerators to many young scientists, with clarity, patience and dedication. As a mentor, he leaves a legacy of accelerator experts who have ensured the success of many LNF initiatives.
Miro made outstanding contributions to the DAΦNE collider project from the beginning, leading the design and realisation of the entire electron–positron injection system. He was deeply involved in the very challenging commissioning and in achieving the high luminosity required by the experiments.
Besides his characteristic dynamism, one of Miro’s distinctive traits was his ability to foster harmonious collaboration among technicians, technologists and researchers.
Away from physics, Miro was an excellent tennis player and skier, along with being a skilled sailor, activities that he often shared with colleagues.
In October, the Circular Electron–Positron Collider (CEPC) study group completed its full suite of technical design reports, marking a key step for China’s Higgs-factory proposal. However, CEPC will not be considered for inclusion in China’s next five-year plan (2026–2030).
“Although our proposal that CEPC be included in the next five-year plan was not successful, IHEP will continue this effort, which an international collaboration has developed for the past 10 years,” says study leader Wang Yifang, of the Institute of High Energy Physics (IHEP) in Beijing. “We plan to submit CEPC for consideration again in 2030, unless FCC is officially approved before then, in which case we will seek to join FCC, and give up CEPC.”
Electroweak precision
CEPC has been under development at IHEP since shortly after the discovery of the Higgs boson at CERN in 2012. To enable precision studies of the new particle, Chinese physicists formally proposed a dedicated electron–positron collider in September 2012. Sharing a concept similar to the Future Circular Collider (FCC) proposed in parallel at CERN, CEPC’s high-luminosity collisions would greatly improve precision in measuring Higgs and electroweak processes.
“CEPC is designed as a multi-purpose particle factory,” explains Wang. “It would not only serve as an efficient Higgs factory but would also precisely study other fundamental particles, and its tunnel can be re-used for a future upgrade to a more powerful super proton–proton collider.”
Following completion of the Conceptual Design Report in 2018, which defined the physics case and baseline layout, the CEPC collaboration entered a detailed technical phase to validate key technologies and complete subsystem designs. The accelerator Technical Design Report (TDR) was released in 2023, followed in October 2025 by the reference detector TDR, providing a mature blueprint for both components.
Compared to the 2018 detector concept, the new technical report proposes several innovations. An electromagnetic calorimeter based on orthogonally oriented crystal bars and a hadronic calorimeter based on high-granularity scintillating glass have been optimised for advanced particle-flow algorithms, improving their energy resolution by a factor of 10 and a factor of two, respectively. A tracking detector employing AC-coupled low-gain avalanche-diode technology will enable simultaneous 10 µm position and 50 ps time measurements, enhancing vertex reconstruction and flavour tagging. Meanwhile, a readout chip developed in 55 nm technology will achieve state-of-the-art performance at 65% power consumption, enabling better resolution, large-scale integration and reduced cooling-pipe material. Among other advances, a new type of high-density, high-yield scintillating glass opens the possibility of a fully absorbing hadronic calorimeter.
To ensure the scientific soundness and feasibility of the design, the CEPC Study Group established an International Detector Review Committee in 2024, chaired by Daniela Bortoletto of the University of Oxford.
Design consolidation
“After three rounds of in-depth review, the committee concluded in September 2025 that the Reference Detector TDR defines a coherent detector concept with a clearly articulated physics reach,” says Bortoletto. “The collaboration’s ambitious R&D programme and sustained technical excellence have been key to consolidating the major design choices and positioning the project to advance from conceptual design into integrated prototyping and system validation.”
CEPC’s technical advance comes amid intense international interest in participating in a Higgs factory. Alongside the circular FCC concept at CERN, Higgs factories with linear concepts have been proposed in Europe and Japan, and both Europe and the US have named constructing or participating in a Higgs factory as a strategic priority. Following China’s decision to defer CEPC, attention now turns to Europe, where the ongoing update of the European Strategy for Particle Physics will prioritise recommendations for the laboratory’s flagship collider beyond the HL-LHC. Domestically, China will consider other large science projects for the 2026 to 2030 period, including a proposed Super Tau–Charm Facility to succeed the Beijing Electron–Positron Collider II.
With completion of its core technical designs, CEPC now turns to engineering design.
“The newly released detector report is the first dedicated to a circular electron–positron Higgs factory,” says Wang. “It showcases the R&D capabilities of Chinese scientists and lays the foundation for turning this concept into reality.”
CERN Council president Costas Fountas sums up the vision of CERN’s Member States.
In March 2024, the CERN Council called on the particle-physics community to develop a visionary and concrete plan that greatly advances human knowledge in fundamental physics through the realisation of the next flagship project at CERN. This community-driven strategy will be submitted to the CERN Council in March 2026, leading to discussions among CERN Member States. The CERN Council will update the European strategy for particle physics (ESPP) based on these deliberations, with a view to approving CERN’s next flagship collider in 2028.
This third update to the ESPP builds on a process initiated by the CERN Council in 2006 and updated in 2013 and 2020. It is designed to convey to the CERN Council the views of the community on strategic questions that are key to the future of high-energy physics (HEP). The process involves all CERN Member States and Associate Member States, with the goal of developing a roadmap for the field for many years to come. The CERN Council asked that the newly updated ESPP should take into account the status of implementation of the 2020 ESPP, recent accomplishments at the LHC and elsewhere, progress in the construction of the High-Luminosity LHC (HL-LHC), the outcome of the Future Circular Collider (FCC) Feasibility Study, recent technological developments in accelerator, detector and computing technology, and the international landscape of the field. Scientific inputs were requested from across the community.
On behalf of the CERN Council, I would like to thank the high-energy community for understanding that this is a critical time for our field and participating very actively. Throughout this time, the various national groups have held a large number of meetings to debate which would be the best accelerator to be hosted at CERN after the HL-LHC. They also discussed and proposed alternative options as requested by the CERN Council, which followed the process closely.
By June 2025 we were delighted to hear from the ESPP secretariat that the participation of the community had been overwhelming and that a very large number of proposals had been submitted (CERN Courier May/June 2025 p8). These submissions show a broad consensus that CERN should be maintained as the global centre for collider physics through the realisation of a new flagship project. Europe’s strategy should be ambitious, innovative and forward looking. An overwhelming majority of the communities from CERN Member States express their strong support for the FCC programme, starting with an electron–positron collider (FCC-ee) as a first stage. Their strong support is largely based on its superb physics potential and its long-term prospects, given the potential to explore the energy frontier with a hadron collider (FCC-hh) following a precision era at FCC-ee.
CERN’s future flagship collider – Member State preferences
Based on an unofficial analysis by CERN Courier of national submissions to the 2026 update to the European strategy for particle physics. Each national submission is accorded equal weight, with that weight divided equally when multiple options are specified. With the deadline for national submissions passing before Slovenia acceded as CERN’s 25th Member State, 24 national submissions are included. These data are not endorsed by the authors, the CERN Council, the strategy secretariat or CERN management.
This strategy coherently develops the vision of ESPP 2020, which recommended to the CERN Council that an electron–positron Higgs factory be the highest-priority next collider. The 2020 ESPP update further recommended that Europe, together with its international partners, should investigate the technical and financial feasibility of a future hadron collider at CERN with a centre-of-mass energy of at least 100 TeV and with an electron–positron Higgs and electroweak factory as a possible first stage, and that such a feasibility study of the colliders and related infrastructure should be established as a global endeavour and be completed on the timescale of the next strategy update.
Based on ESPP 2020, the CERN Council mandated the CERN management to undertake a feasibility study for the FCC and approved an initial budget of CHF 100 million over a five-year period. Throughout the past five years, the FCC feasibility study was carried out by CERN management under the oversight of the CERN Council. The Council heard presentations on its progress at every session and carefully scrutinised a very successful mid-term review (CERN Courier March/April 2024 p25). The FCC collaboration completed the feasibility study ahead of schedule and summarised the results in a three-volume report released in March 2025 (CERN Courier May/June 2025 p8). The results are currently under review by panels that will scrutinise both the scientific aspects of the project and its budget estimates. The project will be presented to the Scientific Policy and Finance committees in September 2025 and to the CERN Council in November 2025.
It is rewarding to see that the scientific opinion of the community is in sync with ESPP 2020, the decision of the CERN Council to initiate the FCC feasibility study, and the efforts of CERN management to steer and complete it. This is a sign of the strength of the HEP community. While respecting a healthy diversity of opinion, a clear consensus has emerged across the community that the FCC is the highest priority project.
Crucially, however, the CERN Council requested that the community provide not only the scientifically most attractive option, but also hierarchically ordered alternative options. Specifically, the Council requested that the strategy update should include the preferred option for the next collider at CERN and prioritised alternative options to be pursued if the chosen preferred plan turns out not to be feasible or competitive. No consensus has yet been reached here; however, two projects have the required readiness to be candidates for alternative programmes: the Linear Collider Facility (LCF, 250 GeV) and the Compact Linear Collider (CLIC, 380 GeV), with additional R&D required in the latter case. A third proposal, LEP3, also requires further study, but could be a promising candidate for a Higgs factory in the existing LEP/LHC tunnel, albeit at a significantly reduced luminosity relative to FCC-ee.
The R&D for several of these projects has been supported by CERN for a long time. Research on linear colliders has been an active programme for the past 30 years and has received significant support, not only ensuring their readiness for consideration as future HEP facilities, but also sparking an exceptional R&D programme in the applications of fundamental research, for example in accelerators for cancer treatment (CERN Courier July/August 2024 p46). Over the past five years, CERN has also invested in muon colliders and hosts the International Muon Collider Collaboration. CERN also leads research into the application of plasma-wakefield acceleration for fundamental physics, having supported the AWAKE experiment for 10 years now (CERN Courier May/June 2024 p25).
The next milestone for updating the ESPP is 14 November: the deadline for submission of the final national inputs. The final drafting session of the strategy update will then take place from 1 to 5 December 2025 at Monte Verità Ascona, where the community recommendations will be finalised. These will be presented to the CERN Council in March 2026 and discussed at a dedicated meeting of the CERN Council in May 2026 in Budapest.
Meanwhile, a key milestone for community deliberations recently passed. The full spectrum of community inputs was presented and debated at an Open Symposium held in Venice in June. As strategy secretary Karl Jakobs reports on the following pages, the symposium was a smashing success with lively discussions and broad participation from our community. On behalf of Council, I would like to convey my sincere thanks to the Italian delegation for the superb organisation of the symposium.
Costas Fountas has served as president of the CERN Council since his appointment in January this year, and as the Greek scientific delegate to the Council since 2016. A professor of physics at the University of Ioannina and a longstanding member of the CMS collaboration, he previously served as vice-president of the Council from 2022 to 2024.
Venice symposium debates decades of collider strategy
Strategy secretary Karl Jakobs reports from a vibrant Open Symposium in Venice.
The Open Symposium of the European Strategy for Particle Physics (ESPP) brought together more than 600 physicists from almost 40 countries in Venice, Italy, from 23 to 27 June, to debate the future of European particle physics. The focus was the discussion of the next large-scale accelerator project at CERN to follow the HL-LHC, which is scheduled to operate until the end of 2041. The strategy update should – according to the remit defined by the CERN Council – define a preferred option for the next collider and prioritised alternative options to be pursued if the preferred plan turns out not to be feasible or competitive. In addition, the strategy update should indicate areas of priority for exploration complementary to colliders and other experiments to be considered at CERN and at other European laboratories, as well as for participation in projects outside Europe.
The Open Symposium is an important step in the strategy process. The aim was to involve the full community in discussions of the 266 scientific contributions that had been submitted to the ESPP process before the symposium (CERN Courier May/June 2025 p8).
In the opening session of the symposium CERN Director-General Fabiola Gianotti summarised the impressive achievements of the CERN community in the implementation of the recommendations from the 2020 update to the ESPP. Eric Laenen (Nikhef) stressed that the outstanding questions in particle physics require a broad and diverse experimental programme, including the HL-LHC, a new flagship collider, and a wide variety of other experiments including those in neighbouring fields. A broad consensus emerged that a future collider programme should be realised that can fully leverage both precision and energy, covering the widest range of observables at different energy scales. To match experimental precision, significant progress on the theoretical side is also required, in particular regarding higher-order calculations.
An important part of the symposium was devoted to presentations of possible future large-scale accelerator projects. Detailed presentations were given on the FCC-ee and FCC-hh colliders, either within the integrated FCC programme or proceeding directly to FCC-hh as a standalone realisation at an earlier time. Linear colliders were presented as alternative options, with both a Linear Collider Facility (LCF) based on the design of the International Linear Collider (ILC) and CLIC considered. In addition, smaller collider options based on re-using the LEP/LHC tunnel were presented. A first proposal, LEP3, envisages electron–positron collisions at centre-of-mass energies of up to 230 GeV, while a second, the LHeC, would realise electron–proton collisions at one interaction point of the LHC. The LHeC would require the construction of an additional new energy-recovery linac for the acceleration of electrons.
Moving focus from the precision frontier to the energy frontier, several ways to reach the 10 TeV “parton scale” were presented. (Comparisons between the energy reach of hadron and lepton colliders must discuss parton–parton centre-of-mass energies, where partons refer to the pointlike constituents of hadrons, as only a fraction of the energy of collisions between composite particles can be used to probe the existence of new particles and fields.) If FCC-ee is realised, a natural path is to proceed with proton–proton collisions at centre-of-mass energies in the range of 85 to 120 TeV, depending on the available high-field magnet technology. As an alternative, a muon collider could provide a path towards high-energy lepton collisions; however, demonstrations of how to address the significant technological challenges, such as six-dimensional cooling in transverse and longitudinal phase space, and other items associated with the various acceleration steps, have yet to be achieved. Likewise, plasma-based acceleration techniques for electrons and positrons capable of exceeding the 1 TeV energy scale are yet to be demonstrated.
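The point in the parenthesis can be made quantitative with a standard kinematic relation (added here for orientation): if the colliding partons carry momentum fractions x₁ and x₂ of their parent protons, the parton–parton centre-of-mass energy is

\[
\sqrt{\hat{s}} \;=\; \sqrt{x_1 x_2\, s},
\]

where √s is the proton–proton collision energy. Since typically x₁x₂ ≪ 1, a hadron collider needs a considerably larger √s than a lepton collider to probe the same 10 TeV parton scale.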
The symposium was organised to foster strong engagement by the community in discussion sessions. Six physics topics – electroweak physics, strong interactions, flavour physics, physics beyond the Standard Model, neutrino physics and cosmic messengers, and dark matter and the dark sector – as well as the three technology areas of accelerators, detectors and computing, were summarised in rapporteur talks, followed by 45-minute discussions in which the participants in Venice engaged strongly.
For the study of precision Higgs measurements, the performance of all the considered electron–positron (e+e–) colliders is comparable. While sub-percent precision can be reached in several measurements of Higgs couplings to fermions and bosons, HL-LHC measurements would prevail for rare processes. On the determination of the important Higgs-boson (H) self-coupling, the precision obtained at the HL-LHC will prevail until either e+e– linear colliders can improve on it through direct HH production measurements at collision energies above 500 GeV, or until precisions at the level of a few percent can be reached at FCC-hh or a muon collider. It was further stressed that precision measurements in Higgs, electroweak (Z, W, top) and flavour physics constitute three facets of indirect discovery, and that their synergy is essential to maximise the discovery potential of future colliders. Due to its high luminosity at low energies and its four experiments, the FCC-ee shows superior physics performance in the electroweak programme.
In flavour physics, a lot of progress will be achieved in the coming decade by the LHCb and Belle II experiments. While the tera-Z production at a future FCC-ee would provide a major step forward, the giga-Z data samples available at linear colliders do not seem to be a good option for flavour physics. The FCC-ee and LHeC would also achieve high precision on QCD measurements, leading, for example, to a per-mille level determination of the strong coupling constant αs. The important investigations of the quark–gluon plasma at the HL-LHC could be continued, in parallel with e+e– collider operation at CERN, in the SPS fixed-target programme, before FCC-hh would eventually allow for novel studies in the high-temperature QCD domain.
Keeping diversity in the particle-physics programme was also felt to be essential: the next collider project should not come at the expense of a diverse scientific programme in Europe. Given that we do not know where new physics will show up, ensuring a diverse and comprehensive physics programme is vital, including fixed-target, neutrino, flavour, astroparticle and nuclear-physics experiments. Experiments in these areas have the potential for groundbreaking discoveries.
At the technology frontier, essential work on accelerator R&D, such as on high-field and high-temperature superconducting magnets and RF systems, remains a high priority and appropriate investments must be made. R&D on advanced acceleration concepts should continue with adequate effort to prepare future projects. In the detector area, the establishment of the Detector Research & Development (DRD) collaborations, a result of the implementation of the recommendations of the 2020 ESPP update, was considered to provide a solid basis for tackling the challenges related to the development of high-performing detectors for future colliders and beyond. It is also expected that the software and computing challenges for future colliders can be mastered, provided that adequate person power and funding are available and adaptations to new technologies, in particular GPUs, AI and – on a longer timescale – quantum computing, can be made.
The discussions in Venice revealed a community united in its desire for a future flagship collider at CERN. Over the past years, very significant progress has been made in this direction, and the discussions on the prioritisation of collider options will continue over the next months. In addition to the FCC-ee, linear colliders (LCF, CLIC) present mature options for a Higgs factory at CERN. LEP3 and LHeC could alternatively be considered as intermediate collider projects, followed by a larger accelerator capable of exploring the 10 TeV parton scale.
The differences in the physics potential between the various collider options will be documented in the Physics Briefing Book that will be released by the Physics Preparatory Group by the end of September. In parallel, the technical readiness, risks, timescales and costs will be reviewed by the European Strategy Group (ESG). Alongside the final national inputs, these assessments will provide the foundation for the final recommendations to be drafted by the ESG in early December 2025.
Karl Jakobs is the secretary of the 2026 update to the European strategy for particle physics. A professor at the University of Freiburg, Jakobs served as spokesperson of the ATLAS collaboration from 2017 to 2021 and as chairman of the European Committee for Future Accelerators from 2021 to 2023.
The world of particle physics was revolutionised in November 1974 by the discovery of the J/ψ particle. At the time, most of the elements of the Standard Model of particle physics had already been formulated, but only a limited set of fundamental fermions were confidently believed to exist: the electron and muon, their associated neutrinos, and the up, down and strange quarks that were thought to make up the strongly interacting particles known at that time. The J/ψ proved to be a charm–anticharm bound state, vindicating the existence of a quark flavour first hypothesised by Sheldon Glashow and James Bjorken in 1964 (CERN Courier January/February 2025 p35). Its discovery eliminated any lingering doubts regarding the quark model of 1964 (see “Nineteen sixty-four“) and sparked the development of the Standard Model into its modern form.
This new “charmonium” state was the first example of quarkonium: a heavy quark bound to an antiquark of the same flavour. It was named by analogy to positronium, a bound state of an electron and a positron, which decays by mutual annihilation into two or three photons. Composed of unstable quarks, bound by gluons rather than photons, and decaying mainly via the annihilation of their constituent quarks, quarkonia have fascinated particle physicists ever since.
The charmonium interpretation of the J/ψ was cemented by the subsequent discovery of a spectrum of related cc̄ states, and ultimately by the observation of charmed particles in 1976. The discovery of charmonium was followed in 1977 by the identification of bottomonium mesons and particles containing bottom quarks. While toponium – a bound state of a top quark and antiquark – was predicted in principle, most physicists thought that its observation would have to wait for the innate precision of a next-generation e+e– collider following the LHC, in view of the top quark’s large mass and exceptionally rapid decay, more than 10¹² times quicker than that of the bottom quark. The complex environment at a hadron collider, where the composite nature of protons precludes knowledge of the initial collision energy of pairs of colliding partons within them, would make toponium particularly difficult to identify at the LHC.
However, in the second half of 2024, the CMS collaboration reported an enhancement near the threshold for tt̄ production at the LHC, which is now most plausibly interpreted as the lowest-lying toponium state. The existence of this enhancement has recently been corroborated by the ATLAS collaboration (see “ATLAS confirms top–antitop excess”).
Here are the personal memories of an eyewitness who followed these 50 years of quarkonium discoveries firsthand.
Strangeonium?
In hindsight, the quarkonium story can be thought to have begun in 1963 with the discovery of the φ meson. The φ was an unexpectedly stable and narrow resonance, decaying mainly into kaons rather than the relatively light pions, despite lying only just above the KK̄ threshold. Heavier quarkonia cannot decay into a pair of mesons containing single heavy quarks, as their masses lie below the energy threshold for such “open flavour” decays.
The preference of the φ to decay into kaons was soon interpreted by Susumu Okubo as a consequence of approximate SU(3) flavour symmetry, developing mathematical ideas based on unitary 3 × 3 matrices with determinant one. At the beginning of 1964, quarks were proposed and George Zweig suggested that the φ was a bound state of a strange quark and a strange antiquark (or aces, as he termed them). After 1974, the portmanteau word “strangeonium” was retrospectively applied to the φ and similar heavier ss̄ bound states, but the name has never really caught on.
In the year or so prior to the discovery of the J/ψ in November 1974, there was much speculation about data from the Cambridge Electron Accelerator (CEA) at Harvard and the Stanford Positron–Electron Asymmetric Ring (SPEAR) at SLAC. Data from these e+e– colliders indicated a rise in the ratio, R, of cross-sections for hadron and μ+μ– production (see “Why is R rising?” figure). Was this a failure of the parton model that had only recently found acceptance as a model for the apparently scale-invariant internal structure of hadrons observed in deep-inelastic scattering experiments? Did partons indeed have internal structure? Or were there “new” partons that had not been seen previously, such as charm or coloured quarks? I was asked on several occasions to review the dozens of theoretical suggestions on the market, including at the ICHEP conference in the summer of 1974. In preparation, I toted a large Migros shopping bag filled with dozens of theoretical papers around Europe. Playing the part of an objective reviewer, I did not come out strongly in favour of any specific interpretation, however, during talks that autumn in Copenhagen and Dublin, I finally spoke out in favour of charm as the best-motivated explanation of the increase in R.
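The quantity at stake can be summarised in one line (the standard leading-order parton-model expression, added for context):

\[
R \;=\; \frac{\sigma(e^+e^- \to \mathrm{hadrons})}{\sigma(e^+e^- \to \mu^+\mu^-)} \;=\; N_c \sum_q Q_q^2,
\]

summed over the quark flavours light enough to be produced. With three colours and only the up, down and strange quarks this gives R = 2, while opening the charm threshold raises it to 10/3 – one reason why a rising R was a natural hint of a new quark.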
November revolution
Then, on 11 November 1974, the news broke that two experimental groups, one working at BNL under the leadership of Sam Ting and the other at SLAC led by Burt Richter, had discovered, in parallel, the narrow vector boson that bears the composite name J/ψ (see “Charmonium” figure). The worldwide particle-physics community went into convulsions (CERN Courier November/December 2024 p41) – and the CERN Theory Division was no exception. We held informal midnight discussion sessions around an open-mic phone with Fred Gilman in the SLAC theory group, who generously shared with us the latest J/ψ news. Away from the phone, like many groups around the world, we debated the merits and demerits of many different theoretical ideas. Rather than write a plethora of rival papers about these ideas, we decided to bundle our thoughts into a collective preprint. Instead of taking individual responsibility for our trivial thoughts, the preprint was anonymous, the place of the authors’ names being taken by a mysterious “CERN Theory Boson Workshop”. Eagle eyes will spot that the equations were handwritten by Mary K Gaillard (CERN Courier July/August 2025 p47). Informally, we called ourselves Co-Co, for communication collective. With “no pretentions to originality or priority,” we explored five hypotheses: a hidden charm vector meson, a coloured vector meson, an intermediate vector boson, a Higgs meson and narrow resonances in strong interactions.
My immediate instinct was to advocate the charmonium interpretation of the J/ψ, and this was the first interpretation to be described in our paper. This was on the basis of the Glashow–Iliopoulos–Maiani (GIM) mechanism, which accounted for the observed suppression of flavour-changing neutral currents by postulating the existence of a charm quark with a mass around 2 GeV (see CERN Courier July/August 2024 p30), and the Zweig rule, which suggested phenomenologically that quarkonia do not easily decay by quark–antiquark annihilation via gluons into other flavours of quarks. So I was somewhat surprised when one of the authors of the GIM paper wrote a paper proposing that it might be an intermediate electroweak vector boson. A few days after the J/ψ discovery came news of the (almost equally narrow) ψ′ discovery, which reached me as I was walking along the theory corridor to my office one morning. My informant was a senior theorist who was convinced that this discovery would kill the charmonium interpretation of the J/ψ. However, before I reached my office I realised that an extension of the Zweig rule would also suppress ψ′ → J/ψ + light-meson decays, so the ψ′ could also be narrow.
Keen competition
The charmonium interpretation of the J/ψ and ψ′ states predicted that there should be intermediate P-wave states (with one unit of orbital angular momentum) that could be detected in radiative decays of the ψ′. In the first half of 1975 there was keen competition between teams at SLAC and DESY to discover these states. That summer I was visiting SLAC, where I discovered one day under the cover of a copying machine, before their discovery was announced, a sheet of paper with plots showing clear evidence for the P-wave states. I made a copy, went to Burt Richter’s office and handed him the sheet of paper. I also asked whether he wanted my copy. He graciously allowed me to keep it, as long as I kept quiet about it, which I did until the discovery was officially announced a few weeks later.
Discussion about the interpretation of the new particles, in particular between advocates of charm and Han–Nambu coloured quarks – a different way to explain the new particles’ astounding stability by giving them a new quantum number – rumbled on for a couple of years until the discovery of charmed particles in 1976. During this period we conducted some debates in the main CERN auditorium moderated by John Bell. I remember one such debate in particular, during which a distinguished senior British theorist spoke for coloured quarks and I spoke for charm. I was somewhat taken aback when he described me as representing the “establishment”, as I was under 30 at the time.
Over the following year, my attention wandered to grand unified theories, and my first paper on the subject, with Michael Chanowitz and Mary K Gaillard, was completed in May 1977. We realised while writing this paper that simple grand unified theories – which unify the electroweak and strong interactions – would relate the mass of the τ heavy lepton that had been discovered in 1975 to the mass of the bottom quark, which was confidently expected but whose mass was unknown. Our prediction was mb/mτ = 2 to 5, but we did not include it in the abstract. Shortly afterwards, while our paper was in proof, the discovery of the ϒ state (or states) by a group at Fermilab led by Leon Lederman (see “Bottomonium” figure) became known, implying that mb ~ 4.5 GeV. I added our successful mass prediction by hand in the margin of the corrected proof. Unfortunately, the journal misunderstood my handwriting and printed our prediction as mb/mτ = 2605, a spectacularly inaccurate postdiction! It remains to be seen whether the idea of a grand unified theory is correct: it also successfully predicted the electroweak mixing angle θW and suggested that neutrinos might have mass, but direct evidence, such as the decay of the proton, has yet to be found.
Peak performance
Meanwhile, buoyed by the success of our prediction for mb, Mary K Gaillard, Dimitri Nanopoulos, Serge Rudaz and I set to work on a paper about the phenomenology of the top and bottom quarks. One of our predictions was that the first two excited states of the ϒ, the ϒ′ and ϒ′′, should be detectable by the Lederman experiment because the Zweig rule would suppress their cascade decays to lighter bottomonia via light-meson emission. Indeed, the Lederman experiment found that the ϒ bump was broader than the experimental resolution, and the bump was eventually resolved into three bottomonium peaks.
It was in the same paper that we introduced the terminology of “penguin diagrams”, wherein a quark bound in a hadron changes flavour not at tree level via W-boson exchange but via a loop containing heavy particles (like W bosons or top quarks), emitting a gluon, photon or Z boson. Similar diagrams had been discussed by the ITEP theoretical school in Moscow, in connection with K decays, and we realised that they would be important in B-hadron decays. I took an evening off to go to a bar in the Old Town of Geneva, where I got involved in a game of darts with the experimental physicist Melissa Franklin. She bet me that if I lost the game I had to include the word “penguin” in my next paper. Melissa abandoned the darts game before the end, and was replaced by Serge Rudaz, who beat me. I still felt obligated to carry out the conditions of the bet, but for some time it was not clear to me how to get the word into the b-quark paper that we were writing at the time. Then, another evening, after working at CERN, I stopped to visit some friends on my way back to my apartment, where I inhaled some (at that time) illegal substance. Later, when I got home and continued working on our paper, I had a sudden inspiration that the famous Russian diagrams look like penguins. So we put the word into our paper, and it has now appeared in almost 10,000 papers.
What of toponium, the last remaining frontier in the world of quarkonia? In the early 1980s there were no experimental indications as to how heavy the top quark might be, and there were hopes that it might be within the range of existing or planned e+e– colliders such as PETRA, TRISTAN and LEP. When the LEP experimental programme was being devised, I was involved in setting “examination questions” for candidate experimental designs that included asking how well they could measure the properties of toponium. In parallel, the first theoretical papers on the formalism for toponium production in e+e– and hadron–hadron collisions appeared.
But the top quark did not appear until the mid-1990s at the Tevatron proton–antiproton collider at Fermilab, with a mass around 175 GeV, implying that toponium measurements would require an e+e– collider with an energy much greater than LEP, around 350 GeV. Many theoretical studies were made of the cross section in the neighbourhood of the e+e– → tt threshold, and of how precisely the top-quark mass and its electroweak and Higgs couplings could be measured.
Meanwhile, a smaller number of theorists were calculating the possible toponium signal at the LHC, and the LHC experiments ATLAS and CMS started measuring tt production with high statistics. CMS and ATLAS embarked on programmes to search for quantum-mechanical correlations in the final-state decay products of the top quarks and antiquarks, as should occur if the tt state were to be produced in a specific spin-parity state. They both found decay correlations characteristic of tt production in a pseudoscalar state: it was the first time such a quantum correlation had been observed at such high energies.
The CMS collaboration used these studies to improve the sensitivities of dedicated searches they were making for possible heavy Higgs bosons decaying into tt final states, as would be expected in many extensions of the Standard Model. Intriguingly, hints of a possible excess of events around the tt threshold with the type of correlation expected from a pseudoscalar tt state began to emerge in the CMS data, but initially not with high significance.
Pseudoscalar states
I first heard about this excess at an Asia–CERN physics school in Thailand, and started wondering whether it could be due to the lowest-lying toponium state, which would decay predominantly through the decays of its unstable top quark and antiquark rather than via their annihilation, or to a heavy pseudoscalar Higgs boson, and how one might distinguish between these hypotheses. A few years previously, Abdelhak Djouadi, Andrei Popov, Jérémie Quevillon and I had studied in detail the possible signatures of heavy Higgs bosons in tt final states at the LHC, and shown that they would have significant interference effects that would generate dips in the cross section as well as bumps.
The significance of the CMS signal subsequently increased to over 5σ, showing up in a tailored search for new pseudoscalar states decaying into tt pairs with specific spin correlations, and recently this CMS discovery has been confirmed by the ATLAS Collaboration, with a significance over 7σ. Unfortunately, the experimental resolution in the tt invariant mass is not precise enough to see any dip due to pseudoscalar Higgs production, and Djouadi, Quevillon and I have concluded that it is not yet possible to discriminate between the toponium and Higgs hypotheses on purely experimental grounds.
However, despite being a fan of extra Higgs bosons, I have to concede that toponium is the more plausible interpretation of the CMS threshold excess. The mass is consistent with that expected for toponium, the signal strength is consistent with theoretical calculations in QCD, and the tt spin correlations are just what one expects for the lowest-lying pseudoscalar toponium state that would be produced in gluon–gluon collisions.
Caution is still in order. The pseudoscalar Higgs hypothesis cannot (yet) be excluded. Nevertheless, it would be a wonderful golden anniversary present for quarkonium if, some 50 years after the discovery of the J/ψ, the appearance of its last, most massive sibling were to be confirmed.
Toponium will be a very interesting target for future e+e– colliders, which will be able to determine its properties with much greater accuracy than a hadron collider could achieve, making precise measurements of the mass of the top quark and its electroweak couplings possible. The quarkonium saga is far from over.
In 2009, the JADE experiment had been out of operation for 23 years. The PETRA electron–positron collider that served it had already completed a second life as a pre-accelerator for the HERA electron–proton collider and was preparing for a third life as an X-ray source. JADE and the other PETRA experiments were a piece of physics history, well known for seminal measurements of three-jet quark–antiquark–gluon events, and for early studies of quark fragmentation and jet hadronisation. But two decades after the experiment was decommissioned, the JADE collaboration had yet to publish one of its signature measurements.
At high energies and short distances, the strong force becomes weaker. Quarks behave almost like free particles. This “asymptotic freedom” is a unique hallmark of QCD. In 2009, as now, JADE’s electron–positron data was unique in the low-energy range, with other data sets lost to history. When the data were reprocessed with modern next-to-next-to-leading-order QCD calculations and improved simulation tools, the DESY experiment was able to rival experiments at CERN’s higher-energy Large Electron–Positron (LEP) collider for precision on the strong coupling constant, contributing to a stunning proof of QCD’s most fundamental behaviour. The key was a farsighted and original initiative by Siggi Bethke to preserve JADE’s data and analysis software.
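For readers who want a feel for the numbers, the behaviour that the JADE reanalysis helped to pin down can be sketched with the standard one-loop expression for the running coupling, αs(Q²) = 12π/[(33 − 2nf) ln(Q²/Λ²)]. The short Python sketch below is purely illustrative: the values of Λ and nf are assumptions chosen for the example, not parameters taken from the JADE or LEP analyses.

import math

def alpha_s_one_loop(Q, Lambda_QCD=0.2, n_f=5):
    """One-loop running strong coupling at scale Q (in GeV).
    alpha_s(Q^2) = 12*pi / ((33 - 2*n_f) * ln(Q^2 / Lambda^2));
    Lambda_QCD and n_f are illustrative choices, not fitted values."""
    return 12 * math.pi / ((33 - 2 * n_f) * math.log(Q**2 / Lambda_QCD**2))

# Asymptotic freedom: the coupling shrinks as the energy scale grows.
for Q in (14.0, 35.0, 44.0, 91.2):  # GeV; PETRA ran at roughly 14-46 GeV, LEP near 91 GeV
    print(f"Q = {Q:5.1f} GeV  ->  alpha_s ~ {alpha_s_one_loop(Q):.3f}")

Running it shows the coupling falling from roughly 0.19 at PETRA energies to roughly 0.13 at the Z pole, the qualitative trend that the combined JADE and LEP measurements established with real data.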
New perspectives
This data resurrection from JADE showed how data can be reinterpreted to give new perspectives decades after an experiment ends. It was a timely demonstration. In 2009, HERA and SLAC’s PEP-II electron–positron collider had been recently decommissioned, and Fermilab’s Tevatron proton–antiproton collider was approaching the end of its operations. Each facility nevertheless had a strong analysis programme ahead, and CERN’s Large Hadron Collider (LHC) was preparing for its first collisions. How could all this data be preserved?
The uniqueness of these programmes, for which no upgrade or follow-up was planned for the coming decades, invited the consideration of data usability at horizons well beyond a few years. A few host labs risked a small investment, with dedicated data-preservation projects beginning, for example, at SLAC, DESY, Fermilab, IHEP and CERN (see “Data preservation” dashboard). To exchange data-preservation concepts, methodologies and policies, and to ensure the long-term preservation of HEP data, the Data Preservation in High Energy Physics (DPHEP) group was created in 2014. DPHEP is a global initiative under the supervision of the International Committee for Future Accelerators (ICFA), with strong support from CERN from the beginning. It actively welcomes new collaborators and new partner experiments, to ensure a vibrant and long-term future for the precious data sets being collected at present and future colliders.
At the beginning of our efforts, DPHEP designed a four-level classification of data abstraction. Level 1 corresponds to the information typically found in a scientific publication or its associated HEPData entry (a public repository for high-energy physics data tables). Level 4 includes all inputs necessary to fully reprocess the original data and simulate the experiment from scratch.
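As a concrete illustration of what Level 1 access looks like in practice, the tables behind a publication can be fetched programmatically from HEPData, which can serve records as JSON. The Python sketch below is a minimal example under that assumption; the INSPIRE identifier is a placeholder rather than a reference to any particular measurement, and the returned structure is inspected rather than assumed.

import requests

# Level 1 in practice: fetch the publication-level tables of a HEPData record.
# "ins1234567" is a placeholder INSPIRE identifier, not a specific measurement.
url = "https://www.hepdata.net/record/ins1234567"
response = requests.get(url, params={"format": "json"}, timeout=30)
response.raise_for_status()
record = response.json()

# Inspect the record rather than assuming its schema: list the top-level keys
# to see which metadata and data tables the repository attaches to the entry.
print(sorted(record.keys()))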
The concept of data preservation had to be extended too. Simply storing data and freezing software is bound to fail as operating systems evolve and analysis knowledge disappears. A sensible preservation process must begin early on, while the experiments are still active, and take into account the research goals and available resources. Long-term collaboration organisation plays a crucial role, as data cannot be preserved without stable resources. Software must adapt to rapidly changing computing infrastructure to ensure that the data remains accessible in the long term.
Return on investment
But how much research gain could be expected for a reasonable investment in data preservation? We conservatively estimate that, for a dedicated investment of less than 1% of a facility’s construction cost, the scientific output increases by 10% or more. Publication records confirm that scientific outputs at major experimental facilities continue long after the end of operations (see “Publications per year, during and after data taking” panel). Publication rates remain substantial well beyond the “canonical” five years after the end of data taking, particularly for experiments that pursued dedicated data-preservation programmes. For some experiments, the lifetime of the preservation system is by now comparable with the data-taking period, illustrating the need to define collaborations carefully for the long term.
The most striking example is BaBar, an electron–positron-collider experiment at SLAC that was designed to investigate the violation of charge–parity symmetry in the decays of B mesons, and which continues to publish using a preservation system now hosted outside the original experiment site. Ageing infrastructure is now presenting challenges, raising questions about the very-long-term hosting of historical experiments – “preservation 2.0” – or the definitive end of the programme. The other historical B factory, Belle, benefits from a follow-up experiment on site.
Publications per year, during and after data taking
The publication record at experiments associated with the DPHEP initiative. Data-taking periods of the relevant facilities are shaded, and the fraction of peer-reviewed articles published afterwards is indicated as a percentage for facilities that are not still operational. The data, which exclude conference proceedings, were extracted from Inspire-HEP on 31 July 2025.
HERA, which collided both electrons and positrons with protons and was designed to study deep inelastic scattering (DIS) and the structure of the proton, continues to publish and even to attract new collaborators as the community prepares for the Electron–Ion Collider (EIC) at BNL, nicely demonstrating the relevance of data preservation for future programmes. The EIC will continue studies of DIS in the regime of gluon saturation (CERN Courier January/February 2025 p31), with polarised beams exploring nucleon spin and a range of nuclear targets. The use of new machine-learning algorithms on the preserved HERA data has even allowed aspects of the EIC physics case to be explored: an example of those “treasures” not foreseen at the end of collisions.
IHEP in China conducts a vigorous data-preservation programme around BESIII data from electron–positron collisions in the BEPCII charm factory. The collaboration is considering the use of artificial intelligence both to rank data priorities and to support users in reusing the data.
Remarkably, LEP experiments are still publishing physics analyses with archived ALEPH data almost 25 years after the completion of the LEP programme on 4 November 2000. The revival of the CERNLIB collection of FORTRAN data-analysis software libraries has also enabled the resurrection of the legacy software stacks of both DELPHI and OPAL, including the spectacular revival of their event displays (see “Data resurrection” figure). The DELPHI collaboration revised their fairly restrictive data-access policy in early 2024, opening and publishing their data via CERN’s Open Data Portal.
Some LEP data is currently being migrated into the standardised EDM4hep (event data model) format that has been developed for future colliders. As well as testing the format with real events, this will ensure data preservation and support software development, analysis training and detector design for the electron–positron collider phase of the proposed Future Circular Collider.
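To give a flavour of what the migrated data look like to an analyser, EDM4hep files can be iterated over with the podio Python bindings distributed with the Key4hep software stack. The sketch below is a minimal example under that assumption; the file name and collection name are placeholders and not details of the actual LEP migration.

# Minimal sketch of reading an EDM4hep file with the podio Python bindings
# (Key4hep stack). The file and collection names are placeholders.
from podio.root_io import Reader

reader = Reader("lep_events_edm4hep.root")            # placeholder file name
for event in reader.get("events"):                    # "events" is the usual frame category
    particles = event.get("ReconstructedParticles")   # placeholder collection name
    print(f"{len(particles)} reconstructed particles in this event")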
The future is open
In the past 10 years, data preservation has grown in prominence in parallel with open science, which promotes free public access to publications, data and software in community-driven repositories, and according to the FAIR principles of findability, accessibility, interoperability and reusability. Together, data preservation and open science help maximise the benefits of fundamental research. Collaborations can fully exploit their data and share its unique benefits with the international community.
The two concepts are distinct but tightly linked. Data preservation focuses on maintaining data integrity and usability over time, whereas open data emphasises accessibility and sharing. They have in common the need for careful, properly resourced planning, with a crucial role played by the host laboratory.
Data preservation and open science both require clear policies and a proactive approach. Beginning at the very start of an experiment is essential. Clear guidelines on copyright, resource allocation for long-term storage, access strategies and maintenance must be established to address the challenges of data longevity. Last but not least, it is crucially important to design collaborations to ensure smooth international cooperation long after data taking has finished. By addressing these aspects, collaborations can create robust frameworks for preserving, managing and sharing scientific data effectively over the long term.
Today, most collaborations target the highest standard of data preservation (Level 4). Open-source software should be prioritised, because the uncontrolled obsolescence of commercial software endangers the entire data-preservation model. It is crucial to maintain all of the data and the software stack, which requires continuous effort to adapt older versions to evolving computing environments. This applies to both software and hardware infrastructures. Synergies between old and new experiments can provide valuable solutions, as demonstrated by HERA and the EIC, Belle and Belle II, and the Antares and KM3NeT neutrino telescopes.
From afterthought to forethought
In the past decade, data preservation has evolved from simply an afterthought as experiments wrapped up operations into a necessary specification for HEP experiments. Data preservation is now recognised as a source of cost-effective research. Progress has been rapid, but its implementation remains fragile and needs to be protected and planned.
The benefits will be significant. Signals not imagined during the experiments’ lifetime can be searched for. Data can be reanalysed in light of advances in theory and observations from other realms of fundamental science. Education, training and outreach can be brought to life by demonstrating classic measurements with real data. And scientific integrity is fully realised when results are fully reproducible.
The LHC, having surpassed an exabyte of data, now holds the largest scientific data set ever accumulated. The High-Luminosity LHC will increase this by an order of magnitude. When the programme comes to an end, it will likely be the last data at the energy frontier for decades. History suggests that 10% of the LHC’s scientific programme will not yet have been published when collisions end, and a further 10% not even imagined. While the community discusses its strategy for future colliders, it must therefore also bear in mind data preservation. It is the key to unearthing hidden treasures in the data of the past, present and future.