Voices from a new generation

Seventy years of CERN

In January 1962, CERN was moving for the first time from machine construction to scientific research with those machines. Director-General Victor Weisskopf took up the pen in the first CERN Courier to appear after a brief hiatus. “This institution is remarkable in two ways,” he wrote. “It is a place where the most fantastic experiments are carried out. It is a place where international co-operation actually exists.”

A new generation of early-career researchers (ECRs) shares his convictions. Now, as then, they do much of the heavy lifting that builds the future of the field. Now, as then, they need resilience and vision. As Weisskopf wrote in these pages, the everyday work of high-energy physics (HEP) can hide its real importance – its romantic glory, as the renowned theorist put it. “All our work is for an idealistic aim, for pure science without commercial or any other interests. Our effort is a symbol of what science really means.”

As CERN turns 70, the Courier now hands the pen to the field’s next generation of leaders. All are new post-docs. Each has already made a tangible contribution and earned recognition from their colleagues. All, in short, are among the most recent winners of the four big LHC collaborations’ thesis prizes. Each was offered carte blanche to write about a subject of their choosing, which they believe will be strategically crucial to the future of the field. Almost all responded. These are their viewpoints.

Invest in accelerator innovation

Nicole Hartman

I come from Dallas, Texas, so the Superconducting Super Collider should have been in my backyard as I was growing up. By the late 1990s, its 87 km ring could have delivered 20 TeV per proton beam. The Future Circular Collider could deliver 50 TeV per proton beam in a 91 km ring by the 2070s. I’d be retired before first collisions. Clearly, we need an intermediate-term project to keep expertise in our community. Among the options proposed so far, I’m most excited by linear electron–positron colliders, as they would offer sufficient energy to study the Higgs self-coupling via di-Higgs production. This could be decisive in understanding electroweak symmetry breaking and unveiling possible Higgs portals.

A paradigm shift for accelerators might achieve our physics goals without a collider’s cost scaling with its energy. A strong investment in collider R&D could therefore offer hope for my generation of scientists to push back the energy frontier. Muon colliders avoid synchrotron radiation. Plasma wakefields offer a 100-fold increase in electric field gradient. Though both represent enormous challenges, psychologists have noted an “end of history” phenomenon, whereby as humans we appreciate how much we have changed in the past, but underestimate how much we will change in the future. Reflecting on past physics breakthroughs galvanises my optimism: unlocking the next chapter of physics has always been within the reach of technological innovation. CERN has been a mecca for accelerator applications in the last 70 years. I’d argue that a strong increase in support for novel collider R&D is the best way to carry this legacy forwards.

Nicole Hartman is a post-doc at the Technical University of Munich and Origins Data Science Lab. She was awarded a PhD by Stanford University for her thesis “A search for non-resonant HH → 4b at √s = 13 TeV with the ATLAS detector – or – 2b, and then another 2b… now that’s the thesis question”.

Reward technical work with career opportunities

Alessandro Scarabotto

This job is a passion and a privilege, and ECRs devote nights and weekends to our research. But this energy should be handled in a more productive way. In particular, technical work on hardware and software is not valued and rewarded as it should be. ECRs who focus on technical aspects are often forced to divide their focus with theoretical work and data analysis, or suffer reduced opportunities to pursue an academic career. Is this correct? Why shouldn’t technical and scientific work be valued in the same way?

I am very hopeful for the future. In recent years, I have seen improvements in this direction, with many supervisors increasingly pushing their students towards technical work. I expect senior leadership to make organisational adjustments to reward and value these two aspects of research in exactly the same way. This cultural shift would greatly benefit our physics community by more efficiently transforming the enthusiasm and hard work of ECRs into skilled contributions to the field that are sustained over the decades.

Alessandro Scarabotto is a postdoctoral researcher at Technische Universität Dortmund. He was awarded a PhD by Sorbonne Université, Paris, for his thesis “Search for rare four-body charm decays with electrons in the final state and long track reconstruction for the LHCb trigger”.

A revolving door to industry

Christopher Brown

Big companies’ energy usage is currently skyrocketing to fuel their artificial intelligence (AI) systems. There is a clear business adaptation of my research on fast, energy-saving AI triggers, but I feel completely unable to make this happen. Why, as a field, are we unable to transfer our research to industry in an effective way?

While there are obvious milestones for taking data to publication, there is no equivalent for starting a business or getting our research into major industry players. Our collaborations are incubators for ideas and people. They should implement dedicated strategies to help ECRs obtain the funding, professional connections and business skills they need to get their ideas into the wider world. We should be presenting at industry conferences – both to offer solutions to industry and to obtain them for our own research – and industry sessions within our own conferences could bring links to every part of our field.

Most importantly, the field should encourage a revolving door between academia and industry to optimise the transfer of knowledge and skills. Unfortunately, when physicists leave for industry, slow, single-track physics career progressions and our focus on publication count rather than skills make a return unrealistic. There also needs to be a way of attracting talent from industry into physics without the requirement of a PhD so that experienced people can start or return to research in high-profile positions suitable for their level of work and life experience.

Christopher Brown is a CERN fellow working on next-generation triggers. He was awarded a PhD by Imperial College London for his thesis “Fast machine learning in the CMS Level-1 trigger for the High-Luminosity LHC”.

Collaboration, retention and support

Prajita Bhattarai

I feel a strong sense of agency regarding the future of our field. The upcoming High-Luminosity LHC (HL-LHC) will provide a wealth of data beyond what the LHC has offered, and we should be extremely excited about the increased discovery potential. Looking further ahead, I share the vision of a future Higgs factory as the next logical step for the field. The proposed Future Circular Collider is currently the most feasible option. However, the high cost and evolving geopolitical landscape are causes for concern. One of the greatest challenges we face is retaining talent and expertise. In the US, it has become increasingly difficult for researchers to find permanent positions after completing postdocs, leading to a loss of valuable technical and operational expertise. On a positive note, our field has made significant strides in providing opportunities for students from underrepresented nationalities and socioeconomic backgrounds – I am a beneficiary of these efforts. Still, I believe we should intensify our focus on supporting individuals as they transition through different career stages to ensure a vibrant and diverse future workforce.

Prajita Bhattarai is a research associate at SLAC National Accelerator Laboratory in the US. She was awarded her PhD by Brandeis University in the US for her thesis “Standard Model electroweak precision measurements with two Z bosons and two jets in ATLAS”.

Redesign collaborations for equitable opportunity

Spandan Mondal

Particle physics and cosmology capture the attention of nearly every inquisitive child. Though large collaborations and expensive machines have produced some of humankind’s most spectacular achievements, they have also made the field inaccessible to many young students. Making a meaningful contribution is contingent upon being associated with an institution or university that is a member of an experimental collaboration. One typically also has to study in a country that has a cooperation agreement with an international organisation like CERN.

If future experiments want to attract diverse talent, they should consider new collaborative models that allow participation irrespective of a person’s institution or country of origin. Scientific and financial responsibilities could be defined based on expertise and the research grants of individual research groups. Remote operations centres across the globe, such as those trialled by CERN experiments, could enable participants to fulfil their responsibilities without being constrained by international borders and travel budgets; the worldwide revolution in connectivity infrastructure could provide an opportunity to make this the norm rather than the exception. These measures could provide equitable opportunities to everyone while simultaneously maximising the scientific output of our field.

Spandan Mondal is a postdoctoral fellow at Brown University in the US. He was awarded a PhD by RWTH Aachen in Germany for his thesis on the CMS experiment “Charming decays of the Higgs, Z, and W bosons: development and deployment of a new calibration method for charm jet identification”.

Reward risk taking

Francesca Ercolessi

Young scientists often navigate complex career paths, where the pressure to produce consistent publishable results can stifle creativity and discourage risk taking. Traditionally, young researchers are evaluated almost solely on achieved results, often leading to a culture of risk aversion. To foster a culture of innovation we must shift our approach to research and evaluation. To encourage bold and innovative thinking among ECRs, the fuel of scientific progress, we need to broaden our definition of success. European funding and grants have made strides in recognising innovative ideas, but more is needed. Mentorship and peer-review systems must also evolve, creating an environment open to innovative thinking, with a calculated approach to risk, guided by experienced scientists. Concrete actions include establishing mentorship programmes during scientific events, such as workshops and conferences. To maximise the impact, these programmes should prioritise diversity among mentors and mentees, ensuring that a wide range of perspectives and experiences are shared. Equally important is recognising and rewarding innovation. This can be achieved by dedicated awards that value originality and potential impact over guaranteed success. Celebrating attempts, even failed ones, can shift the focus from the outcome to the process of discovery, inspiring a new generation of scientists to push the boundaries of knowledge.

Francesca Ercolessi is a post-doc at the University of Bologna. She was awarded a PhD by the University of Bologna for her thesis “The interplay of multiplicity and effective energy for (multi) strange hadron production in pp collisions at the LHC”.

Our employment model stifles creativity

Florian Jonas

ECR colleagues are deeply passionate about the science they do and wish to pursue a career in our field – “if possible”. Is there anything one can do to better support this new generation of physicists? In my opinion, we have to address the scarcity of permanent positions in our field. Short-term contracts lead to risk aversion, and short-term projects with a high chance of publication increase your employment prospects. This is in direct contrast to what is needed to successfully complete ambitious future projects this century – projects that require innovation and out-of-the-box thinking by bright young minds.

In addition, employment in fundamental science is more than ever in direct competition with permanent jobs in industry. For example, machine learning and computing experts drive innovation in our field with novel analysis techniques, but ultimately end up leaving to apply their skills in permanent employment elsewhere. If we want to keep talent in our field we must create a funding structure that allows realistic prospects for long-term employment and commitment to future projects.

Florian Jonas is a postdoctoral scholar at UC Berkeley and LBNL. He was awarded a PhD by the University of Münster for his thesis on the ALICE experiment “Probing the initial state of heavy-ion collisions with isolated prompt photons”.

Embrace private expertise and investment

Jona Motta

The two great challenges of our time are data taking and data analysis. Rare processes like the production of Higgs-boson pairs have cross sections 10 orders of magnitude smaller than their backgrounds – and during HL-LHC operation the CMS trigger will have to analyse about 50 TB/s and take decisions with a latency of 12.5 μs. In recent years, we have made big steps forward with machine learning, but our techniques are not always up to speed with the current state of the art in the private sector.

To sustain and accelerate our progress, the HEP community must be more open to new sources of funding, particularly from private investments. Collaborations with tech companies and private investors can provide not only financial support but also access to advanced technologies and expertise. Encouraging CERN–private partnerships can lead to the development of innovative tools and infrastructure, driving the field forward.

The recent establishment of the Next Generation Trigger Project, funded by the Eric and Wendy Schmidt Fund for Strategic Innovation, represents the first step toward this kind of collaboration. Thanks to overlapping R&D interests, this could be scaled up to direct partnerships with companies to introduce large and sustained streams of funds. This would not only push the boundaries of our knowledge but also inspire and support the next generation of physicists, opening new tenured positions thanks to private funding.

Jona Motta is a post-doc at Universität Zürich. He was awarded a PhD by Institut Polytechnique de Paris for his thesis “Development of machine learning based τ trigger algorithms and search for Higgs boson pair production in the bbττ decay channel with the CMS detector at the LHC”.

Stability would stop the brain drain

Hassnae El Jarrari

The proposed Future Circular Collider presents a formidable challenge. Every aspect of its design, construction, commissioning and operations would require extensive R&D to achieve the needed performance and stability, and fully exploit the machine’s potential. The vast experience acquired at the LHC will play a significant role. Knowledge must be preserved and transmitted between generations. But the loss of expertise is already a significant problem at the LHC.

The main reason for young scientists to leave the field is the lack of institutional support: it’s hard to count on a stable working environment, regardless of our expertise and performance. The difficulty in finding permanent academic or research positions and the lack of recognition and advancement are all viewed as serious obstacles to pursuing a career in HEP. In these conditions, a young physicist might find competitive sectors such as industry or finance more appealing given the highly stable future they offer.

It is crucial to address this problem now for the HL-LHC. Large HEP collaborations should be more supportive to ensure better recognition and career advancement towards permanent positions. This kind of policy could help to retain young physicists and ensure they continue to be involved in the current HEP projects that would then define the success of the FCC.

Hassnae El Jarrari is a CERN research fellow in experimental physics. She was awarded a PhD by Université Mohammed-V De Rabat for her thesis “Dark photon searches from Higgs boson and heavy boson decays using pp collisions recorded at √s = 13 TeV with the ATLAS detector at the LHC and performance evaluation of the low gain avalanche detectors for the HL-LHC ATLAS high-granularity timing detector”.

Reduce environmental impacts

Luca Quaglia

The main challenge for the future of large-scale HEP experiments is reducing our environmental impact, and raising awareness is key to this. For example, before running a job, the ALICE computing grid provides an estimate of its CO2-equivalent carbon footprint, to encourage code optimisation and save power.

I believe that if we want to thrive in the future, we should adopt a new way of doing physics where we think critically about the environment. We should participate in more collaboration meetings and conferences remotely, and promote local conferences that are reachable by train.

I’m not saying that we should ban air travel tout court. It’s especially important for early-career scientists to get their name out there and to establish connections. But by attending just one major international conference in person every two years, and publicising alternative means of communication, we can save resources and travel time, which can be invested in our home institutions. This would also enable scientists from smaller groups with reduced travel budgets to attend more conferences and disseminate their findings.

Luca Quaglia is a postdoctoral fellow at the Istituto Nazionale di Fisica Nucleare, Sezione di Torino. He was awarded his PhD by the University of Torino for his thesis “Development of eco-friendly gas mixtures for resistive plate chambers”.

Invest in software and computing talent

Joshua Beirer

With both computing and human resources in short supply, funds must be invested wisely. While scaling up infrastructure is critical and often seems like the simplest remedy, the human factor is often overlooked. Innovative ideas and efficient software solutions require investment in training and the recruitment of skilled researchers.

This investment must start with a stronger integration of software education into physics degrees. As the boundaries between physics and computer science blur, universities must provide a solid foundation, raise awareness of the importance of software in HEP and physics in general, and promote best practices to equip the next generation for the challenges of the future. Continuous learning must be actively supported, and young researchers must be provided with sufficient resources and appropriate mentoring from experienced colleagues.

Software skills remain in high demand in industry, where financial incentives and better prospects often attract skilled people from academia. It is in the interest of the community to retain top talent by creating more attractive and secure career paths. After all, a continuous drain of talent and knowledge is detrimental to the field, hinders the development of efficient software and computing solutions, and is likely to prove more costly in the long run.

Joshua Beirer is a CERN research fellow in the offline software group of the ATLAS experiment and part of the lab’s strategic R&D programme on technologies for future experiments. He was awarded his PhD by the University of Göttingen for his thesis “Novel approaches to the fast simulation of the ATLAS calorimeter and performance studies of track-assisted reclustered jets for searches for resonant X → SH → bbWW* production with the ATLAS detector”.

Strengthen international science

Ezra D. Lesser

HEP is at an exciting yet critical inflection point. The coming years hold both unparalleled opportunities and growing challenges, including an expanding arena of international competition and the persistent issue of funding and resource allocation. In a swiftly evolving digital age, scientists must rededicate themselves to public service, engagement and education, informing diverse communities about the possible technological advancements of HEP research, and sharing with the world the excitement of discovering fundamental knowledge of the universe. Collaborations must be strengthened across international borders and political lines, pooling resources from multiple countries to traverse cultural gaps and open the doors of scientific diplomacy. With ever-increasing expenses and an uncertain political future, scientists must insist upon the importance of public research irrespective of any national agenda, and reinforce scientific veracity in a rapidly evolving world that is challenged by growing misinformation. Most importantly, the community must establish global priorities in a maturing age of precision, elevating not only new discoveries but the necessary scientific repetition to better understand what we discover.

The most difficult issues facing HEP research today are addressable and furthermore offer excellent opportunities to develop the scientific approach for the next several decades. By tackling these issues now, scientists can continue to focus on the mysteries of the universe, driving scientific and technological advancements for the betterment of all.

Ezra D. Lesser is a CERN research fellow working with the LHCb collaboration. He was awarded his PhD in physics by the University of California, Berkeley for his thesis “Measurements of jet substructure in pp and Pb–Pb collisions at √sNN = 5.02 TeV with ALICE”.

Recognise R&D

Savannah Clawson

ECRs must drive the field’s direction by engaging in prospect studies for future experiments, but dedicating time to this essential work comes at the expense of analysing existing data – a trade-off that can jeopardise our careers. With most ECRs employed on precarious two-to-four year contracts, time spent on these studies can result in fewer high-profile publications, making it harder to secure our next academic position. Another important factor is the unprecedented timescales associated with many proposed future projects. Those working on R&D today may never see the fruits of their labour.

Anxieties surrounding these issues are often misinterpreted as disengagement, but nothing could be further from the truth. In my experience, ECRs are passionate about research, bringing fresh perspectives and ideas that are crucial for advancing the field. However, we often struggle with institutional structures that fail to recognise the breadth of our contributions. By addressing longstanding issues surrounding attitudes toward work–life balance and long-term job stability – through measures such as establishing enforced minimum contract durations, as well as providing more transparent and diverse sets of criteria for transitioning to permanent positions – we can create a more supportive environment where HEP thrives, driven by the creativity and innovation of its next generation of leaders.

Savannah Clawson is a postdoctoral fellow at DESY Hamburg. She was awarded her PhD by the University of Manchester for her thesis “The light at the end of the tunnel gets weaker: observation and measurement of photon-induced W+W– production at the ATLAS experiment”.

Steering the ship of member states

CERN turns 70 at the end of September. How would you sum up the contribution the laboratory has made to human culture over the past seven decades?

CERN’s experimental and theoretical research laid many of the building blocks of one of the most successful and impactful scientific theories in human history: the Standard Model of particle physics. Its contributions go beyond the best-known discoveries, such as those of neutral currents and of the seemingly fundamental W, Z and Higgs bosons, which have such far-reaching significance for our universe. I also wish to draw attention to the many dozens of new composite particles discovered at the LHC and the incredibly high-precision agreement between theoretical calculations performed in quantum chromodynamics and the experimental results obtained at the LHC. These amazing discoveries were made possible thanks to the many technological innovations made at CERN.

But knowledge creation and accumulation are only half the story. CERN’s human ecosystem is an oasis in which the words “collaboration among peoples for the good of humanity” can be uttered without grandstanding or hypocrisy.

What role does the CERN Council play?

CERN’s member states are each represented by two delegates to the CERN Council. Decisions are made democratically, with equal voting power for each national delegation. According to the convention approved in 1954, and last revised in 1971, Council determines scientific, technical and administrative policy, approves CERN’s programmes of activities, reviews its expenditures and approves the laboratory’s budget. The Director-General and her management team work closely with Council to develop the Organization’s policies, scientific activities and budget. Director-General Fabiola Gianotti and her management team are now collaborating with Council to forge CERN’s future scientific vision.

What’s your vision for CERN’s future?

As CERN Council president, I have a responsibility to be neutral and reflect the collective will of the member states. In early 2022, when I took up the presidency, Council delegates unanimously endorsed my evaluation of their vision: that CERN should continue to offer the world’s best experimental high-energy physics programme using the best technology possible. CERN now needs to successfully complete the High-Luminosity LHC (HL-LHC) project and agree on a future flagship project.

I strongly believe the format of the future flagship project needs to crystallise as soon as possible. As put to me recently in a letter from the ECFA early-career researchers panel: “While the HL-LHC constitutes a much-anticipated and necessary advance in the LHC programme, a clear path beyond it for our future in the field must be cemented with as little delay as possible.” It can be daunting for young people to speak out on strategy and the future of the field, given the career insecurities they face. I am very encouraged by their willingness to put out a statement calling for immediate action.

At its March 2024 session, Council agreed to ignite the process of selecting the next flagship project by going ahead with the fourth European Strategy for Particle Physics update. The strategy group are charged, among other things, with recommending what this flagship project should be to Council. As I laid down the gavel concluding the meeting I looked around and sensed genuine excitement in the Chambers – that of a passenger ship leaving port. Each passenger has their own vision for the future. Each is looking forward to seeing what the final destination will look like. Several big pieces had started falling into place, allowing us to turn on the engine.

What are these big pieces?

Acting upon the recommendation of the 2020 update of the European Strategy for Particle Physics, CERN in 2021 launched a technical and financial feasibility study for a Future Circular Collider (FCC) operating first as a Higgs, electroweak and top factory, with an eye to succeeding it with a high-energy proton–proton collider. The report will include the physics motivation, technological and geological feasibility, territorial implementation, financial aspects, and the environmental and sustainability challenges that are deeply important to CERN’s member states and the diverse communes of our host countries.

Fabiola Gianotti and Eliezer Rabinovici at CERN Council

It is also important to add that CERN has also invested, and continues to invest, in R&D for alternatives to FCC such as CLIC and the muon collider. CLIC is a mature design, developed over decades, which has already precipitated numerous impactful societal applications in industry and medicine; and to the best of my knowledge, at present no laboratory has invested as much as CERN in muon-collider R&D.

A mid-term report on the FCC feasibility study was submitted by the CERN management to subordinate bodies in mid-2023, and their resulting reports were presented to CERN’s finance and scientific-policy committees. Council received the outcomes with great appreciation for the work involved during an extraordinary session on 2 February, and looks forward to the completion of the feasibility study in March 2025. Timing the European strategy update to follow hot on its heels and use it as an input was the natural next step.

At the June Council session, we started dealing with the nitty gritty of the process. A secretariat for the European Strategy Group was established under the chairmanship of Karl Jakobs, and committees are being appointed. By January 2026 the Council could have at its disposal a large part of the knowledge needed to chart the future of the CERN vision.

How would you encourage early-career researchers (ECRs) to engage with the strategy process?

ECRs have a central role to play. One of the biggest challenges when attempting to build a major novel research infrastructure such as the proposed FCC – which I sometimes think of as a frontier circular collider – is to maintain high-quality expertise, enthusiasm and optimism for long periods in the face of what seem like insurmountable hurdles. Historically, the physicists who brought a new machine to fruition knew that they would get a chance to work on the data it produced or at least have a claim for credit for their efforts. This is not the case now. Success rests on the enthusiasm of those who are at the beginning of their careers today just as much as on senior researchers. I hope ECRs will rise to the challenge and find ways to participate in the coming European Strategy Group-sponsored deliberations and become future leaders of the field. One way to engage is to participate in ECR-only strategy sessions like those held at the yearly FCC weeks. I’d also encourage other countries to join the UK in organising nationwide ECR-only forums for debating the future of the field, such as the one I initiated in Birmingham in 2022.

What’s the outlook for collaboration and competition between CERN and other regions on the future collider programme?

Over decades, CERN has managed to place itself as the leading example of true international scientific collaboration. For example, by far the largest national contingent of CERN users hails from the US. Estonia has completed the process of joining CERN as a new member state and Brazil has just become the first associate member state from the Americas. There is a global agreement among scientists in China, Europe, Japan and the US that the next collider should be an electron–positron Higgs factory, able to study the properties of the Higgs boson with high precision. I hope that – patiently, and step by step – ever more global integration will form.

Do member states receive a strong return on their investment in CERN?

Research suggests that fundamental exploration actively stimulates the economy, and more than pays for itself. Member states and associate member states have steadfastly supported CERN to the tune of CHF 53 billion (unadjusted for inflation) since 1954. They do this because their citizens take pride that their nation stands with fellow member states at the forefront of scientific excellence in the fundamental exploration of our universe. They also do this because they know that scientific excellence stimulates their economies through industrial innovation and the waves of highly skilled engineers, entrepreneurs and scientists who return home trained, inspired and better connected after interacting with CERN.

A bipartisan US report from 2005 called “Rising above the gathering storm” offered particular clarity, in my opinion. It asserted that investments in science and technology benefit the world’s economy, and it noted both the abruptness with which a lead in science and technology can be lost and the difficulty of recovering such a lead. One should not be shy to say that when CERN was established in 1954, it was part of a rather crowded third place in the field of experimental particle physics, with the Soviet Union and the United States at the fore. In 2024, CERN is the leader of the field – and with leadership comes a heavy responsibility to chart a path beneficial to a large community across the whole planet. As CERN Council president, I thank member states for their steadfast support and I applaud them for their economic and scientific foresight over the past seven decades. I hope it will persist long into the 21st century.

Is there a role for private funding for fundamental research?

In Europe, substantial private-sector support for knowledge creation and creativity dates back at least to the Medici. Though it is arguably less emphasised in our times, it plays an important role today in the US, the UK and Israel. Academic freedom is a sine qua non for worthwhile research. Within this limit, I don’t believe there is any serious controversy in Council on this matter. My sense is that Council fully supports the clear division between recognising generosity and keeping full academic and governance freedom.

What challenges has Council faced during your tenure as president?

In February 2022, the Russian Federation, an observer state, invaded Ukraine, which has been an associate member state since 2016. This was a situation with no precedent for Council. The shape of our decisions evolved for well over a year. Council members decided to cover from their own budgets the share of Ukraine’s contribution to CERN. Council also tried to address as much as possible the human issues resulting from the situation. It decided to suspend the observer status in the Council of the Russian Federation and the Joint Institute for Nuclear Research. Council also decided to not extend its International Collaboration Agreements with the Republic of Belarus and the Russian Federation. CERN departments also undertook initiatives to support the Ukrainian scientific community at CERN and in Ukraine.

A second major challenge was to mitigate the financial pressures being experienced around the world, such as inflation and rising costs for energy and materials. A package deal was agreed upon in Council that included significant contributions from the member states, a contribution from the CERN staff, and substantial savings from across CERN’s activities. So far, these measures seem to have addressed the issue.

While these key challenges were tackled, management worked relentlessly on preparing an exhaustive FCC feasibility study, to ensure that CERN stays on course in developing its scientific and technological vision for the field of experimental high-energy physics.

The supportive reaction of Council to these challenges demonstrated its ability to stay on course during rough seas and strong side winds. This cohesion is very encouraging for me. Time and again, Council faced difficult decisions in recent years. Though convergence seemed difficult at first, thanks to a united will and the help of all Council members, a way forward emerged and decisions were taken. It’s important to bear in mind that no matter which flagship project CERN embarks on, it will be a project of another order of magnitude. Some of the methods that made the LHC such a success can continue to accompany us, some will need to evolve significantly, and some new ones will need to be created.

Has the ideal of Science for Peace been damaged?

Over the years CERN has developed the skills needed to construct bridges. CERN does not have much experience in dismantling bridges. This issue was very much on the mind of Council as it took its decisions.

Do you wish to make some unofficial personal remarks?

Thanks. Yes. I would like to mention several things I feel grateful for.

Nobody owes humanity a concise description of the laws of physics and the basic constituents of matter. I am grateful for being in an era where it seems possible, thanks to a large extent to the experiments performed at CERN. Scientists from innumerable countries, who can’t even form a consensus on the best 1970s rock band, have succeeded time and again in assembling the most sophisticated pieces of equipment, with each part built in a different country. And it works. I stand in awe in front of that.

The ecosystem of CERN, the experimental groups working at CERN and the CERN Council are how I dreamt as a child that the United Nations would work. The challenges facing humanity in the coming centuries are formidable. They require international collaboration among the best minds from all over the planet. CERN shows that this is possible. But it requires hard work to maintain this environment. Over the years serious challenges have presented themselves, and one should not take this situation for granted. We need to be vigilant to keep this precious space – the precious gift of CERN.

Look to the Higgs self-coupling

What are the microscopic origins of the Higgs boson? As long as we lack the short-wavelength probes needed to study its structure directly, our best tool to confront this question is to measure its interactions.

Let’s consider two with starkly contrasting experimental prospects. The coupling of the Higgs boson to two Z bosons (HZZ) has been measured with a precision of around 5%, improving to around 1.3% by the end of High-Luminosity LHC (HL-LHC) operations. The Higgs boson’s self-coupling (HHH) has so far only been measured with a precision of the order of several hundred percent, improving to around the 50% level by the end of HL-LHC operations – though it’s now rumoured that this latter estimate may be too pessimistic.

Good motives

As HZZ can be measured much more precisely than HHH, is it the more promising window beyond the Standard Model (SM)? An agnostic might say that both measurements are equally valuable, while a “top down” theorist might seek to judge which theories are well motivated, and ask how they modify the two couplings. In supersymmetry and minimal composite Higgs models, for example, modifications to HZZ and HHH are typically of a similar magnitude. But “well motivated” is a slippery notion and I don’t entirely trust it.

Fortunately there is a happy compromise between these perspectives, using the tool of choice of the informed agnostic: effective field theory. It’s really the same physical principle as trying to look within an object when your microscope operates on wavelengths greater than its physical extent. Just as the microscopic structure of an atom is imprinted, at low energies, in its multipolar (dipole, quadrupole and so forth) interactions with photons, so too would the microscopic structure of the Higgs boson leave its trace in modifications to its SM interactions.

All possible coupling modifications from microscopic new physics can be captured by effective field theory and organised into classes of “UV-completion”. UV-completions are the concrete microscopic scenarios that could exist. (Here, ultraviolet light is a metaphor for the short-wavelength probes needed to study the Higgs boson’s microscopic origins in detail.) Scenarios with similar patterns are said to live in the same universality class. Families of universality classes can be identified from the bottom up. A powerful tool for this is naïve dimensional analysis (NDA).

Matthew McCullough

One particularly sharp arrow in the NDA quiver is ℏ counting, which establishes how many couplings and/or ℏs must be present in the EFT modification of an interaction. Couplings tell you the number of fundamental interactions involved. ℏs establish the need for quantum effects. For instance, NDA tells us that the coefficient of the Fermi interaction must have two couplings, which the electroweak theory duly supplies – a W boson transforms a neutron into a proton, and then decays into an electron and a neutrino.
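
As a reminder of how this counting works in the textbook case (the standard tree-level matching of the electroweak theory onto the Fermi theory, quoted here for illustration rather than taken from this article), the Fermi coefficient indeed carries two powers of the weak coupling g:

\[
\frac{G_F}{\sqrt{2}} = \frac{g^2}{8\,M_W^2}.
\]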

For our purposes, NDA tells us that modifications to HZZ must necessarily involve one more ℏ or two fewer couplings than any underlying EFT interaction that modifies HHH. In the case of one more ℏ, modifications to HZZ could potentially be an entire quantum loop factor smaller than modifications to HHH. In the case of two fewer couplings, modifications to HHH could be as large as a factor g² greater than for HZZ, where g is a generic coupling. Either way, it is theoretically possible that the BSM modifications could be up to a couple of orders of magnitude greater for HHH than for HZZ. (Naively, a loop factor counts as around 1/16π², or about 0.01, and in the most strongly interacting scenarios, g² can rise to about 16π².)
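
Numerically, the counting above can be rendered as follows (a rough illustration rather than a quoted result; δ denotes a generic fractional modification of the corresponding coupling):

\[
\frac{\delta_{HZZ}}{\delta_{HHH}} \sim \frac{1}{16\pi^2} \approx 6\times10^{-3}
\qquad\text{or}\qquad
\frac{\delta_{HHH}}{\delta_{HZZ}} \sim g^2 \lesssim 16\pi^2 \approx 158,
\]

either of which allows the HHH modification to exceed the HZZ one by roughly two orders of magnitude.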

Why does this contrast so strongly with supersymmetry and the minimal composite Higgs? They are simply in universality classes where modifications to HZZ and HHH are comparable in magnitude. But there are more universality classes in heaven and Earth than are dreamt of in our well-motivated scenarios.

Faced with the theoretical possibility of a large hierarchy in coupling modifications, it behoves the effective theorist to provide an existence proof of a concrete UV-completion where this happens, or we may have revealed a universality class of measure zero. But such an example exists: the custodial quadruplet model. I often say it’s a model that only a mother could love, but it could exist in nature, and gives rise to coupling modifications a full loop factor of about 200 greater for HHH than HZZ.

When confronted with theories beyond the SM, all Higgs couplings are not born equal: UV-completions matter. Though HZZ measurements are arguably the most powerful general probe, future measurements of HHH will explore new territory that is inaccessible to other coupling measurements. This territory is largely uncharted, exotic and beyond the best guesses of theorists. Not bad circumstances for the start of any adventure.

Electroweak SUSY after LHC Run 2

ATLAS figure 1

Supersymmetry (SUSY) provides elegant solutions to many of the problems of the Standard Model (SM) by introducing new boson/fermion partners for each SM fermion/boson, and by extending the Higgs sector. If SUSY is realised in nature at the TeV scale, it would accommodate a light Higgs boson without excessive fine-tuning. It could furthermore provide a viable dark-matter candidate, and be a key ingredient to the unification of the electroweak and strong forces at high energy. The SUSY partners of the SM bosons can mix to form what are called charginos and neutralinos, collectively referred to as electroweakinos.

Electroweakinos would be produced only through the electroweak interaction, so their production cross sections in proton–proton collisions are orders of magnitude smaller than those of strongly produced squarks and gluinos (the supersymmetric partners of quarks and gluons). Therefore, while extensive searches using the Run 1 (7–8 TeV) and Run 2 (13 TeV) LHC datasets have turned up null results, the corresponding chargino/neutralino exclusion limits remain substantially weaker than those for strongly interacting SUSY particles.

The ATLAS collaboration has recently released a comprehensive analysis of the electroweak SUSY landscape based on its Run 2 searches. Each individual search targeted specific chargino/neutralino production mechanisms and subsequent decay modes. The analyses were originally interpreted in so-called “simplified models”, where only one production mechanism is considered, and only one possible decay. However, if SUSY is realised in nature, its particles will have many possible production and decay modes, with rates depending on the SUSY parameters. The new ATLAS analysis brings these pieces together by reinterpreting 10 searches in the phenomenological Minimal Supersymmetric Standard Model (pMSSM), which includes a range of SUSY particles, production mechanisms and decay modes governed by 19 SUSY parameters. The results provide a global picture of ATLAS’s sensitivity to electroweak SUSY and, importantly, reveal the gaps that remain to be explored.

ATLAS figure 2

The 19-dimensional pMSSM parameter space was randomly sampled to produce a set of 20,000 SUSY model points. The 10 selected ATLAS searches were then performed on each model point to determine whether it is excluded with at least 95% confidence level. This involved simulating datasets for each SUSY model, and re-running the corresponding analyses and statistical fits. An extensive suite of reinterpretation tools was employed to achieve this, including preserved likelihoods and RECAST – a framework for preserving analysis workflows and re-applying them to new signal models.
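
The scan logic can be sketched in a few lines of Python. This is purely illustrative and not ATLAS software: the helper names (sample_pmssm_point, run_preserved_analysis) are hypothetical stand-ins for the pMSSM sampler, the simulation step and the preserved RECAST/likelihood workflows, and real exclusions come from the full statistical treatment rather than the simple "any search below CLs = 0.05" shortcut used here.

import random

def sample_pmssm_point(n_params=19):
    # Placeholder: draw one pMSSM model point (19 free parameters).
    return [random.uniform(-1.0, 1.0) for _ in range(n_params)]

def run_preserved_analysis(analysis_id, point):
    # Placeholder for one preserved Run 2 search (e.g. a RECAST workflow or
    # a public likelihood). It should return the CLs value for this model
    # point; here a random number stands in for the real machinery.
    return random.random()

N_POINTS, N_ANALYSES = 20_000, 10
excluded, surviving = [], []
for _ in range(N_POINTS):
    point = sample_pmssm_point()
    # Simplification: count the point as excluded if any single search
    # rejects it at >= 95% confidence level (CLs < 0.05).
    cls_values = [run_preserved_analysis(a, point) for a in range(N_ANALYSES)]
    (excluded if min(cls_values) < 0.05 else surviving).append(point)

print(f"excluded: {len(excluded)} / surviving: {len(surviving)}")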

The results show that, while electroweakino masses have been excluded up to 1 TeV in simplified models, the coverage with regard to the pMSSM is not exhaustive. Numerous scenarios remain viable, including mass regions nominally covered by previous searches (inside the dashed line in figure 1). The pMSSM models may evade detection due to smaller production cross-sections and decay probabilities compared to simplified models. Scenarios with small mass-splittings between the lightest and next-to-lightest neutralino can reproduce the dark-matter relic density, but are particularly elusive at the LHC. The decays in these models produce challenging event features with low-momentum particles that are difficult to reconstruct and separate from SM events.

Beyond ATLAS, experiments such as LZ aim to detect relic dark-matter particles through their scattering off target nuclei. This provides a complementary probe to ATLAS searches for dark matter produced in the LHC collisions. Figure 2 shows the LZ sensitivity to the pMSSM models considered by ATLAS, compared to the sensitivity of its SUSY searches. ATLAS is particularly sensitive to the region where the dark-matter candidate is around half the Z/Higgs-boson mass, causing enhanced dark-matter annihilation that could have reduced the otherwise overabundant dark-matter relic density to the observed value.

The new ATLAS results demonstrate the breadth and depth of its search programme for supersymmetry, while uncovering its gaps. Supersymmetry may still be hiding in the data, and several scenarios have been identified that will be targeted, benefiting from the incoming Run 3 data.

Back to the future

The past seven decades have seen remarkable cultural and technological changes. And CERN has been no passive observer. From modelling European cooperation in the aftermath of World War II to democratising information via the web and discovering a field that pervades the universe, CERN has nudged the zeitgeist more than once since its foundation in 1954.

It’s undeniable, though, that much has stayed the same. A high-energy physics lab still needs to be fast, cool, collaborative, precise, practically useful, deep, diplomatic, creative and crystal clear. Plus ça change, plus c’est la même chose.

This selection of (lightly colourised) snapshots from CERN’s first 25 years, accompanied by expert reflections from across the lab, shows how things have changed in the intervening years – and what has stayed the same.

1960

A 5 m diameter magnetic storage ring in 1960

The discovery that electrons and muons possess spin that precesses in a magnetic field has inspired generations of experimentalists and theorists to push the boundaries of precision. The key insight is that quantum effects modify the magnetic moment associated with the particles’ spins, making their gyromagnetic ratios (g) slightly larger than two, the value predicted by Dirac’s equation. For electrons, these quantum effects are primarily due to the electromagnetic force. For muons, the weak and strong forces also contribute measurably – as well, perhaps, as unknown forces. These measurements stand with the most beautiful and precise of all time, and their history is deeply intertwined with that of the Standard Model.

CERN physicists Francis Farley and Emilio Picasso were pioneers and driving forces behind the muon g–2 experimental programme. The second CERN experiment introduced the use of a 5 m diameter magnetic storage ring. Positive muons with 1.3 GeV momentum travelled around the ring until they decayed into positrons whose directions were correlated with the spin of the parent muons. The experiment measured the muon’s anomalous magnetic moment (g–2) with a precision of 270 parts per million. A brilliant concept, the “magic gamma”, was then introduced in the third CERN experiment in the late 1970s: by using muons at a momentum of 3.1 GeV, the effect of electric fields on the precession frequency cancelled out, eliminating a major source of systematic error. All subsequent experiments have relied on this principle, with the exception of an experiment using ultra-cold muons that is currently under construction in Japan. A friendly rivalry for precision between experimentalists and theorists continues today (Lattice calculations start to clarify muon g-2), with the latest measurement at Fermilab achieving a precision of 190 parts per billion.
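
For reference, the cancellation behind the “magic gamma” can be read off the standard expression for the muon’s anomalous spin-precession frequency in combined magnetic and electric fields (sign and unit conventions vary between references; this is quoted as a textbook relation, not from the CERN experiments’ papers):

\[
\vec{\omega}_a = \frac{e}{m_\mu}\left[a_\mu \vec{B} - \left(a_\mu - \frac{1}{\gamma^2 - 1}\right)\frac{\vec{\beta}\times\vec{E}}{c}\right],
\qquad a_\mu = \frac{g-2}{2}.
\]

The electric-field term vanishes when \(\gamma = \sqrt{1 + 1/a_\mu} \approx 29.3\), corresponding to a muon momentum \(p = \gamma\beta m_\mu c \approx 3.1\ \text{GeV}/c\), the value quoted above.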

Andreas Hoecker is spokesperson for the ATLAS collaboration.

1961

Inspecting bubble-chamber images by hand

The excitement of discovering new fundamental particles and forces made the 1950s and 1960s a golden era for particle physicists. A lot of creative energy was channelled into making new particle detectors, such as the liquid hydrogen (or heavier liquid) bubble chambers that paved the way to discoveries such as neutral currents, and seminal studies of neutrinos and strange and charmed baryons. As particles pass through, they make the liquid boil, producing bubbles that are captured to form images. In 1961, each image had to be painstakingly inspected by hand, as depicted here, to determine the properties of the particles. Fortunately, in the decades since, physicists have found ways to preserve the level of detail they offer and build on this inspiration to prepare new technologies. Liquid-argon time-projection chambers such as CERN’s DUNE prototypes, which are currently the largest of their kind in the world, effectively give us access to bubble-chamber images in full colour, with the colour representing energy deposition (CERN Courier July/August 2024 p41). Millions of these images are now analysed algorithmically – essential, as DUNE is expected to generate one of the highest data rates in the world.

Laura Munteanu is a CERN staff scientist working on the T2K and DUNE experiments.

1965

The first experiment at CERN to use a superconducting magnet, in 1965

This photograph shows the first experiment at CERN to use a superconducting magnet. The pictured physicist is adjusting a cryostat containing a stack of nuclear emulsions surrounded by a liquid–helium-cooled superconducting niobium–zirconium electromagnet. A pion beam from CERN’s synchrocyclotron passes through the quadrupole magnet at the right, collimated by the pile of lead bricks and detected by a small plastic scintillation counter before entering the cryostat. In this study of double charge exchange from π+ to π– in nuclear emulsions, the experiment consumed between one and two litres of liquid helium per hour from the container in the left foreground, with the vapour being collected for reuse (CERN Courier August 1965 p116).

Today, the LHC is the world’s largest scientific instrument, with more than 24 km of the machine operating at 1.9 K – and yet only one project among many at CERN requiring advanced cryogenics. As presented at the latest international cryogenic engineering conference organised here in July, there have never been so many cryogenics projects either implemented or foreseen. They include accelerators for basic research, light sources, medical accelerators, detectors, energy production and transmission, trains, planes, rockets and ships. The need for energy efficiency and long-term sustainability will necessitate cryogenic technology with an enlarged temperature range for decades to come. CERN’s experience provides a solid foundation for a new generation of engineers to contribute to society.

Serge Claudet is a former deputy group leader of CERN’s cryogenics group.

1966

Mirrors at CERN used to reflect Cherenkov light

Polishing a mirror at CERN in 1966. Are physicists that narcissistic? Perhaps some are, but not in this case. Ultra-polished mirrors are still a crucial part of a class of particle detectors based on the Cherenkov effect. Just as a shock wave of sound is created when an object flies through the sky at a speed greater than the speed of sound in air, so charged particles create a shock wave of light when they pass through a medium at a speed greater than the speed of light in that medium. This effect is extremely useful for measuring the velocity of a charged particle, because the emission angle of light packets relative to the trajectory of the particle is related to the velocity of the particle itself. By measuring the emission angle of Cherenkov light for an ultra-relativistic charged particle travelling through a transparent medium, such as a gas, the velocity of the particle can be determined. Together with the measurement of the particle’s momentum, it is then possible to obtain its identity card, i.e. its mass. Mirrors are used to reflect Cherenkov light to the photosensors. The LHCb experiment at CERN has the most advanced Cherenkov detector ever built. Years go by and technology evolves, but fundamental physics is about reality, and that’s unchangeable!
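
In formulae (the textbook Cherenkov relation, independent of any particular detector design): for a transparent medium of refractive index n,

\[
\cos\theta_c = \frac{1}{n\beta},
\]

so a measurement of the Cherenkov angle \(\theta_c\) yields the particle’s velocity \(\beta\), and combining it with the momentum \(p\) from the tracking system gives the mass via \(mc = p\sqrt{1/\beta^2 - 1}\).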

Vincenzo Vagnoni is spokesperson of the LHCb collaboration.

1970

The Intersecting Storage Rings

In 1911, Heike Kamerlingh Onnes made a groundbreaking discovery by measuring zero resistance in a mercury wire at 4.2 K, revealing the phenomenon of superconductivity. This earned him the 1913 Nobel Prize, decades in advance of Bardeen, Cooper and Schrieffer’s full theoretical explanation of 1957. It wasn’t until the 1960s that the first superconducting magnets exceeding 1 T were built. This delay stemmed from the difficulty in enabling bulk superconductors to carry large currents in strong magnetic fields – a challenge requiring significant research.

The world’s first proton–proton collider, CERN’s pioneering Intersecting Storage Rings (ISR, pictured below left), began operation in 1971, a year after this photograph was taken. One of its characteristic “X”-shaped vacuum chambers is visible, flanked by combined-function bending magnets on either side. In 1980, to boost its luminosity, eight superconducting quadrupole magnets based on niobium-titanium alloy were installed, each with a 173 mm bore and a peak field of 5.8 T, making the ISR the first collider to use superconducting magnets. Today, we continue to advance superconductivity. For the LHC’s high-luminosity upgrade, we are preparing to install the first magnets based on niobium-tin technology: 24 quadrupoles with a 150 mm aperture and a peak field of 11.3 T.

Susana Izquierdo Bermudez leads CERN’s Large Magnet Facility.

1972

Mary Gaillard and Murray Gell-Mann

The Theoretical Physics Department, or Theory Division as it used to be known, dates back to the foundation of CERN, when it was first established in Copenhagen under the direction of Niels Bohr, before moving to Geneva in 1957. Theory flourished at CERN in the 1960s, hosting many scientists from CERN’s member states and beyond, working side-by-side with experimentalists with a particular focus on strong interactions.

In 1972, when Murray Gell-Mann visited CERN and had this discussion with Mary Gaillard, the world of particle physics was at a turning point. The quark model had been proposed by Gell-Mann in 1964 (similar ideas had been proposed by George Zweig and André Petermann) and the first experimental evidence of their reality had been discovered in deep-inelastic electron scattering at SLAC in 1968. However, the dynamics of quarks was a puzzle. The weak interactions being discussed by Gaillard and Gell-Mann in this picture were also puzzling, though Gerard ’t Hooft and Martinus Veltman had just shown that the unified theory of weak and electromagnetic interactions proposed earlier by Shelly Glashow, Abdus Salam and Steven Weinberg was a calculable theory.

The first evidence for this theory came in 1973 with the discovery of neutral currents by the Gargamelle neutrino experiment at CERN, and 1974 brought the discovery of the charm quark, a key ingredient in what came to be known as the Standard Model. This quark had been postulated to explain properties of K mesons, whose decays are being discussed by Gaillard and Gell-Mann in this picture, and Gaillard, together with Benjamin Lee, went on to play a key role in predicting its properties. The discoveries of neutral currents and charm ushered in the Standard Model, and CERN theorists were active in exploring its implications – notably in sketching out the phenomenology of the Brout–Englert–Higgs mechanism. We worked with experimentalists particularly closely during the 1990s, making precise calculations and interpreting the results emerging from LEP that established the Standard Model.

CERN Theory in the 21st century has largely been focused on the LHC experimental programme and pursuing new ideas for physics beyond the Standard Model, often in relation to cosmology and astrophysics. These are likely to be the principal themes of theoretical research at CERN during its eighth decade.

John Ellis served as head of CERN’s Theoretical Physics Department from 1988 to 1994.

1974

Adjusting the electronics of the ion source

From 1959 to 1992, Linac1 accelerated protons to 50 MeV for injection into the Proton Synchrotron, and from 1972 into the Proton Synchrotron Booster. In 1974, their journey started in this ion source. High voltage was used to achieve the first acceleration, to a few percent of the speed of light. It wasn’t only the source itself that had to be at high voltage, but also the power supplies that fed the magnets, the controllers for gas injection, the diagnostics and the controls. This platform was the laboratory for the ion source. When operational, the cubicle and everything in it was at 520 kV, meaning all external surfaces had to be smooth to avoid sparks. As pictured, hydraulic jacks could lift the lid to allow access for maintenance and testing, at which point a drawbridge would be lowered from the adjacent wall to allow the engineers and technicians to take a seat in front of the instruments.

Thanks to the invention of radio-frequency quadrupoles by Kapchinsky and Teplyakov, radio-frequency acceleration can now start from lower proton energies. Today, ion sources use much lower voltages, in the range of tens of kilovolts, allowing the source installations to shrink dramatically in size compared to the 1970s.

Richard Scrivens is CERN’s deputy head of accelerator and beam physics.

1974

The Super Proton Synchrotron tunnel

CERN’s labyrinth of tunnels has been almost continuously expanding since the lab was founded 70 years ago. When CERN was first conceived, who would have thought that the 7 km-long Super Proton Synchrotron tunnel shown in this photograph would one day be constructed, let alone the 27 km LEP/LHC tunnel? Questions similar to those being posed today about the proposed Future Circular Collider (FCC) tunnel were once raised about the feasibility of the LEP tunnel. But if you take a step back and look at the history of CERN’s expanding tunnel network, it seems like the next logical step for the organisation.

This vintage SPS photograph from the 1970s shows the tunnel’s secondary lining being constructed. The concrete was transported from the surface down the 50 m-deep shafts and then pumped behind the metal formwork to create the tunnel walls. This technology is still used today, most recently for the HL-LHC tunnels. However, for a mega-project like the FCC, a much quicker and more sophisticated methodology is envisaged. The tunnels would be excavated using tunnel boring machines, which would install a pre-cast concrete segmental lining using robotics immediately after the excavation of the rock, allowing 20 m of tunnel to be excavated and lined with concrete per day.

John Osborne is a senior civil engineer at CERN.

1977

Alan Jeavons and David Townsend

Detector development for fundamental physics always advances in symbiosis with detector development for societal applications. Here, Alan Jeavons (left) and David Townsend prepare the first positron-emission tomography (PET) scan of a mouse to be performed at CERN. A pair of high-density avalanche chambers (HIDACs) can be seen above and below Jeavons’ left hand. As in PET scans in hospitals today, a radioactive isotope introduced into the biological tissue of the mouse decays by emitting a positron that travels a few millimetres before annihilating with an electron. The resulting pair of coincident and back-to-back 511 keV photons was then converted into electron avalanches which were reconstructed in multiwire proportional chambers – a technology invented by CERN physicist Georges Charpak less than a decade earlier to improve upon bubble chambers and cloud chambers in high-energy physics experiments. The HIDAC detector later contributed to the development of three-dimensional PET image reconstruction. Such testing now takes place at dedicated pre-clinical facilities.

Today, PET detectors are based on inorganic scintillating crystals coupled to photodetectors – a technology that is also used in the CMS and ALICE experiments at the LHC. CERN’s Crystal Clear collaboration has been continuously developing this technology since 1991, yielding benefits for both fundamental physics and medicine.

One of the current challenges in PET is to improve time resolution in time-of-flight PET (TOF-PET) below 100 ps, and towards 10 ps. This will eventually enable positron annihilations to be pinpointed at the millimetre level, improving image quality, speeding up scans and reducing the dose injected into patients. Improvements in time resolution are also important for detectors in future high-energy experiments, and the future barrel timing layer of the CMS detector upgrade for the High-Luminosity LHC was inspired by TOF-PET R&D.

Etiennette Auffray Hillemanns is spokesperson for the Crystal Clear collaboration and technical coordinator for the CMS electromagnetic calorimeter.

1979

Rafel Carreras

In this photo, we see Rafel Carreras, a remarkable science educator and communicator, sharing his passion for science with an eager audience of young learners. Known for his creativity and enthusiasm, Carreras makes the complex world of particle physics accessible and fun. His particle-physics textbook When Energy Becomes Matter includes memorable visualisations that we still use in our education activities today. One such visualisation is the “fruity strawberry collision”, wherein two strawberries collide and transform into a multitude of new fruits, illustrating how particle collisions produce a shower of new particles that didn’t exist before.

Today, we find fewer chalk boards at CERN and more casual clothing, but one thing remains the same: CERN’s dedication to education and communication. Over the years, CERN has trained more than 10,000 science teachers, significantly impacting science education globally. CERN Science Gateway, our new education and outreach centre, allows us to welcome about 400,000 visitors annually. It offers a wide range of activities, such as interactive exhibitions, science shows, guided tours and hands-on lab experiences, making science exciting and accessible for everyone. Thanks to hundreds of passionate and motivated guides, visitors leave inspired and curious to find out more about the fascinating scientific endeavours and extraordinary technologies at CERN.

Julia Woithe coordinates educational activities at CERN’s new Science Gateway.

  • These photographs are part of a collection curated by Renilde Vanden Broeck, which will be exhibited at CERN in September.

Watch out for hybrid pixels

In 1895, in a darkened lab in Würzburg, Bavaria, Wilhelm Röntgen noticed that a screen coated with barium platinocyanide fluoresced, despite being shielded from the electron beam of his cathode-ray tube. Hitherto undiscovered “X”-rays were being emitted as the electrons braked sharply in the tube’s anode and glass casing. A week later, Röntgen imaged his wife’s hand using a photographic plate, and medicine was changed forever. X-rays would be used for non-invasive diagnosis and treatment, and would inspire countless innovations in medical imaging. Röntgen declined to patent the discovery of X-ray imaging, believing that scientific advancements should benefit all of humanity, and donated the proceeds of the first Nobel Prize for Physics to his university.

One hundred years later, medical imaging would once again be disrupted – not in a darkened lab in Bavaria, but in the heart of the Large Hadron Collider (LHC) at CERN. The innovation in question is the hybrid pixel detector, which allows remarkably clean track reconstruction. When the technology is adapted for use in a medical context, by modifying the electronics at the pixel level, X-rays can be individually detected and their energy measured, leading to spectroscopic X-ray images that distinguish between different materials in the body. In this way, black and white medical imaging is being reinvented in full colour, allowing more precise diagnoses with lower radiation doses.

The next step is to exploit precise timing in each pixel. The benefits will be broadly felt. Electron microscopy of biological samples can be clearer and more detailed. Biomolecules can be more precisely identified and quantified by imaging time-of-flight mass spectrometry. Radiation doses can be better controlled in hadron therapy, reducing damage to healthy tissue. Ultra-fast changes can be captured in detail at synchrotron light sources. Hybrid pixel detectors with fast time readout are even being used to monitor quantum-mechanical processes.

Digital-camera drawbacks

X-ray imaging has come a long way since the photographic plate. Most often, the electronics work in the same way as a cell-phone camera. A scintillating material converts X-rays into visible photons that are detected by light-sensitive diodes connected to charge-integrating electronics. The charge from high-energy and low-energy photons is simply added up within the pixel in the same way a photographic film is darkened by X-rays.

A hybrid pixel detector and Medipix3 chip

Charge integration is the technique of choice in the flat-panel detectors used in radiology as large surfaces can be covered relatively cheaply, but there are several drawbacks. It’s difficult to collect the scintillation light from an X-ray on a single pixel, as it spreads out. And information about the energy of the X-rays is lost.

By the 1990s, however, LHC detector R&D was driving the development of the hybrid pixel detector, which could solve both problems by detecting individual photons. It soon became clear that “photon counting” could be as useful in a hospital ward as it would prove to be in a high-energy-physics particle detector. In 1997 the Medipix collaboration first paired semiconductor sensors with readout chips capable of counting individual X-rays.

Nearly three decades later, hybrid pixel detectors are making their mark in hospital wards. Parallel to the meticulous process of preparing a technology for medical applications in partnership with industry, researchers have continued to push the limits of the technology, in pursuit of new innovations and applications.

Photon counting

In a hybrid pixel detector, semiconductor sensor pixels are individually fixed to readout chips by an array of bump bonds – tiny balls of solder that permit the charge signal in each sensor pixel to be passed to each readout pixel (see “Hybrid pixels” figure). In these detectors, low-noise pulse-processing electronics take advantage of the intrinsic properties of semiconductors to provide clean track reconstruction even at high rates (see “Semiconductor subtlety” panel).

Since silicon detectors are relatively transparent to the X-ray energies used in medical imaging (approximately 20 to 140 keV), denser sensor materials with higher stopping power are required to capture every photon passing through the patient. This is where hybrid pixel detectors really come into their own. For X-ray photons with an energy above about 20 keV, a highly absorbing material such as cadmium telluride can be used in place of the silicon used in the LHC experiments. Provided precautions are taken to deal with charge sharing between pixels, the number of X-rays in every energy bin can be recorded, allowing each pixel to measure the spectrum of the interacting X-rays.
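To make the idea of per-pixel spectroscopic counting concrete, the sketch below emulates in software what the text describes: each charge pulse is compared against several thresholds and the counter of the highest threshold exceeded is incremented, building a coarse energy spectrum in every pixel. This is an illustrative Python sketch only – the threshold values and the binning logic are assumptions made for the example, not the actual Medipix on-chip implementation.

```python
# Illustrative sketch (not the Medipix on-chip logic): per-pixel photon
# counting with energy binning. Each charge pulse is compared against a set of
# thresholds, expressed here in keV-equivalent charge, and the counter of the
# highest threshold exceeded is incremented. Threshold values are hypothetical.

THRESHOLDS_KEV = [20, 40, 60, 80, 100]   # hypothetical energy-bin edges

def count_photon(pulse_kev: float, counters: list[int]) -> None:
    """Increment the counter of the highest threshold that the pulse exceeds."""
    bin_index = -1
    for i, threshold in enumerate(THRESHOLDS_KEV):
        if pulse_kev > threshold:
            bin_index = i
    if bin_index >= 0:        # pulses below the lowest threshold are treated as noise
        counters[bin_index] += 1

# Example: one pixel accumulating counts over a sequence of simulated pulses (keV)
pixel_counters = [0] * len(THRESHOLDS_KEV)
for pulse in [15.0, 35.2, 72.8, 55.1, 130.0]:
    count_photon(pulse, pixel_counters)
print(pixel_counters)   # -> [1, 1, 1, 0, 1]
```

In a real chip the comparisons happen in parallel in each pixel’s analogue front end; the software loop above is simply the easiest way to express the same bookkeeping.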

Semiconductor subtlety

In insulators, the conduction band lies far above the energy of electrons in the valence band, making it difficult for current to flow. In conductors, the two bands overlap and current flows with little resistance. In semiconductors, the gap is just a couple of electron-volts. Passing charged particles, such as those created in the LHC experiments, promote thousands of valence electrons into the conduction band, creating positively charged “holes” in the valence band and allowing current to flow.

Hybrid pixel detector

Silicon has four valence electrons and therefore forms four covalent bonds with neighbouring atoms to fill up its outermost shell in silicon crystals. These crystals can be doped with impurities that either add additional electrons to the conduction band (n-type doping) or additional holes to the valence band (p-type doping). The silicon pixel sensors used at the LHC are made up of rectangular pixels doped with additional holes on one side coupled to a single large electrode doped with additional electrons on the rear (see “Pixel picture” figure).

In p-n junctions such as these, “depletion zones” develop at the pixel boundaries, where neighbouring electrons and holes recombine, generating a natural electric field. The depletion zones can be extended throughout the whole sensor by applying a strong “reverse-bias” field in the opposite direction. When a charged particle passes, electrons and holes are created as before, but thanks to the field a directed pulse of charge now flows across the bump bond into the readout chip. Charge collection is prompt, permitting the pixel to be ready for the next particle.

In each readout pixel the detected charge pulse is compared with an externally adjustable threshold. If the pulse exceeds the threshold, its amplitude and timing can be measured. The threshold level is typically set to be many times higher than the electronic noise of the detection circuit, permitting noise-free images. Because of the intimate contact between the sensor and the readout circuit, the noise is typically less than a root-mean-square value of 100 electrons, and any signal higher than a threshold of about 500 electrons can be unambiguously detected. Pixels that are not hit remain silent.
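A back-of-the-envelope estimate shows why such a threshold yields effectively noise-free images. Assuming, purely for illustration, that the electronic noise is Gaussian, a 500-electron threshold over a 100-electron RMS corresponds to a 5σ cut:

\[
P(\text{noise hit}) \;\simeq\; \tfrac{1}{2}\,\mathrm{erfc}\!\left(\frac{Q_{\mathrm{thr}}}{\sqrt{2}\,\sigma_{\mathrm{noise}}}\right)
\;=\; \tfrac{1}{2}\,\mathrm{erfc}\!\left(\frac{500}{\sqrt{2}\times 100}\right) \;\approx\; 3\times10^{-7}
\]

per threshold comparison, so spurious counts from noise alone are vanishingly rare even across millions of pixels.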

At the LHC, each passing particle liberates thousands of electrons, allowing clean images of the collisions to be taken even at very high rates. Hybrid pixels have therefore become the detector of choice in many large experiments where fast and clean images are needed, and are at the heart of the ATLAS, CMS and LHCb experiments. In cases where the event rates are lower, such as in the ALICE experiment at the LHC and the Belle II experiment at SuperKEKB at KEK in Japan, it has now become possible to use “monolithic” active pixel detectors, where the sensor and readout electronics are implemented in the same substrate. In the future, as the semiconductor industry shifts to three-dimensional chip and wafer stacking, the distinction between hybrid and monolithic pixel detectors will become blurred.

Protocols regarding the treatment of patients are strictly regulated in the interest of safety, making it challenging to introduce new technologies. Therefore, in parallel with the development of successive generations of Medipix readout chips, a workshop series on the medical applications of spectroscopic X-ray detectors has been hosted at CERN. Now in its seventh edition (see “Threshold moment for medical photon counting”), the workshop gathers cross-disciplinary specialists ranging from readout-chip designers to experts at the large equipment suppliers, and from medical physicists all the way up to opinion-leading radiologists. Its role is to form and develop a community of practitioners from diverse fields willing to share knowledge – and, of course, reasonable doubts – in order to encourage the transition of spectroscopic photon counting from the lab to the clinic. CERN and the Medipix collaborations have played a pathfinding role in this community, exploring avenues well in advance of their introduction to medical practice.

The Medipix2 (1999–present), Medipix3 (2005–present) and Medipix4 (2016–present) collaborations are composed only of publicly funded research institutes and universities, which helps keep the development programmes driven by science. There have been hundreds of peer-reviewed publications and dozens of PhD theses written by the designers and users of the various chips. With the help of CERN’s Knowledge Transfer Office, several start-up companies have been created and commercial licences signed. This has led to many unforeseen applications and helped enormously with the dissemination of the technology. The publications of the clients of the industrial partners now represent a large share of the scientific outcome from these efforts, totalling hundreds of papers.

Spectroscopic X-ray imaging is now arriving in clinical practice. Siemens Healthineers were first to market in 2022 with the Naeotom Alpha photon counting CT scanner, and many of the first users have been making ground-breaking studies exploiting the newly available spectroscopic information in the clinical domain. CERN’s Medipix3 chip is at the heart of the MARS Bioimaging scanner, which brings unprecedented imaging performance to the point of patient care, opening up new patient pathways and saving time and money.

ASIC (application-specific integrated circuit) development is still moving forwards rapidly in the Medipix collaborations. For example, in the Medipix3 and Medipix4 chips, on-pixel circuitry mitigates the impact of X-ray fluorescence and charge diffusion in the semiconductor by summing up the charge in a localised region and allocating the hit to one pixel. The fine segmentation of the detector not only leads to unprecedented spatial resolution but also mitigates “hole trapping” – a common bugbear of the high-density sensor materials used in medical imaging, whereby photons of the same energy induce different charges according to their interaction depth in the sensor. Where the pixel size is significantly smaller than the perpendicular sensor thickness – as in the Medipix case – only one of the charge species (usually electrons) contributes to the measured charge, and no matter where in the sensor thickness the X-ray deposits its energy, the total charge detected is the same.
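The charge-summing idea can be emulated offline with a few lines of code. The sketch below is not the Medipix3/4 on-chip logic (which operates on 2×2 pixel groups in hardware); it simply illustrates the principle of summing charge over a small neighbourhood and allocating the reconstructed hit to the pixel that collected the most charge. All numbers are illustrative.

```python
import numpy as np

# Simplified offline emulation of charge summing: charge that has diffused or
# fluoresced into neighbouring pixels is summed over a 3x3 neighbourhood and
# the hit is allocated, winner-takes-all, to the pixel holding the local
# maximum. The real chips work on 2x2 groups on-chip; this is illustrative only.

def charge_summing(frame: np.ndarray, threshold: float) -> list[tuple[int, int, float]]:
    """Return (row, col, summed_charge) hits from a 2D frame of pixel charges."""
    hits = []
    padded = np.pad(frame, 1)                       # zero-pad to avoid edge cases
    for r in range(frame.shape[0]):
        for c in range(frame.shape[1]):
            neighbourhood = padded[r:r + 3, c:c + 3]
            total = neighbourhood.sum()
            # allocate only if this pixel holds the local maximum of the charge
            if total > threshold and frame[r, c] == neighbourhood.max():
                hits.append((r, c, float(total)))
    return hits

# A single X-ray whose charge has spread over two adjacent pixels
frame = np.zeros((4, 4))
frame[1, 1], frame[1, 2] = 300.0, 220.0             # electrons (illustrative)
print(charge_summing(frame, threshold=400.0))        # -> [(1, 1, 520.0)]
```

The winner-takes-all allocation is what restores the spatial resolution that charge diffusion and fluorescence would otherwise degrade.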

But photon counting is only half the story. Another parameter that has not yet been exploited in high-spatial-resolution medical imaging systems can also be measured at the pixel level.

A new dimension

In 2005, Dutch physicists working with gas detectors requested a modification that would permit each pixel to measure arrival times instead of counting photons. The Medipix2 collaboration agreed and designed a chip with three acquisition modes: photon counting, arrival time and time over threshold, which provides a measure of energy. The Timepix family of pixel-detector readout chips was born.

Xènia Turró using a Timepix-based thumb-drive detector

The most recent generations of Timepix chips, such as Timepix3 (released in 2016) and Timepix4 (released in 2022), stream hit information off chip as soon as it is generated – a significant departure from Medipix chips, which process hits locally, assuming them to be photons, and send only a spectroscopic image off chip. With Timepix, each time a charge exceeds the threshold, a packet of information is sent off chip containing the coordinates of the hit pixel, the particle’s arrival time and the time over threshold (66 bits in total per hit). This allows offline reconstruction of individual clusters of hits, opening up a myriad of potential new applications.
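The content of such a hit packet, and the kind of offline clustering it enables, can be sketched as follows. The field names, widths and the clustering helper are hypothetical – the article specifies only that each 66-bit packet carries the pixel coordinates, the arrival time and the time over threshold.

```python
from dataclasses import dataclass

# Illustrative sketch of the per-hit information streamed off a Timepix3/4-style
# chip. Field names are hypothetical; only the overall content (coordinates,
# arrival time, time over threshold) is taken from the article.

@dataclass(frozen=True)
class PixelHit:
    col: int            # pixel column address
    row: int            # pixel row address
    toa: int            # time of arrival (timestamp counter ticks)
    tot: int            # time over threshold (a proxy for deposited charge)

def reconstruct_clusters(hits: list[PixelHit], max_gap: int = 1) -> list[list[PixelHit]]:
    """Group hits into clusters of adjacent pixels.

    Simple single-pass grouping for illustration; a full reconstruction would
    also merge clusters and use the timestamps to separate pile-up in time.
    """
    clusters: list[list[PixelHit]] = []
    for hit in hits:
        for cluster in clusters:
            if any(abs(hit.col - h.col) <= max_gap and abs(hit.row - h.row) <= max_gap
                   for h in cluster):
                cluster.append(hit)
                break
        else:
            clusters.append([hit])
    return clusters
```

Offline clustering of this kind underlies the event-shape classification described below, where tracks, blobs and point-like clusters identify different particle species.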

One advantage of Timepix is that particle event reconstruction is not limited to photons. Cosmic muons leave a straight track. Low-energy X-rays interact in a point-like fashion, lighting up only a small number of pixels. Electrons interact with atomic electrons in the sensor material, leaving a curly track. Alpha particles deposit a large quantity of charge in a characteristic blob. To spark the imagination of young people, Timepix chips have been incorporated on a USB thumb drive that can be read out on a laptop computer (see “Thumb-drive detector” figure). The CERN & Society Foundation is raising funds to make these devices widely available in schools.

Timepix chips have also been adapted to dose monitoring for astronauts. Following a calibration effort by the University of Houston, NASA and the Institute for Experimental and Applied Physics in Prague, a USB device identical to that used in classrooms precisely measures the doses experienced by flight crews in space. Timepix is now deployed on the International Space Station (see “Radiation monitoring” figure), the Artemis programme and several European space-weather studies, and will be deployed on the Lunar Gateway programme.

Stimulating innovation

Applications in science, industry and medicine are too numerous to mention in detail. In time-of-flight mass spectrometry, the vast number of channels allowed by Timepix promises new insights into biomolecules. Large-area time-resolved X-ray cameras are valuable at synchrotrons, where they have applications in structural biology, materials science, chemistry and environmental science. In the aerospace, manufacturing and construction industries, non-destructive X-ray testing using backscattering can probe the integrity of materials and structures while requiring access from one side only. Timepix chips also play a crucial role in X-ray diffraction for materials analysis and medical applications such as single-photon-emission computed tomography (SPECT), and beam tracking and dose-deposition monitoring in hadron therapy (see “Carbon therapy” figure). The introduction of noise-free hit streaming with timestamp precision down to 200 picoseconds has also opened up entirely new possibilities in quantum science, and early applications of Timepix3 in experiments exploring the quantum behaviour of particles are already being reported. We are just beginning to uncover the potential of these innovations.

Chris Cassidy working near the Timepix USB

It’s also important to note that applications of the Timepix chips are not limited to the readout of semiconductor pixels made of silicon or cadmium telluride. A defining feature of hybrid pixel detectors is that the same readout chip can be used with a variety of sensor materials and structures. In cases where visible photons are to be detected, an electron can be generated in a photocathode and then amplified using a micro-channel plate. The charge cloud from the micro-channel plate is then detected on a bare readout chip in much the same way as the charge cloud in a semiconductor sensor. Some gas-filled detectors are constructed using gas electron multipliers and micromegas foils, which amplify charge passing through holes in the foils. Timepix chips can be used for readout in place of the conventional pad arrays, providing much higher spatial and time resolution than would otherwise be available.

Successive generations of Timepix and Medipix chips have followed Moore’s law, permitting more and more circuitry to be fitted into a single pixel as the minimum feature size of transistors has shrunk. In the Timepix3 and Timepix4 chips, data-driven architecture and on-pixel time stamping are the unique features. The digital circuitry of the pixel has become so complex that an entirely new approach to chip design – “digital-on-top” – was employed. These techniques were subsequently deployed in ASIC developments for the LHC upgrades.

Just as hybrid-pixel R&D at the LHC has benefitted societal applications, R&D for these applications now benefits fundamental research. Making highly optimised chips available to industry “off the shelf” can also save substantial time and effort in many applications in fundamental research, and the highly integrated R&D model whereby detector designers keep one foot in both camps generates creativity and the reciprocal sparking of ideas and sharing of expertise. Timepix3 is used as readout of the beam–gas-interaction monitors at CERN’s Proton Synchrotron and Super Proton Synchrotron, providing non-destructive images of the beams in real time for the first time. The chips are also deployed in the ATLAS and MoEDAL experiments at the LHC, and in numerous small-scale experiments, and Timepix3 know-how helped develop the VeloPix chip used in the upgraded tracking system for the LHCb experiment. Timepix4 R&D is now being applied to the development of a new generation of readout chips for future use at CERN, in applications where a time bin of 50 ps or less is desired.

Maria Martišíková and Laurent Kelleter

All these developments have relied on collaborating research organisations being willing to pool the resources needed to take strides into unexplored territory. The effort has been based on the solid technical and administrative infrastructure provided by CERN’s experimental physics department and its knowledge transfer, finance and procurement groups, and many applications have been made possible by hardware provided by the innovative companies that license the Medipix and Timepix chips.

With each new generation of chips, we have pushed the boundaries of what is possible by taking calculated risks ahead of industry. But the high-energy-physics community is under intense pressure, with overstretched resources. Can blue-sky R&D such as this be justified? We believe, in the spirit of Röntgen before us, that we have a duty to make our advancements available to a larger community than our own. Experience shows that when we collaborate across scientific disciplines and with the best in industry, the fruits lead directly back into advancements in our own community.

Antiprotons cooled in record time

To test the most fundamental symmetry of the Standard Model, CPT symmetry, which implies exact equality between the fundamental properties of particles and their antimatter conjugates, antimatter particles must be cooled to the lowest possible temperatures. The BASE experiment, located at CERN, has passed a major milestone in this regard. Using a sophisticated system of Penning traps, the collaboration has reduced the time required to cool an antiproton by a factor of more than 100. The considerable improvement makes it possible to measure the antiproton’s properties with unparalleled precision, perhaps shedding light on the mystery of why matter outnumbers antimatter in the universe.

Magnetic moments

BASE (Baryon Antibaryon Symmetry Experiment) specialises in the study of antiprotons by measuring properties such as the magnetic moment and charge-to-mass ratio. The latter quantity has been shown to agree with that of the proton within an experimental uncertainty of 16 parts per trillion. While not nearly as precise due to much higher complexity, measurements of the antiproton’s magnetic moment provide an equally important probe of CPT symmetry.

To determine the antiproton’s magnetic moment, BASE measures the frequency of spin flips of single antiprotons – a remarkable feat that requires the particle to be cooled to less than 200 mK. BASE’s previous setup could achieve this, but only after 15 hours of cooling, explains lead author Barbara Latacz (RIKEN/CERN): “As we need to perform 1000 measurement cycles, it would have taken us three years of non-stop measurements, which would have been unrealistic. By reducing the cooling time to eight minutes, BASE can now obtain all of the 1000 measurements it needs – and thereby improve its precision – in less than a month.” By cooling antiprotons to such low energies, the collaboration has been able to detect antiproton spin transitions with an error rate (< 0.000023) more than three orders of magnitude better than in previous experiments.

Underpinning the BASE breakthrough is an improved cooling trap. BASE takes antiprotons that have been decelerated by the Antiproton Decelerator and the Extra Low Energy Antiproton ring (ELENA) and stores them in batches of around 100 in a Penning trap, which holds them in place using electric and magnetic fields. A single antiproton is then extracted into a system made up of two Penning traps: the first trap measures its temperature and, if it is too high, transfers the antiproton to a second trap to be cooled further. The particle goes back and forth between the two traps until the desired temperature is reached.
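Schematically, the shuttling procedure is a simple feedback loop: measure the temperature and, if the particle is still too hot, cool it and measure again. The toy sketch below is not the BASE control software – the temperatures, the measurement noise and the cooling factor per pass are invented placeholders – but it captures the logic of the two-trap cycle.

```python
import random

# Toy model of the iterative two-trap cooling protocol described above.
# Only the 200 mK target comes from the text; everything else is a placeholder.

TARGET_MK = 200.0

def measure_temperature(true_temp_mk: float) -> float:
    """Stand-in for the measurement trap: a noisy temperature reading."""
    return true_temp_mk * random.uniform(0.9, 1.1)

def cool_once(true_temp_mk: float) -> float:
    """Stand-in for one pass through the cooling trap; the factor is illustrative."""
    return true_temp_mk * 0.5

def cool_antiproton(initial_temp_mk: float) -> tuple[float, int]:
    temp, passes = initial_temp_mk, 0
    while measure_temperature(temp) > TARGET_MK:
        temp = cool_once(temp)      # shuttle to the cooling trap and back
        passes += 1
    return temp, passes

print(cool_antiproton(initial_temp_mk=10_000.0))
```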

The new cooling trap has a diameter of just 3.8 mm, less than half the size of that used in previous experiments, and is equipped with innovative segmented electrodes to reduce the amplitude of one of the antiproton oscillations – the cyclotron mode – more effectively. The readout electronics have also been optimised to reduce background noise. The new system reduces the time spent by the antiproton in the cooling trap during each cycle from 10 minutes to 5 seconds, while improvements to the measurement trap have also made it possible to reduce the measurement time fourfold.

“Up to now, we have been able to compare the magnetic moments of the antiproton and the proton with a precision of one part per billion,” says BASE spokesperson Stefan Ulmer (Max Planck–RIKEN–PTB). “Our new device will allow us to reach a precision of a tenth of a part per billion and, in the very long term, will even allow us to perform experiments with 10 parts-per-trillion resolution. The slightest discrepancy could help solve the mystery of the imbalance between matter and antimatter in the universe.”

A pevatron at the galactic centre

Best-fit HAWC spectral energy distribution

The measured all-particle energy spectrum for cosmic rays (CRs) is famously described by a steeply falling power law. The spectrum is almost featureless from energies of around 30 GeV to 3 PeV, where a break (also known as the “knee”) is encountered, after which the spectrum becomes steeper. It is believed that CRs with energies below the knee have galactic origins. This is supported by the observation of diffuse gamma rays from the galactic disk in the GeV range (a predominant mechanism for the production of gamma rays is via the decay of neutral pions created when relativistic protons interact with the ambient gas). The knee could be explained by either the maximum energy that galactic sources can accelerate CR particles to, or the escape of CR particles from the galaxy if they are energetic enough to overcome the confinement of galactic magnetic fields. Both scenarios, however, assume the presence of astrophysical sources within the galaxy that could accelerate CR particles up to PeV energies. For decades, scientists have therefore been on the hunt for such sources, reasonably called “pevatrons”.
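For orientation, the spectrum can be written as a broken power law; the spectral indices below are standard textbook values rather than numbers quoted in the HAWC study:

\[
\frac{\mathrm{d}N}{\mathrm{d}E} \;\propto\; E^{-\gamma}, \qquad
\gamma \approx 2.7 \ \ (E \lesssim 3\ \mathrm{PeV}), \qquad
\gamma \approx 3.1 \ \ (E \gtrsim 3\ \mathrm{PeV}).
\]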

Recently, researchers at the High-Altitude Water Cherenkov (HAWC) observatory in Mexico reported the observation of ultra-high energy (> 100 TeV) gamma rays from the central region of the galaxy. Using nearly seven years of data, the team found that a point source, HAWC J1746-2856, with a simple power-law spectrum and no signs of a cutoff from 6 to 114 TeV best describes the observed gamma-ray flux. A total of 98 events were observed at energies above 100 TeV.

To analyse the spatial distribution of the observed gamma rays, the researchers plotted a significance map of the galactic centre. On this map, they also plotted the point-like supernova remnant SNR G0.9+0.1 and an unidentified extended source, HESS J1745-303, both located 1° away from the galactic centre. While supernova remnants have long been a favoured candidate for galactic pevatrons, HAWC did not observe any excess at either of these source positions. There are, however, two other interesting point sources in this region: Sgr A* (HESS J1745-290), the supermassive black hole in the galactic centre; and HESS J1746-285, an unidentified source that is spatially coincident with the galactic radio arc. Imaging atmospheric Cherenkov telescopes such as HESS, VERITAS and MAGIC have measured the gamma-ray emissions from these sources up to an energy of about 20 TeV, but HAWC, whose angular resolution at such energies is about six times coarser, cannot resolve them individually.

To eliminate the contamination of the flux by these sources, the authors assumed that their spectra cover the full HAWC energy range and then estimated the event count by convolving the reported best-fit model from HESS with the instrument-response functions of HAWC. The resulting HAWC spectral energy distribution, after subtracting these sources (see figure), seems to be compatible with the diffuse emission data points from HESS while still maintaining a power-law behaviour, with no signs of a cutoff and extending up to at least 114 TeV. This is the first detection of gamma rays at energies > 100 TeV from the galactic centre, thereby providing convincing evidence of the presence of a pevatron.
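The forward-folding step can be illustrated with a generic sketch: a reference spectral model is multiplied by the bin widths and folded through a response matrix to predict counts in reconstructed energy. Everything below – the power-law parameters, the response matrix and the exposure – is an invented placeholder standing in for the HESS best-fit model and the HAWC instrument-response functions.

```python
import numpy as np

# Generic forward-folding sketch of the subtraction procedure described above.
# All numbers are placeholders, not values from the HAWC or HESS analyses.

e_true = np.geomspace(1.0, 200.0, 40)               # true photon energy bins (TeV)
d_e = np.diff(np.append(e_true, 2 * e_true[-1] - e_true[-2]))

def power_law(e_tev, norm=1e-12, index=2.3):        # hypothetical flux model
    return norm * e_tev ** (-index)                 # photons / (cm^2 s TeV)

# Hypothetical response matrix: rows = reconstructed-energy bins, columns =
# true-energy bins; entries fold together effective area and energy migration.
rng = np.random.default_rng(0)
response = np.abs(rng.normal(1e9, 1e8, size=(30, 40)))

exposure_s = 7 * 365.25 * 86400 * 0.25              # ~7 years at partial duty cycle
expected_counts = response @ (power_law(e_true) * d_e) * exposure_s

# These predicted counts would then be subtracted from the observed counts
# before fitting the residual (diffuse) spectrum.
print(expected_counts[:5])
```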


Furthermore, the diffuse emission is spatially correlated with the morphology of the central molecular zone (CMZ) – a region in the innermost 500 pc of the galaxy consisting of enormous molecular clouds corresponding to around 60 million solar masses. Such a correlation supports a hadronic scenario for the origin of cosmic rays, where gamma rays are produced via the interaction of relativistic protons with the ambient gas. In the leptonic scenario, electrons with energies above 100 TeV produce gamma rays via inverse Compton scattering, but such electrons suffer severe radiative losses; for a magnetic field strength of 100 μG, the maximum distance that such electrons can traverse is much smaller than the CMZ. On the other hand, in the hadronic case the escape time for protons is orders of magnitude shorter than the cooling time (via π0 decay). The stronger magnetic field could confine them for a longer period but, as the authors argue, the escape time is also much smaller than the age of the galaxy, thereby pointing to a young source that is quasi-continuously injecting and accelerating protons into the CMZ.
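A rough synchrotron-only estimate (not taken from the paper) illustrates why the leptonic scenario struggles. For an electron of energy E and Lorentz factor γ in a magnetic field of energy density U_B, the cooling time is

\[
t_{\mathrm{cool}} \;\simeq\; \frac{E}{\tfrac{4}{3}\,\sigma_{\mathrm{T}}\,c\,\gamma^{2}\,U_{B}}
\;\sim\; 10\ \mathrm{yr}
\qquad \text{for } E = 100\ \mathrm{TeV},\ B = 100\ \mu\mathrm{G},
\]

so even free-streaming at the speed of light such electrons would cover only a few parsec – far short of the roughly 500 pc extent of the CMZ – and inverse-Compton losses shorten this further.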

The study also computes the energy density of cosmic-ray protons with energies above 100 TeV to be 8.1 × 10–3 eV/cm3. This is higher than the 1 × 10–3 eV/cm3 local measurement from the Alpha Magnetic Spectrometer in 2015, indicating the presence of newly accelerated protons in the energy range 0.1–1 PeV. The capabilities of this study did not extend to the identification of the source, but with better modelling of the CMZ in the future, and the improved performance of upcoming observatories such as CTAO and SWGO, candidate sites in the galactic centre are expected to be probed with much higher resolution.

Two charming results of data parking

CMS figure 1

The high data rate at the LHC creates challenges as well as opportunities. Great care is required to identify interesting events, as only a tiny fraction can trigger the detector’s readout. With the LHC achieving record-breaking instantaneous luminosity, the CMS collaboration has innovated to protect and expand its flavour-physics programme, which studies rare decays and subtle differences between particles containing beauty and charm quarks. Enhancements in the CMS data-taking strategy, such as “data parking”, have enabled the detector to surpass its initial performance limits. This has led to notable advances in charm physics, including CMS’s first analysis of CP violation in the charm sector and world-leading sensitivity to the rare decay of the D0 meson into a pair of muons.

Data parking stores subsets of unprocessed data that cannot be processed promptly due to computing limitations. By parking events triggered by a single muon, CMS collected an inclusive sample of approximately 10 billion b-hadrons in 2018. This sample allowed CMS to reconstruct D0 and D̄0 decays into a pair of long-lived K0S mesons, which are relatively easy to detect in the CMS detector despite the high level of pileup and the large number of low-momentum tracks.

CP violation is necessary to explain the matter–antimatter asymmetry observed in the universe, but the magnitude of CP violation from known sources is insufficient. Charmed meson decays are the only meson decays involving an up-type quark where CP violation can be studied. CP violation would be evident if the decay rates for D0 → K0S K0S and D̄0 → K0S K0S were found to differ. In the analysis, the flavour of the initial D0 or D̄0 meson is determined from the charge of the pion accompanying its creation in the decay of a D*+ meson (see figure 1). To eliminate systematic effects arising from the charge asymmetry in production and detector response, the CP asymmetry is measured relative to that in D0 → K0S π+π–. The resulting asymmetry is found to be ACP(K0S K0S) = 6.2% ± 3.0% (stat) ± 0.2% (syst) ± 0.8% (PDG), consistent with no CP violation within 2.0 standard deviations. Previous analyses by LHCb and Belle were consistent with no CP violation within 2.7 and 1.8 standard deviations, respectively. Before data parking, searching for direct CP violation in the charm sector with a fully hadronic final state was deemed unattainable for CMS.
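For reference, the asymmetry quoted here follows the conventional definition, extracted in practice as a difference with respect to the D0 → K0S π+π– control channel (hence the additional uncertainty labelled PDG, which reflects the external input for that channel):

\[
A_{CP}(K^0_S K^0_S) \;=\;
\frac{\Gamma(D^0 \to K^0_S K^0_S) \;-\; \Gamma(\overline{D}{}^{0} \to K^0_S K^0_S)}
{\Gamma(D^0 \to K^0_S K^0_S) \;+\; \Gamma(\overline{D}{}^{0} \to K^0_S K^0_S)}.
\]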

The CMS collaboration has expanded its flavour-physics programme

For Run 3 the programme was enhanced by introducing an inclusive dimuon trigger covering the low mass range up to 8.5 GeV. With improvements in the CMS Tier-0 prompt reconstruction workflow, Run-3 parking data is now reconstructed without delay using the former Run-2 high-level trigger farm at LHC Point 5 and European Tier-1 resources. In 2024 CMS is collecting data at rates seven times higher than the nominal rates for Run 2, already reaching approximately 70% of the nominal trigger rate for the HL-LHC.

Using the data collected in 2022 and 2023, CMS performed a search for the rare D0-meson decay into a pair of muons, which was presented at the ICHEP conference in Prague. Rare decays of the charm quark, less explored than those of the bottom quark, offer an opportunity to probe new-physics effects beyond the direct reach of current colliders, thanks to possible quantum interference from unknown heavy virtual particles. In 2023, the LHCb collaboration set an upper limit on the branching ratio of 3.5 × 10–9 at 95% confidence using Run-2 data. CMS surpassed the LHCb result, achieving a sensitivity of 2.6 × 10–9 at 95% confidence. Given that the Standard Model prediction is four orders of magnitude smaller, there is still considerable territory to explore.

Beginning with the 2024 run, the CMS flavour-physics programme will gain an additional data stream known as data scouting. This stream captures, in a reduced format, events triggered by new high-purity single-muon level-one triggers at very high rates. The reduced format is suitable for reconstructing decays of heavy hadrons, offering performance comparable to standard data processing.

Lattice calculations start to clarify muon g-2

In 1974, Kenneth G Wilson suggested modelling the continuous spacetime of quantum chromodynamics (QCD) with a discrete lattice – space and time would be represented as a grid of points, with quarks on the lattice points and gluons on the links between them. Lattice QCD has only grown in importance since, and international symposia on lattice field theory have taken place annually since 1984. The conference now furnishes an important forum for established experts and early-career researchers alike to report recent progress, and the published proceedings provide a valuable resource. The 41st symposium, Lattice 2024, welcomed 500 participants to the University of Liverpool from 28 July to 3 August.

Hadronic contributions

One of the highest-profile topics in lattice QCD is the evaluation of hadronic contributions to the magnetic moment of the muon. For many years, the experimental measurements from Brookhaven and Fermilab have appeared to be in tension with the Standard Model (SM), based on theoretical predictions that rely on data from e+e– annihilation to hadrons. Intense work on the lattice by multiple groups is now maturing rapidly and providing a valuable cross-check for data-driven SM calculations.

At the lowest order in quantum electrodynamics, the Dirac equation fixes the muon’s g-factor at precisely two (g = 2) – a contribution arising purely from the muon interacting with a single real external photon representing the magnetic field. At higher orders in QED, virtual Standard Model particles modify that value, leading to a so-called anomalous magnetic moment, g–2. The Schwinger term adds a virtual photon and a contribution to g–2 of approximately 0.2%. Adding individual virtual W, Z or Higgs bosons adds a well-defined contribution a factor of a million or so smaller. The remaining relevant contributions are from hadronic vacuum polarisation (HVP) and hadronic light-by-light (HLBL) scattering. HVP and HLBL both add hadronic contributions, integrated to all orders in the strong coupling constant, to the interaction between the muon and the external electromagnetic field, and these contributions also feature additional virtual photons. Though their contributions to g–2 are in the ballpark of the small electroweak contribution, they are more difficult to calculate, and they dominate the error budget for the SM prediction of the muon’s g–2.
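Spelling out the leading term makes the orders of magnitude explicit (this is the textbook Schwinger result, not a number reported at the conference):

\[
a_\mu \;\equiv\; \frac{g-2}{2}, \qquad
a_\mu^{\mathrm{LO\,QED}} \;=\; \frac{\alpha}{2\pi} \;\approx\; 1.16\times10^{-3},
\]

so g – 2 ≈ 2.3 × 10–3, the “approximately 0.2%” quoted above, while the electroweak and hadronic terms enter several orders of magnitude further down.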

Christine Davies (University of Glasgow) gave a comprehensive survey of muon g–2 that stressed several high-level points: the small HLBL contribution looks to be settled, and is unlikely to be a key piece of the puzzle; recent tensions among the e+e– experiments for HVP have emerged and need to be better understood; and in the most contentious region, all eight recent lattice-QCD calculations agree with each other and with the very recent e+e– → hadrons experiment CMD-3 (2024 Phys. Rev. Lett. 132 231903), though not so much with earlier experiments. Thus, lattice QCD and CMD-3 suggest there is “almost certainly less new physics in muon g–2 than previously hoped, and perhaps none,” said Davies. We shall see: many groups are preparing results for the full HVP, targeting a new whitepaper from the Muon g–2 Theory Initiative by the end of this year, in anticipation of the final measurement from the Fermilab experiment sometime in 2025.

New directions

While the main focus of lattice calculations is the study of QCD, lattice methods have been applied well beyond it. There is a small but active community investigating systems that could be relevant to physics beyond the Standard Model, including composite Higgs models, supersymmetry and dark matter. These studies often inspire formal “theoretical” developments that are of interest beyond the lattice community. Particularly exciting directions this year were developments on emergent phases and non-invertible symmetries, and their possible application to the formulation of chiral gauge theories – one of the outstanding theoretical issues in lattice gauge theories.


The lattice QCD community is one of the main users of high-performance computing resources, with its simulation efforts generating petabytes of Monte Carlo data. For more than 20 years, a community-wide effort, the International Lattice Data Grid (ILDG), has allowed this data to be shared. Since its inception, ILDG has implemented the FAIR principles – data should be findable, accessible, interoperable and reusable – almost in full. The lattice QCD community is now discussing open science. Ed Bennett (Swansea) led a panel discussion that explored the benefits of ILDG embracing open science, such as higher credibility for published results and, not least, the means to fulfil the expectations of funding bodies. Sustainably maintaining the infrastructure and employing the necessary personnel will require national or even international community efforts – both to convince funding agencies to provide corresponding funding lines and to convince researchers of the benefits of open science.

The Kenneth G. Wilson Award for Excellence in Lattice Field Theory was presented to Michael Wagman (Fermilab) for his lattice-QCD studies of noise reduction in nuclear systems, the structure of nuclei and transverse-momentum-dependent hadronic structure functions. Fifty years on from Wilson’s seminal paper, two of the field’s earliest contributors, John Kogut (US Department of Energy) and Jan Smit (University of Amsterdam), reminisced about the birth of the lattice in a special session chaired by Liverpool pioneer Chris Michael. Both speakers gave fascinating insights into a time when physics was extracted from a handful of small-volume gauge configurations, compared to hundreds of thousands today.

Lattice 2025 will take place at the Tata Institute of Fundamental Research in Mumbai, India, from 3 to 8 November 2025.
