Charting DESY’s future

How would you describe DESY’s scientific culture?

DESY is a large laboratory with just over 3000 employees. It was founded 65 years ago as an accelerator lab, and at its heart it remains one, though what we do with the accelerators has evolved over time. It is fully funded by Germany.

In particle physics, DESY has performed many important studies, for example to understand the charm quark following the November Revolution of 1974. The gluon was discovered here in the late 1970s. In the 1980s, DESY ran the first experiments to study B mesons, laying the groundwork for core programmes such as LHCb at CERN and the Belle II experiment in Japan. In the 1990s, the HERA accelerator focused on probing the structure of the proton, which, incidentally, was the subject of my PhD, and those results have been crucial for precision studies of the Higgs boson.

Over time, DESY has become much more than an accelerator and particle-physics lab. Even in the early days, it used what is called synchrotron radiation, the light emitted when electrons change direction in the accelerator. This light is incredibly useful for studying matter in detail. Today, our accelerators are used primarily for this purpose: they generate X-rays that image tiny structures, for example viruses.

DESY’s culture is shaped by its very engaged and loyal workforce. People often call themselves “DESYians” and strongly identify with the laboratory. At its heart, DESY is really an engineering lab. You need an amazing engineering workforce to be able to construct and operate these accelerators.

Which of DESY’s scientific achievements are you most proud of?

The discovery of the gluon is, of course, an incredible achievement, but actually I would say that DESY’s greatest accomplishment has been building so many cutting-edge accelerators: delivering them on time, within budget, and getting them to work as intended.

Take the PETRA accelerator, for example – an entirely new concept when it was first proposed in the 1970s. The decision to build it was made in 1975; construction was completed by 1978; and by 1979 the gluon was discovered. So in just four years, we went from approving a 2.3 km accelerator to making a fundamental discovery, something that is absolutely crucial to our understanding of the universe. That’s something I’m extremely proud of.

I’m also very proud of the European X-ray Free-Electron Laser (XFEL), completed in 2017 and now fully operational. Before that, in 2005, we launched FLASH, the world’s first soft X-ray free-electron laser, and of course in the 1990s HERA, another pioneering machine. Again and again, DESY has succeeded in building large, novel and highly valuable accelerators that have pushed the boundaries of science.

What can we look forward to during your time as chair?

We are working on 10 major projects over the next three years alone! PETRA III will be running until the end of 2029, but our goal is to move forward with PETRA IV, the world’s most advanced X-ray source. Securing funding for that first, and then building it, is one of my main objectives. In Germany, there’s a roadmap process, and by July this year we’ll know whether an independent committee has judged PETRA IV to be one of the highest-priority science projects in the country. If all goes well, we aim to begin operating PETRA IV in 2032.

Our FLASH soft X-ray facility is also being upgraded, and we plan to relaunch it in early September. That will allow us to serve more users and deliver better beam quality, increasing its impact.

In parallel, we’re contributing significantly to the HL-LHC upgrade. More than 100 people at DESY are working on building trackers for the ATLAS and CMS detectors, and parts of the forward calorimeter of CMS. That work needs to be completed by 2028.

Hunting axions

Astroparticle physics is another growing area for us. Over the next three years we’re completing telescopes for the Cherenkov Telescope Array and building detectors for the IceCube upgrade. For the first time, DESY is also constructing a space camera for the satellite UltraSat, which is expected to launch within the next three years.

At the Hamburg site, DESY is diving further into axion research. We’re currently running the ALPS II experiment, which has a fascinating “light shining through a wall” setup. Normally, of course, light can’t pass through something like a thick concrete wall. But in ALPS II, light inside a magnet can convert into an axion, a hypothetical dark-matter particle that can travel through matter almost unhindered. On the other side, another magnet converts the axion back into light. So, it appears as if the light has passed through the wall, when in fact it was briefly an axion. We started the experiment last year. As with most experiments, we began carefully, because not everything works at once, but two more major upgrades are planned in the next two years, and that’s when we expect ALPS II to reach its full scientific potential.

We’re also developing additional axion experiments. One of them, in collaboration with CERN, is called BabyIAXO. It’s designed to look for axions from the Sun, where you have both light and magnetic fields. We hope to start construction before the end of the decade.

Finally, DESY also has a strong and diverse theory group. Their work spans many areas, and it’s exciting to see what ideas will emerge from them over the coming years.

How does DESY collaborate with industry to deliver benefits to society?

We already collaborate quite a lot with industry. The beamlines at PETRA, in particular, are of strong interest. For example, BioNTech conducted some of its research for the COVID-19 vaccine here. We also have a close relationship with the Fraunhofer Society in Germany, which focuses on translating basic research into industrial applications. They famously developed the MP3 format, for instance. Our collaboration with them is quite structured, and there have also been several spinoffs and start-ups based on technology developed at DESY. Looking ahead, we want to significantly strengthen our ties with industry through PETRA IV. With much higher data rates and improved beam quality, it will be far easier to obtain results quickly. Our goal is for 10% of PETRA IV’s capacity to be dedicated to industrial use. Furthermore, we are developing a strong ecosystem for innovation on the campus and in the surrounding area, with DESY at its centre, called Science City Hamburg Bahrenfeld.

What’s your position on “dual use” research, which could have military applications?

The discussion around dual-use research is complicated. Personally, I find the term “dual use” a bit odd – almost any high-tech equipment can be used for both civilian and military purposes. Take the transistor, for example: it has countless applications, including military ones, but it wasn’t invented for that reason. At DESY, we’re currently having an internal discussion about whether, and under what conditions, we would take on targeted projects related to defence. There is a range of views within DESY, and I think that diversity of opinion is valuable. Some people are firmly against the idea, and I respect that. Honestly, it’s probably how I would have felt 10 or 20 years ago. But others believe DESY should play a role. Personally, I’m open to it.

If our expertise can help people defend themselves and our freedom in Europe, that’s something worth considering. Of course, I would love to live in a world without weapons, where no one attacks anyone. But if I were attacked, I’d want to be able to defend myself. I prefer to work on shields, not swords, like in Asterix and Obelix, but, of course, it’s never that simple. That’s why we’re taking time with this. It’s a complex and multifaceted issue, and we’re engaging with experts from peace and security research, as well as the social sciences, to help us understand all dimensions. I’ve already learned far more about this than I ever expected to. We hope to come to a decision on this later this year.

You are DESY’s first female chair. What barriers do you think still exist for women in physics, and how can institutions like DESY address them?

There are two main barriers, I think. The first is that, in my opinion, society at large still discourages girls from going into maths and science.

Certainly in Germany, if you stopped a hundred people on the street, I think most of them would still say that girls aren’t naturally good at maths and science. Of course, there are always exceptions: you do find great teachers and supportive parents who go against this narrative. I wouldn’t be here today if I hadn’t received that kind of encouragement.

That’s why it’s so important to actively counter those messages. Girls need encouragement from an early age, they need to be strengthened and supported. On the encouragement side, DESY is quite active. We run many outreach activities for schoolchildren, including a dedicated school lab. Every year, more than 13,000 school pupils visit our campus. We also take part in Germany’s “Zukunftstag”, where girls are encouraged to explore careers traditionally considered male-dominated, and boys do the same for fields seen as female-dominated.

The second challenge comes later, at a different career stage, and it has to do with family responsibilities. In many partnerships, family work still falls more heavily on women than on men. That imbalance can hold women back, particularly during the postdoc years, which tend to coincide with the time when many people are starting families. It’s a tough period, because it’s also when you’re trying to advance your career.

Workplaces like DESY can play a role in making this easier. We offer good childcare options, flexibility with home–office arrangements, and even shared leadership positions, which help make it more manageable to balance work and family life. We also have mentoring programmes. One example is dynaMENT, where female PhD students and postdocs are mentored by more senior professionals. I’ve taken part in that myself, and I think it’s incredibly valuable.

Do you have any advice for early-career women physicists?

If I could offer one more piece of advice, it’s about building a strong professional network. That’s something I’ve found truly valuable. I’m fortunate to have a fantastic international network, both male and female colleagues, including many women in leadership positions. It’s so important to have people you can talk to, who understand your challenges, and who might be in similar situations. So if you’re a student, I’d really recommend investing in your network. That’s very important, I think.

What are your personal reflections on the next-generation colliders?

Our generation has a responsibility to understand the electroweak scale and the Higgs boson. These questions have been around for almost 90 years, since 1935 when Hideki Yukawa explored the idea that forces might be mediated by the exchange of massive particles. While we’ve made progress, a true understanding is still out of reach. That’s what the next generation of machines is aiming to tackle.

The problem, of course, is cost. All the proposed solutions are expensive, and it is very challenging to secure investments for such large-scale projects, even though the return on investment from big science is typically excellent: these projects drive innovation, build high-tech capability and create a highly skilled workforce.

From a scientific point of view, the FCC is the most comprehensive option. As a Higgs factory, it offers a broad and strong programme to analyse the Higgs and electroweak gauge bosons. But who knows if we’ll be able to afford it? And it’s not just about money. The timeline and the risks also matter. The FCC feasibility report was just published and is still under review by an expert committee. I’d rather not comment further until I’ve seen the full information. I’m part of the European Strategy Group and we’ll publish a new report by the end of the year. Until then, I want to understand all the details before forming an opinion.

It’s good to have other options too. The muon collider is not yet as technically ready as the FCC or a linear collider, but it’s an exciting technology and could be the machine after next. Another option is plasma-wakefield acceleration, which we’re working on very actively at DESY. It could enable us to build high-energy colliders on a much smaller scale. This is something we’ll need, as we can’t keep building ever-larger machines forever. Investing in accelerator R&D to develop these next-generation technologies is crucial.

Still, I really hope there will be an intermediate machine in the near future, a Higgs factory that lets us properly explore the Higgs boson. There are still many mysteries there. I like to compare it to an egg: you have to crack it open to see what’s inside. And that’s what we need to do with the Higgs.

One thing that is becoming clearer to me is the growing importance of Europe. With the current uncertainties in the US, which are already affecting health and climate research, we can’t assume fundamental research will remain unaffected. That’s why Europe’s role is more vital than ever.

I think we need to build more collaborations between European labs. Sharing expertise, especially through staff exchanges, could be particularly valuable in engineering, where we need a huge number of highly skilled professionals to deliver billion-euro projects. We’ve got one coming up ourselves, and the technical expertise for that will be critical.

I believe science has a key role to play in strengthening Europe, not just culturally, but economically too. It’s an area where we can and should come together.

Clean di-pions reveal vector mesons

LHCb figure 1

Heavy-ion collisions usually have very high multiplicities due to colour flow and multiple nucleon interactions. However, when the ions are separated by more than about twice their radii in so-called ultra-peripheral collisions (UPC), electromagnetically induced interactions dominate. In these colour-neutral interactions the ions remain intact, and a central system with few particles is produced whose summed transverse momentum – being the Fourier transform of the transverse distance between the ions – is typically less than 100 MeV/c.
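A rough way to see that scale (an order-of-magnitude estimate, not a number taken from the LHCb analysis) is the uncertainty principle: for a transverse distance scale of order 10 fm, set by the nuclear size and the ion separation,

$$ p_T \sim \frac{\hbar c}{b} \approx \frac{197\ \mathrm{MeV\,fm}}{\mathcal{O}(10\ \mathrm{fm})} \approx \mathcal{O}(10\text{–}20)\ \mathrm{MeV}/c, $$

comfortably below the 100 MeV/c quoted above.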

In the photoproduction of vector mesons, a photon, radiated from one of the ions, fluctuates into a virtual vector meson long before it reaches the target and then interacts with one or more nucleons in the other ion. The production of ρ mesons has been measured at the LHC by ALICE in PbPb and XeXe collisions, while J/ψ mesons have been measured in PbPb collisions by ALICE, CMS and LHCb. Now, LHCb has isolated a precisely measured, high-statistics sample of di-pions with backgrounds below 1% in which several vector mesons are seen.

Figure 1 shows the invariant-mass distribution of the pions; the fit to the data requires contributions from the ρ meson, continuum ππ production, the ω meson and two higher-mass resonances at about 1.35 and 1.80 GeV, consistent with excited ρ mesons. The higher structure was also discernible in previous measurements by STAR and ALICE. Since its discovery in 1961, the ρ meson has proved challenging to describe because of its broad width and because of interference effects. More data in the di-pion channel, particularly when practically background-free down almost to the production threshold, are therefore welcome. These data may help with hadronic corrections to the prediction of the muon g-2: the dip-and-bump structure at high masses seen by LHCb is qualitatively similar to that observed by BaBar in e⁺e⁻ → π⁺π⁻ scattering (CERN Courier March/April 2025 p21). From the invariant-mass spectrum, LHCb has measured the cross-sections for the ρ, ω, ρ′ and ρ′′ as a function of rapidity in photoproduction on lead nuclei.
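The kind of lineshape model such a fit implies (an illustrative sketch; LHCb’s exact parametrisation is not reproduced here) coherently sums resonant amplitudes with a non-resonant di-pion continuum, so that interference terms appear in the squared modulus:

$$ \frac{\mathrm{d}\sigma}{\mathrm{d}m_{\pi\pi}} \propto \left| A_\rho + c_\omega e^{i\phi_\omega} A_\omega + c_{\rho'} e^{i\phi_{\rho'}} A_{\rho'} + c_{\rho''} e^{i\phi_{\rho''}} A_{\rho''} + B_{\pi\pi} \right|^2, $$

where the A_i are relativistic Breit–Wigner amplitudes evaluated at m_ππ, B_ππ is a continuum term, and the magnitudes c_i and phases φ_i are free parameters of the fit.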

Naively, photoproduction on the nucleus should simply scale with the number of nucleons relative to photoproduction on the proton, and can be calculated in the impulse approximation, which takes into account only the nuclear form factor and neglects all other potential nuclear effects.
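Schematically (assuming the usual notation of a nuclear form factor F_A(t) and the forward cross-section on the proton; this is textbook shorthand rather than the GKZ formalism discussed below), the impulse approximation for coherent production on a nucleus of mass number A reads

$$ \left.\frac{\mathrm{d}\sigma_{\gamma A\to\rho A}}{\mathrm{d}t}\right|_{t=0} \approx A^2 \left.\frac{\mathrm{d}\sigma_{\gamma p\to\rho p}}{\mathrm{d}t}\right|_{t=0}, \qquad \frac{\mathrm{d}\sigma_{\gamma A\to\rho A}}{\mathrm{d}t} \propto |F_A(t)|^2, $$

so any suppression beyond this form-factor scaling signals genuine nuclear effects.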

However, nuclear shadowing, caused by multiple interactions as the meson passes through the nucleus, leads to a suppression (CERN Courier January/February 2025 p31). In addition, there may be further non-linear QCD effects at play.

Elastic re-scattering is usually described through a Glauber calculation that takes account of multiple elastic scatters. This is extended in the GKZ model, which uses Gribov’s formalism to include inelastic scatters. The inset in figure 1 shows the measured differential cross-section for the ρ meson as a function of rapidity for LHCb data, compared to the GKZ prediction, to a prediction from the STARlight generator and to ALICE data at central rapidities. Additional suppression due to nuclear effects is observed beyond that predicted by GKZ.

European strategy update: the community speaks

Community input themes of the European Strategy process

The deadline for submitting inputs to the 2026 update of the European Strategy for Particle Physics (ESPP) passed on 31 March. A total of 263 submissions, ranging from individual to national perspectives, express the priorities of the high-energy physics community (see “Community inputs” figure). These inputs will be distilled by expert panels in preparation for an Open Symposium that will be held in Venice from 23 to 27 June (CERN Courier March/April 2025 p11).

Launched by the CERN Council in March 2024, the stated aim of the 2026 update to the ESPP is to develop a visionary and concrete plan that greatly advances human knowledge in fundamental physics, in particular through the realisation of the next flagship project at CERN. The community-wide process, which is due to submit recommendations to Council by the end of the year, is also expected to prioritise alternative options to be pursued if the preferred project turns out not to be feasible or competitive.

“We are heartened to see so many rich and varied contributions, in particular the national input and the various proposals for the next large-scale accelerator project at CERN,” says strategy secretary Karl Jakobs of the University of Freiburg, speaking on behalf of the European Strategy Group (ESG). “We thank everyone for their hard work and rigour.”

Two proposals for flagship colliders are at an advanced stage: a Future Circular Collider (FCC) and a Linear Collider Facility (LCF). As recommended in the 2020 strategy update, a feasibility study for the FCC was released on 31 March, describing a 91 km-circumference infrastructure that could host an electron–positron Higgs and electroweak factory followed by an energy-frontier hadron collider at a later stage. Inputs for an electron–positron LCF cover potential starting configurations based on Compact Linear Collider (CLIC) or International Linear Collider (ILC) technologies. It is proposed that the latter LCF could be upgraded using CLIC, Cool Copper Collider, plasma-wakefield or energy-recovery technologies and designs. Other proposals outline a muon collider and a possible plasma-wakefield collider, as well as potential “bridging” projects to a future flagship collider. Among the latter are LEP3 and LHeC, which would site an electron–positron and an electron–proton collider, respectively, in the existing LHC tunnel. For the LHeC, an additional energy-recovery linac would need to be added to CERN’s accelerator complex.

Future choices

In probing beyond the Standard Model and more deeply studying the Higgs boson and its electroweak domain, next-generation colliders will pick up where the High-Luminosity LHC (HL-LHC) leaves off. In a joint submission, the ATLAS and CMS collaborations presented physics projections which suggest that the HL-LHC will be able to: observe the H → µ⁺µ⁻ and H → Zγ decays of the Higgs boson; observe Standard Model di-Higgs production; and measure the Higgs boson’s trilinear self-coupling with a precision better than 30%. The joint document also highlights the need for further progress in high-precision theoretical calculations aligned with the demands of the HL-LHC and serves as important input to the discussion on the choice of a future collider at CERN.

Neutrinos and cosmic messengers, dark matter and the dark sector, strong interactions and flavour physics also attracted many inputs, allowing priorities in non-collider physics to complement collider programmes. Underpinning the community’s physics aspirations are numerous submissions in the categories of accelerator science and technology, detector instrumentation and computing. Progress in these technologies is vital for the realisation of a post-LHC collider, which was also reflected by the recommendation of the 2020 strategy update to define R&D roadmaps. The scientific and technical inputs will be reviewed by the Physics Preparatory Group (PPG), which will conduct comparative assessments of the scientific potential of various proposed projects against defined physics benchmarks.

Key to the ESPP 2026 update are 57 national and national-laboratory submissions, including some from outside Europe. Most identify the FCC as the preferred project to succeed the LHC. If the FCC is found to be unfeasible, many national communities propose that a linear collider at CERN should be pursued, while taking into account the global context: a 250 GeV linear collider may not be competitive if China decides to proceed with a Circular Electron Positron Collider at a comparable energy on the anticipated timescale, potentially motivating a higher energy electron–positron machine or a proton–proton collider instead.

Complex process

In its review, the ESG will take the physics reach of proposed colliders as well as other factors into account. This complex process will be undertaken by seven working groups, addressing: national inputs; diversity in European particle physics; project comparison; implementation of the strategy and deliverability of large projects; relations with other fields of physics; sustainability and environmental impact; public engagement, education, communication and social and career aspects for the next generation; and knowledge and technology transfer. “The ESG and the PPG have their work cut out and we look forward to further strong participation by the full community, in particular at the Open Symposium,” says Jakobs.

A briefing book prepared by the PPG based on the community input and discussions at the Open Symposium will be submitted to the ESG by the end of September for consideration during a five-day-long drafting session, which is scheduled to take place from 1 to 5 December. The CERN Council will then review the final ESG recommendations ahead of a special session to be held in Budapest in May 2026.

Machine learning in industry

Antoni Shtipliyski

In the past decade, machine learning has surged into every corner of industry, from travel and transport to healthcare and finance. For early-career researchers, who have spent their PhDs and postdocs coding, a job in machine learning may seem a natural next step.

“Scientists often study nature by attempting to capture the world around us in mathematical models and computer code,” says Antoni Shtipliyski, engineering manager at Skyscanner. “But that’s only one part of the story if the aim is to apply these models to large-scale research questions or business problems. A completely orthogonal set of challenges revolves around how people collaborate to build and operate these systems. That’s where the real work begins.”

Used to large-scale experiments and collaborative problem solving, particle physicists are uniquely well-equipped to step into machine-learning roles. Shtipliyski worked on upgrades for the level-1 trigger system of the CMS experiment at CERN, before leaving to lead the machine-learning operations team in one of the biggest travel companies in the world.

Effective mindset

“At CERN, building an experimental detector is just the first step,” says Shtipliyski. “To be useful, it needs to be operated effectively over a long period of time. That’s exactly the mindset needed in industry.”

During his time as a physicist, Shtipliyski gained multiple skills that continue to help him at work today, but there were also a number of other areas he had to develop to succeed in machine learning in industry. One critical gap in a physicist’s portfolio, he notes, is that many people interpret machine-learning careers as purely algorithmic development and model training.

“At Skyscanner, my team doesn’t build models directly,” he says. “We look after the platform used to push and serve machine-learning models to our users. We oversee the techno-social machine that delivers these models to travellers. That’s the part people underestimate, and where a lot of the challenges lie.”

An important step for physicists transitioning out of academia is to understand the entire lifecycle of a machine-learning project. This includes not only developing an algorithm, but also deploying it, monitoring its performance, adapting it to changing conditions and ensuring that it serves business or user needs.

“In practice, you often find new ways that machine-learning models surprise you,” says Shtipliyski. “So having flexibility and confidence that the evolved system still works is key. In physics we’re used to big experiments like CMS being designed 20 years before being built. By the time it’s operational, it’s adapted so much from the original spec. It’s no different with machine-learning systems.”

This ability to live with ambiguity and work through evolving systems is one of the strongest foundations physicists can bring. But large complex systems cannot be built alone, so companies will be looking for examples of soft skills: teamwork, collaboration, communication and leadership.

“Most people don’t emphasise these skills, but I found them to be among the most useful,” Shtipliyski says. “Learning to write and communicate yourself is incredibly powerful. Being able to clearly express what you’re doing and why you’re doing it, especially in high-trust environments, makes everything else easier. It’s something I also look for when I do hiring.”

Industry may not offer the same depth of exploration as academia, but it does offer something equally valuable: breadth, variety and a dynamic environment. Work evolves fast, deadlines come more readily and teams are constantly changing.

“In academia, things tend to move more slowly. You’re encouraged to go deep into one specific niche,” says Shtipliyski. “In industry, you often move faster and are sometimes more shallow. But if you can combine the depth of thought from academia with the breadth of experience from industry, that’s a winning combination.”

Applied skills

For physicists eyeing a career in machine learning, the best thing they can do is familiarise themselves with the tools and practices for building and deploying models. Show that you can take the skills developed in academia and apply them to other environments. This tells recruiters that you are willing to learn, and is a simple but effective way of demonstrating commitment to a project from start to finish, beyond your assigned work.

“People coming from physics or mathematics might want to spend more time on implementation,” says Shtipliyski. “Even if you follow a guided walkthrough online, or complete classes on Coursera, going through the whole process of implementing things from scratch teaches you a lot. This puts you in a position to reason about the big picture and shows employers your willingness to stretch yourself, to make trade-offs and to evaluate your work critically.”

A common misconception is that practising machine learning outside of academia is somehow less rigorous or less meaningful. But in many ways, it can be more demanding.

“Scientific development is often driven by arguments of beauty and robustness. In industry, there’s less patience for that,” he says. “You have to apply it to a real-world domain – finance, travel, healthcare. That domain shapes everything: your constraints, your models, even your ethics.”

Shtipliyski emphasises that the technical side of machine learning is only one half of the equation. The other half is organisational: helping teams work together, navigate constraints and build systems that evolve over time. Physicists would benefit from exploring different business domains to understand how machine learning is used in different contexts. For example, GDPR constraints make privacy a critical issue in healthcare and tech. Learning how government funding is distributed throughout each project, as well as understanding how to build a trusting relationship between the funding agencies and the team, is equally important.

“A lot of my day-to-day work is just passing information, helping people build a shared mental model,” he says. “Trust is earned by being vulnerable yourself, which allows others to be vulnerable in turn. Once that happens, you can solve almost any problem.”

Taking the lead

Particle physicists are used to working in high-stakes, international teams, so this collaborative mindset is ingrained in their training. But many may not have had the opportunity to lead, manage or take responsibility for an entire project from start to finish.

“In CMS, I did not have a lot of say due to the complexity and scale of the project, but I was able to make meaningful contributions in the validation and running of the detector,” says Shtipliyski. “But what I did not get much exposure to was the end-to-end experience, and that’s something employers really want to see.”

This does not mean you need to be a project manager to gain leadership experience. Early-career researchers can up-skill by mentoring a newcomer, proactively improving the team’s workflow, or networking with other physicists and thinking outside the box.

“Even if you just shadow an existing project, if you can talk confidently about what was done, why it was done and how it might be done differently – that’s huge.”

Many early-career researchers hesitate prior to leaving academia. They worry about making the “wrong” choice, or being labelled as a “finance person” or “tech person” as soon as they enter another industry. This is something Shtipliyski struggled to reckon with, but eventually realised that such labels do not define you.

“It was tough at CERN trying to anticipate what comes next,” he admits. “I thought that I could only have one first job. What if it’s the wrong one? But once a scientist, always a scientist. You carry your experiences with you.”

Shtipliyski quickly learnt that industry operates under a different set of rules: everyone comes from a different background, and levels of expertise differ from person to person. After facing intense imposter syndrome at CERN, where he shared spaces with world-leading experts, he found that industry offered a more level playing field.

“In academia, there’s a kind of ladder: the longer you stay, the better you get. In industry, it’s not like that,” says Shtipliyski. “You can be the dedicated expert in the room, even if you’re new. That feels really empowering.”

Industry rewards adaptability as much as expertise. For physicists stepping beyond academia, the challenge is not abandoning their training, but expanding it – learning to navigate ambiguity, communicate clearly and understand the full lifecycle of real-world systems. Harnessing a scientist’s natural curiosity, and demonstrating flexibility, allows the transition to become less about leaving science behind, and more about discovering new ways to apply it.

“You are the collection of your past experiences,” says Shtipliyski. “You have the freedom to shape the future.”

DESI hints at evolving dark energy

The dynamics of the universe depend on a delicate balance between gravitational attraction from matter and the repulsive effect of dark energy. A universe containing only matter would eventually slow down its expansion due to gravitational forces and possibly recollapse. However, observations of Type Ia supernovae in the late 1990s revealed that our universe’s expansion is in fact accelerating, requiring the introduction of dark energy. The standard cosmological model, called the Lambda Cold Dark Matter (ΛCDM) model, provides an elegant and robust explanation of cosmological observations by including normal matter, cold dark matter (CDM) and dark energy. It is the foundation of our current understanding of the universe.

Cosmological constant

In ΛCDM, Λ refers to the cosmological constant – a parameter introduced by Albert Einstein to counter the effect of gravity in his pursuit of a static universe. With the knowledge that the universe is accelerating, Λ is now used to quantify this acceleration. An important parameter that describes dark energy, and therefore influences the evolution of the universe, is its equation-of-state parameter, w. This value relates the pressure dark energy exerts on the universe, p, to its energy density, ρ, via p = wρ. Within ΛCDM, w is –1 and ρ is constant – a combination that has to date explained observations well. However, new results by the Dark Energy Spectroscopic Instrument (DESI) put these assumptions under increasing stress.
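For a constant equation of state (a standard textbook relation rather than anything specific to the DESI analysis), the dark-energy density evolves with the cosmic scale factor a as

$$ \rho_{\mathrm{DE}}(a) \propto a^{-3(1+w)}, $$

so w = –1 reproduces a constant, Λ-like density, while w < –1 would imply a density that grows as the universe expands.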

These new results are part of the second data release (DR2) from DESI. Mounted on the Nicholas U Mayall 4-metre telescope at Kitt Peak National Observatory in Arizona, DESI is optimised to measure the spectra of a large number of objects in the sky simultaneously. Joint observations are possible thanks to 5000 optical fibres controlled by robots, which continuously optimise the focal plane of the detector. Combined with a highly efficient processing pipeline, this allows DESI to build a catalogue of distance measurements of objects based on the shift in wavelength of their light, or redshift. For its first data release, DESI used 6 million such redshifts, allowing it to show that w was several sigma away from its expected value of –1 (CERN Courier May/June 2024 p11). For DR2, 14 million measurements are used, enough to provide strong hints of w changing with time.

The first studies of the expansion rate of the universe were based on redshift measurements of local objects, such as supernovae. As the objects are relatively close, they provide data on the acceleration at small redshifts. An alternative method is to use the cosmic microwave background (CMB), which allows for measurements of the evolution of the early universe through complex imprints left on the current distribution of the CMB. The significantly smaller expansion rate measured through the CMB compared to local measurements resulted in a “Hubble tension”, prompting novel measurements to resolve or explain the observed difference (CERN Courier March/April 2025 p28). One such attempt comes from DESI, which aims to provide a detailed 3D map of the universe focusing on the distance between galaxies to measure the expansion (see “3D map” figure).

Tension with ΛCDM

The 3D map produced by DESI can be used to study the evolution of the universe, as it holds imprints of small fluctuations in the density of the early universe. These density fluctuations have been studied through their imprint on the CMB; however, they also left imprints in the distribution of baryonic matter up until recombination. The variations in baryonic density grew over time into the varying densities of galaxies and other large-scale structures that are observed today.

The regions originally containing higher baryon densities are now those with larger densities of galaxies. Exactly how the matter-density fluctuations evolved into variations in galaxy densities throughout the universe depends on a range of parameters from the ΛCDM model, including w. The detailed map of the universe produced by DESI, which contains a range of objects with redshifts up to 2.5, can therefore be fitted against the ΛCDM model.

Among other studies, the latest DESI data were combined with CMB observations and fitted to the ΛCDM model. This worked relatively well, although it required a lower matter-density parameter than found from CMB data alone. However, the resulting cosmological parameters give a poor match to the supernova data. Similarly, fitting the ΛCDM model using the supernova data results in poor agreement with both the DESI and CMB data, putting some strain on the ΛCDM model. Things don’t get significantly better when adding some freedom to these analyses by allowing w to differ from –1.

An adaptation of the ΛCDM model that brings it into agreement with all three datasets requires w to evolve with redshift, or time. The implications for the acceleration of the universe are shown in the “Tension with ΛCDM” figure, which plots the deceleration parameter of the expansion of the universe as a function of redshift; q < 0 implies an accelerating universe. In the ΛCDM model, acceleration increases with time, as redshift approaches 0. The DESI data suggest that the acceleration of the universe started earlier, but is currently weaker than that predicted by ΛCDM.
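For reference (standard definitions assumed here, not quoted from the DESI papers), the deceleration parameter is

$$ q \equiv -\frac{\ddot{a}\,a}{\dot{a}^2} = \tfrac{1}{2}\sum_i \Omega_i\,(1+3w_i), $$

so for a flat universe with Ω_m ≈ 0.3 (w = 0) and Ω_DE ≈ 0.7 (w = –1) one finds q₀ ≈ 0.15 – 0.7 = –0.55, i.e. an accelerating expansion today.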

Although this model matches the data well, a theoretical explanation is difficult. In particular, the data implies that w(z) was below –1, which translates into an energy density that increases with the expansion; however, the energy density seems to have peaked at a redshift of 0.45 and is now decreasing.

Overall, the new data release provides significant evidence of a deviation from the ΛCDM model. The exact significance depends on the specific analysis and which data sets are combined, however, all such studies provide similar results. As no 5σ discrepancy is found yet, there is no reason to discard ΛCDM, though this could change with another two years of DESI data coming up, along with data from the European Euclid mission, Vera C Rubin Observatory, and the Nancy Grace Roman Space Telescope. Each will provide new insights into the expansion for various redshift periods.

FCC feasibility study complete

The final report of a detailed study investigating the technical and financial feasibility of a Future Circular Collider (FCC) at CERN was released on 31 March. Building on a conceptual design study conducted between 2014 and 2018, the three-volume report is authored by over 1400 scientists and engineers in more than 400 institutes worldwide, and covers aspects of the project ranging from civil engineering to socioeconomic impact. As recommended in the 2020 update to the European Strategy for Particle Physics (ESPP), it was completed in time to serve as an input to the ongoing 2026 update to the ESPP (see “European strategy update: the community speaks”).

The FCC is a proposed collider infrastructure that could succeed the LHC in the 2040s. Its scientific motivation stems from the discovery in 2012 of the final particle of the Standard Model (SM), the Higgs boson, with a mass of just 125 GeV, and the wealth of precision measurements and exploratory searches during 15 years of LHC operations that have excluded many signatures of new physics at the TeV scale. The report argues that the FCC is particularly well equipped to study the Higgs and associated electroweak sectors in detail and that it provides a broad and powerful exploratory tool that would push the limits of the unknown as far as possible.

The report describes how the FCC will seek to address key domains formulated in the 2013 and 2020 ESPP updates, including: mapping the properties of the Higgs and electroweak gauge bosons with accuracies orders of magnitude better than today to probe the processes that led to the emergence of the Brout–Englert–Higgs field’s nonzero vacuum expectation value; ensuring a comprehensive and accurate campaign of precision electroweak, quantum chromodynamics, flavour and top-quark measurements sensitive to tiny deviations from the SM, probing energy scales far beyond the direct kinematic reach; improving by orders of magnitude the sensitivity to rare and elusive phenomena at low energies, including the possible discovery of light particles with very small couplings such as those relevant to the search for dark matter; and increasing by at least an order of magnitude the direct discovery reach for new particles at the energy frontier.

The FCC research programme outlines two possible stages: an electron–positron collider (FCC-ee) running at several centre-of-mass energies to serve as a Higgs, electroweak and top-quark factory, followed at a later stage by a proton–proton collider (FCC-hh) operating at an unprecedented collision energy. An FCC-ee with four detectors is judged to be “the electroweak, Higgs and top factory project with the highest luminosity proposed to date”, able to produce 6 × 10¹² Z bosons, 2.4 × 10⁸ W pairs, almost 3 × 10⁶ Higgs bosons and 2 × 10⁶ top-quark pairs over 15 years of operations. Its versatile RF system would enable flexibility in the running sequence, states the report, allowing experimenters to move between physics programmes and scan through energies at ease. The report also outlines how the FCC-ee injector offers opportunities for other branches of science, including the production of spatially coherent photon beams with a brightness several orders of magnitude higher than any existing or planned light source.

The estimated cost of the construction of the FCC-ee is CHF 15.3 billion. This investment, which would be distributed over a period of about 15 years starting from the early 2030s, includes civil engineering, technical infrastructure, electron and positron accelerators, and four detectors.

Ready for construction

The report describes how key FCC-ee design approaches, such as a double-ring layout, top-up injection with a full-energy booster, a crab-waist collision scheme and precise energy calibration, have been demonstrated at several previous or presently operating colliders. The FCC-ee is thus “technically ready for construction” and is projected to deliver four to five orders of magnitude higher luminosity per unit electrical power than LEP. During operation, its energy consumption is estimated to vary from 1.1 to 1.8 TWh/y depending on the operation mode, compared to CERN’s current consumption of about 1.3 TWh/y. Decarbonised electricity, including an ever-growing contribution from renewable sources, would be the main source of energy for the FCC. Ongoing technology R&D aims at further increasing FCC-ee’s energy efficiency (see “Powering into the future”).

Assuming 14 T Nb₃Sn magnet technology as a baseline design, a subsequent hadron collider with a centre-of-mass energy of 85 TeV entering operation in the early 2070s would extend the energy frontier by a factor of six and provide an integrated luminosity five to 10 times higher than that of the HL-LHC during 25 years of operation. With four detectors, FCC-hh would increase the mass reach of direct searches for new particles to several tens of TeV, probing a broad spectrum of beyond-the-SM theories and potentially identifying the sources of any deviations found in precision measurements at FCC-ee, especially those involving the Higgs boson. An estimated sample of more than 20 billion Higgs bosons would allow the absolute determination of its couplings to muons, to photons, to the top quark and to Zγ below the percent level, while di-Higgs production would bring the uncertainty on the Higgs self-coupling below the 5% level. FCC-hh would also significantly advance understanding of the hot QCD medium by enabling lead–lead and other heavy-ion collisions at unprecedented energies, and could be configured to provide electron–proton and electron–ion collisions, says the report.

The FCC-hh design is based on LHC experience and would leverage a substantial amount of the technical infrastructure built for the first FCC stage. Two hadron injector options are under study, involving a superconducting machine in either the LHC or the SPS tunnel. For the purpose of a technical feasibility analysis, a reference scenario based on 14 T Nb₃Sn magnets cooled to 1.9 K was considered, yielding 2.4 MW of synchrotron radiation and a power consumption of 360 MW, or 2.3 TWh/y – comparable to that of FCC-ee.

FCC-hh’s power consumption might be reduced below 300 MW if the magnet temperature can be raised to 4.5 K. Outlining the potential use of high-temperature superconductors for 14 to 20 T dipole magnets operating at temperatures between 4.5 K and 20 K, the report notes that such technology could either extend the centre-of-mass energy of FCC-hh to 120 TeV or lead to significantly improved operational sustainability at the same collision energy. “The time window of more than 25 years opened by the lepton-collider stage is long enough to bring that technology to market maturity,” says FCC study leader Michael Benedikt (CERN). “High-temperature superconductors have significant potential for industrial and societal applications, and particle accelerators can serve as pilots for market uptake, as was the case with the Tevatron and the LHC for NbTi technology.”

Society and sustainability

The report details the concepts and paths to keep the FCC’s environmental footprint low while boosting new technologies to benefit society and developing territorial synergies such as energy reuse. The civil construction process for FCC-ee, which would also serve FCC-hh, is estimated to result in about 500,000 tCO₂(eq) over a period of 10 years, which the authors say corresponds to approximately one-third of the carbon budget of the Paris Olympic Games. A socio-economic impact assessment of the FCC integrating environmental aspects throughout its entire lifecycle reveals a positive cost–benefit ratio, even under conservative assumptions and adverse implementation conditions.

A major achievement of the FCC feasibility study has been the development of the layout and placement of the collider ring and related infrastructure, which have been optimised for scientific benefit while taking into account territorial compatibility, environmental and construction constraints, and cost. No fewer than 100 scenarios were developed and analysed before settling on the preferred option: a ring circumference of 90.7 km with shaft depths ranging between 200 and 400 m, with eight surface sites and four experiments. Throughout the study, CERN has been accompanied by its host states, France and Switzerland, working with entities at the local, regional and national levels to ensure a constructive dialogue with territorial stakeholders.

The final report of the FCC feasibility study together with numerous referenced technical documents have been submitted to the ongoing ESPP 2026 update, along with studies of alternative projects proposed by the community. The CERN Council may take a decision around 2028.

“After four years of effort, perseverance and creativity, the FCC feasibility study was concluded on 31 March 2025,” says Benedikt. “The actual journey towards the realisation of the FCC starts now and promises to be at least as fascinating as the successive steps that brought us to the present state.”

Gravitational remnants in the sky

Astrophysical gravitational waves have revolutionised astronomy; the eventual detection of cosmological gravitons promises to open an otherwise inaccessible window into the universe’s earliest moments. Such a discovery would offer profound insights into the hidden corners of the early universe and physics beyond the Standard Model. Relic Gravitons, by Massimo Giovannini of INFN Milan Bicocca, offers a timely and authoritative guide to the most exciting frontiers in modern cosmology and particle physics.

Giovannini is an esteemed scholar and household name in the fields of theoretical cosmology and early-universe physics. He has written influential research papers, reviews and books on cosmology, providing detailed discussions on several aspects of the early universe. He also authored 2008’s A Primer on the Physics of the Cosmic Microwave Background – a book most cosmologists are very familiar with.

In Relic Gravitons, Giovannini provides a comprehensive exploration of recent developments in the field, striking a remarkable balance between clarity, physical intuition and rigorous mathematical formalism. As such, it serves as an excellent reference – equally valuable for both junior researchers and seasoned experts seeking depth and insight into theoretical cosmology and particle physics.

Relic Gravitons opens with an overview of cosmological gravitons, offering a broad perspective on gravitational waves across different scales and cosmological epochs, while drawing parallels with the electromagnetic spectrum. This graceful introduction sets the stage for a well-contextualised and structured discussion.

Gravitational rainbow

Relic gravitational waves from the early universe span 30 orders of magnitude, from attohertz to gigahertz. Their wavelengths are constrained from above by the Hubble radius, setting a lower frequency bound of 10⁻¹⁸ Hz. At the lowest frequencies, measurements of the cosmic microwave background (CMB) provide the most sensitive probe of gravitational waves. In the nanohertz range, pulsar timing arrays serve as powerful astrophysical detectors. At intermediate frequencies, laser and atomic interferometers are actively probing the spectrum. At higher frequencies, only wide-band interferometers such as LIGO and Virgo currently operate, primarily within the audio band spanning from a few hertz to several kilohertz.
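The lower bound follows from a simple estimate (an order-of-magnitude check, not a calculation from the book): a relic graviton’s wavelength cannot exceed the Hubble radius R_H = c/H₀, so

$$ f_{\min} \sim \frac{c}{R_H} = H_0 \approx 2.3 \times 10^{-18}\ \mathrm{s^{-1}} \approx 10^{-18}\ \mathrm{Hz}, $$

taking H₀ ≈ 70 km s⁻¹ Mpc⁻¹.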

Relic Gravitons

The theoretical foundation begins with a clear and accessible introduction to tensor modes in flat spacetime, followed by spherical harmonics and polarisations. With these basics in place, tensor modes in curved spacetime are also explored, before progressing to effective action, the quantum mechanics of relic gravitons and effective energy density. This structured progression builds a solid framework for phenomenological applications.

The second part of the book covers the signals of the concordance paradigm, including discussions of Sakharov oscillations and of short, intermediate and long wavelengths, before the technical interludes of the next section. Here, Giovannini emphasises that, because the evolution of the comoving Hubble radius is uncertain, the spectral energy density and other observables require approximate methods. The chapter expands to include conventional results using the Wentzel–Kramers–Brillouin approach, which is particularly useful when early-universe dynamics deviate from standard inflation.

Phenomenological implications are discussed in the final section, starting with the low-frequency branch. Giovannini then examines the intermediate- and high-frequency ranges. The concordance paradigm suggests that large-scale inhomogeneities originate from quantum mechanics, with travelling waves transformed into standing waves. The penultimate chapter addresses the hot topic of the “quantumness” of relic gravitons before the conclusion. The book finishes with five appendices covering a range of useful material, from notation to background topics in general relativity and cosmological perturbations.

Relic Gravitons is a must-read for anyone intrigued by the gravitational-wave background and its unparalleled potential to unveil new physics, and an invaluable resource for those seeking to explore the unknown corners of particle physics and cosmology.

Colour information diffuses in Frankfurt

Quark Matter 2025

The 31st Quark Matter conference took place from 6 to 12 April at Goethe University in Frankfurt, Germany. This edition of the world’s flagship conference for ultra-relativistic heavy-ion physics was the best attended in the series’ history, with more than 1000 participants.

A host of experimental measurements and theoretical calculations targeted fundamental questions in many-body QCD. These included the search for a critical point along the QCD phase diagram, the extraction of the properties of the deconfined quark–gluon plasma (QGP) medium created in heavy-ion collisions, and the search for signatures of the formation of this deconfined medium in smaller collision systems.

Probing thermalisation

New results highlighted the ability of the strong force to thermalise the out-of-equilibrium QCD matter produced during the collisions. Thermalisation can be probed by taking advantage of spatial anisotropies in the initial collision geometry which, due to the rapid onset of strong interactions at early times, result in pressure gradients across the system. These pressure gradients in turn translate into a momentum-space anisotropy of produced particles in the bulk, which can be experimentally measured by taking a Fourier transform of the azimuthal distribution of final-state particles with respect to a reference event axis.
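To make the Fourier-decomposition idea concrete, the sketch below (a toy illustration, not analysis code from ALICE, CMS or any other experiment) estimates the second-order coefficient v₂ = ⟨cos 2(φ − Ψ)⟩ from azimuthal angles generated with a 10% elliptic modulation relative to a known event plane:

```python
import numpy as np

def flow_coefficient(phi, psi, n=2):
    """Estimate the n-th order flow coefficient v_n as the average of
    cos(n * (phi - psi)), where psi is the event-plane angle."""
    return np.mean(np.cos(n * (phi - psi)))

# Toy sample: particles drawn from dN/dphi ∝ 1 + 2*0.10*cos(2*phi)
# via accept-reject sampling (event-plane angle taken to be zero).
rng = np.random.default_rng(0)
phi = rng.uniform(0.0, 2.0 * np.pi, 200_000)
accept = rng.uniform(0.0, 1.3, phi.size) < 1.0 + 2.0 * 0.10 * np.cos(2.0 * phi)

v2 = flow_coefficient(phi[accept], psi=0.0, n=2)
print(f"v2 = {v2:.3f}")  # expect roughly 0.10
```

In a real analysis the event-plane angle must itself be estimated from the data (or replaced by multi-particle correlation methods), which this toy deliberately sidesteps.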

An area of active experimental and theoretical interest is to quantify the degree to which heavy quarks, such as charm and beauty, participate in this collective behaviour, which provides information on the diffusion properties of the medium. The ALICE collaboration presented the first measurement of the second-order coefficient of the momentum anisotropy of charm baryons in Pb–Pb collisions, showing significant collective behaviour and suggesting that charm quarks undergo some degree of thermalisation. This collective behaviour appears to be stronger in charm baryons than in charm mesons, following similar observations for light flavour.

Due to the nature of thermalisation and the long hydrodynamic phase of the medium in Pb–Pb collisions, signatures of the microscopic dynamics giving rise to the thermalisation are often washed out in bulk observables. However, local excitations of the hydrodynamic medium, caused by the propagation of a high-energy jet through the QGP, can offer a window into such dynamics. Due to coupling to the coloured medium, the jet loses energy to the QGP, which in turn re-excites the thermalised medium. These excited states quickly decay and dissipate, and the local perturbation can partially thermalise. This results in a correlated response of the medium in the direction of the propagating jet, the distribution of which allows measurement of the thermalisation properties of the medium in a more controlled manner.

In this direction, the CMS collaboration presented the first measurement of an event-wise two-point energy–energy correlator, for events containing a Z boson, in both pp and Pb–Pb collisions. The two-point correlator represents the energy-weighted cross section of the angle between particle pairs in the event and can separate out QCD effects at different scales, as these populate different regions in angular phase space. In particular, the correlated response of the medium is expected to appear at large angles in the correlator in Pb–Pb collisions.
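Schematically (a generic form of the observable; the exact weighting and normalisation used by CMS may differ), the two-point correlator bins particle pairs by their angular separation Δ, weighted by the product of their energies:

$$ \mathrm{EEC}(\Delta) \propto \sum_{\mathrm{events}} \sum_{i \neq j} \frac{E_i E_j}{\left(\sum_k E_k\right)^2}\, \delta\!\left(\Delta - \Delta_{ij}\right). $$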

The use of a colourless Z boson, which does not interact with the QGP, allows CMS to compare events with similar initial virtuality scales in pp and Pb–Pb collisions, without incurring biases due to energy loss in the QCD probes. The collaboration showed modifications of the two-point correlator at large angles from pp to Pb–Pb collisions, pointing to a possible signature of the correlated response of the medium to the traversing jets. Such measurements can help guide models towards capturing the relevant physical processes underpinning the diffusion of colour information in the medium.

Looking to the future

The next edition of this conference series will take place in 2027 in Jeju, South Korea, and the new results presented there should notably include the latest complement of results from the upgraded Run 3 detectors at the LHC and the newly commissioned sPHENIX detector at RHIC. New collision systems like O–O at the LHC will help shed light on many of the properties of the QGP, including its thermalisation, by varying the lifetime of the pre-equilibrium and hydrodynamic phases in the collision evolution.

PhyStat turns 25

Confidence intervals

On 16 January, physicists and statisticians met in the CERN Council Chamber to celebrate 25 years of the PhyStat series of conferences, workshops and seminars, which bring together physicists, statisticians and scientists from related fields to discuss, develop and disseminate methods for statistical data analysis and machine learning.

The special symposium heard from the founder and primary organiser of the PhyStat series, Louis Lyons (Imperial College London and University of Oxford), who together with Fred James and Yves Perrin initiated the movement with the “Workshop on Confidence Limits” in January 2000. According to Lyons, the aim of the series was to bring together physicists and statisticians, a philosophy that has been followed and extended throughout the 22 PhyStat workshops and conferences, as well as numerous seminars and “informal reviews”. Speakers called attention to the field’s recognition in the Royal Statistical Society’s pictorial timeline of statistics, which starts with the use of averages by Hippias of Elis in 450 BC and culminates with the 2012 discovery of the Higgs boson with 5σ significance.

Lyons and Bob Cousins (UCLA) offered their views on the evolution of statistical practice in high-energy physics, starting in the 1960s bubble-chamber era, strongly influenced by the 1971 book Statistical Methods in Experimental Physics by W T Eadie et al., its 2006 second edition by symposium participant Fred James (CERN), as well as Statistics for Nuclear and Particle Physics (1985) by Louis Lyons – reportedly the most stolen book from the CERN library. Both Lyons and Cousins noted the interest of the PhyStat community not only in practical solutions to concrete problems but also in foundational questions in statistics, with the focus on frequentist methods setting high-energy physics somewhat apart from the Bayesian approach more widely used in astrophysics.

Giving his view of the PhyStat era, ATLAS physicist and director of the University of Wisconsin Data Science Institute Kyle Cranmer emphasised the enormous impact that PhyStat has had on the field, noting important milestones such as the ability to publish full likelihood models through the statistical package RooStats, the treatment of systematic uncertainties with profile-likelihood ratio analyses, methods for combining analyses, and the reuse of published analyses to place constraints on new physics models. In regards to the next 25 years, Cranmer predicted the increasing use of methods that have emerged from PhyStat, such as simulation-based inference, and pointed out that artificial intelligence (the elephant in the room) could drastically alter how we use statistics.

Statistician Mikael Kuusela (CMU) noted that PhyStat workshops have provided important two-way communication between the physics and statistics communities, citing simulation-based inference as an example where many key ideas were first developed in physics and later adopted by statisticians. In his view, the use of statistics in particle physics has emerged as “phystatistics”, a proper subfield with distinct problems and methods.

Another important feature of the PhyStat movement has been to encourage active participation and leadership by younger members of the community. With its 25th anniversary, the torch is now passed from Louis Lyons to Olaf Behnke (DESY), Lydia Brenner (NIKHEF) and a younger team, who will guide PhyStat into the next 25 years and beyond.

Gaseous detectors school at CERN

How do wire-based detectors compare to resistive-plate chambers? How well do micropattern gaseous detectors perform? Which gas mixtures optimise operation? How will detectors face the challenges of future more powerful accelerators?

Thirty-two students attended the first DRD1 Gaseous Detectors School at CERN last November. The EP-DT Gas Detectors Development (GDD) lab hosted academic lectures and varied hands-on laboratory exercises. Students assembled their own detectors, learnt about their operating characteristics and explored radiation-imaging methods with state-of-the-art readout approaches – all under the instruction of more than 40 distinguished lecturers and tutors, including renowned scientists, pioneers of innovative technologies and emerging experts.

DRD1 is a new worldwide collaborative framework of more than 170 institutes focused on R&D for gaseous detectors. The collaboration focuses on knowledge sharing and scientific exchange, in addition to the development of novel gaseous detector technologies to address the needs of future experiments. This instrumentation school, initiated in DRD1’s first year, marks the start of a series of regular training events for young researchers that will also serve to exchange ideas between research groups and encourage collaboration.

The school will take place annually, with future editions hosted at different DRD1 member institutes to reach students from a number of regions and communities.
