
Double plasma progress at DESY

What if, instead of using tonnes of metal to accelerate electrons, they were to “surf” on a wave of charge displacements in a plasma? This question, posed in 1979 by Toshiki Tajima and John Dawson, planted the seed for plasma wakefield acceleration (PWA). Scientists at DESY now report some of the first signs that PWA is ready to compete with traditional accelerators at low energies. The results tackle two of the biggest challenges in PWA: beam quality and bunch rate.

“We have made great progress in the field of plasma acceleration,” says Andreas Maier, DESY’s lead scientist for plasma acceleration, “but this is an endeavour that has only just started, and we still have a bit of homework to do to get the system integrated with the injector complexes of a synchrotron, which is our final goal.”

Riding a wave

PWA has the potential to radically miniaturise particle accelerators. Plasma waves are generated when a laser pulse or particle beam ploughs through a millimetres-long hydrogen-filled capillary, displacing electrons and creating a wake of alternating positive and negative charge regions behind it. The process is akin to flotsam and jetsam being accelerated in the wake of a speedboat, and the plasma “wakefields” can be thousands of times stronger than the electric fields in conventional accelerators, allowing particles to gain hundreds of MeV in just a few millimetres. But beam quality and intensity are significant challenges in such narrow confines.
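The scale argument can be checked with simple arithmetic. The sketch below assumes representative, order-of-magnitude values (a 30 MV/m conventional RF gradient, a 50 GV/m plasma gradient and a 200 MeV energy gain) — illustrative numbers, not figures quoted by DESY:

```python
# Illustrative comparison of accelerator lengths for the same energy gain.
# All values are assumed, order-of-magnitude numbers.
conventional_gradient = 30e6   # V/m, typical RF-cavity gradient (assumed)
plasma_gradient = 50e9         # V/m, representative laser-plasma wakefield (assumed)
energy_gain = 200e6            # eV, i.e. "hundreds of MeV"

length_rf = energy_gain / conventional_gradient      # metres of RF structure needed
length_plasma = energy_gain / plasma_gradient        # metres of plasma needed

print(f"RF structure length: {length_rf:.1f} m")
print(f"plasma cell length:  {length_plasma * 1e3:.1f} mm")
print(f"gradient ratio:      {plasma_gradient / conventional_gradient:.0f}x")
```

With these assumed numbers, the same 200 MeV gain requires several metres of RF structure but only about 4 mm of plasma, consistent with gradients over a thousand times stronger.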

In a first study, a team from the LUX experiment at DESY and the University of Hamburg demonstrated, for the first time, a two-stage correction system to dramatically reduce the energy spread of accelerated electron beams. The first stage stretches the longitudinal extent of the beam from a few femtoseconds to several picoseconds using a series of four zigzagging bending magnets called a magnetic chicane. Next, a radio-frequency cavity reduces the energy variation to below 0.1%, bringing the beam quality in line with conventional accelerators.

“We basically trade beam current for energy stability,” explains Paul Winkler, lead author of a recent publication on active energy compression. “But for the intended application of a synchrotron injector, we would need to stretch the electron bunches anyway. As a result, we achieved performance levels so far only associated with conventional accelerators.”
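The chicane-plus-cavity scheme can be illustrated with a toy longitudinal phase-space model: the chicane's momentum-dependent path length (an R56 term) converts energy deviation into arrival-time spread, and an idealised RF kick at the zero crossing then removes the now time-correlated energy deviation. Every number below (bunch length, energy spread, R56, and the perfectly linear kick) is an assumption for illustration, not a LUX parameter:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
t = rng.normal(0.0, 3e-15, n)        # arrival times: a few-fs bunch (assumed)
delta = rng.normal(0.0, 0.01, n)     # relative energy deviation, ~1% spread (assumed)

# Stage 1: magnetic chicane — path length depends on energy, stretching
# the bunch from femtoseconds to picoseconds (R56 chosen for illustration).
R56 = 1e-10                          # seconds of delay per unit delta (assumed)
t_stretched = t + R56 * delta        # now ~ps long and strongly time-energy correlated

# Stage 2: RF cavity at the zero crossing — remove the correlated energy chirp
# with an idealised linear kick matched to the measured time-energy slope.
slope = np.polyfit(t_stretched, delta, 1)[0]
delta_out = delta - slope * t_stretched

print(f"bunch length:  {t.std() * 1e15:.1f} fs -> {t_stretched.std() * 1e12:.1f} ps")
print(f"energy spread: {delta.std():.2%} -> {delta_out.std():.4%}")
```

In this idealised model the residual spread is set by the initial bunch length divided by R56, which is the sense in which the scheme trades beam current for energy stability: a longer, lower-current bunch permits a smaller final energy spread.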

But producing high-quality beams is only half the battle. To make laser-driven PWA a practical proposition, bunches must be accelerated not just once a second, like at LUX, but hundreds or thousands of times per second. This has now been demonstrated by KALDERA, DESY’s new high-power laser system (see “Beam quality and bunch rate” image).

“Already, on the first try, we were able to accelerate 100 electron bunches per second,” says principal investigator Manuel Kirchen, who emphasises the complementarity of the two advances. The team now plans to scale up the energy and deploy “active stabilisation” to improve beam quality. “The next major goal is to demonstrate that we can continuously run the plasma accelerators with high stability,” he says.

With the exception of CERN’s AWAKE experiment (CERN Courier May/June 2024 p25), almost all plasma-wakefield accelerators are designed with medical or industrial applications in mind. Medical applications are particularly promising as they require lower beam energies and place less demanding constraints on beam quality. Advances such as those reported by LUX and KALDERA raise confidence in this new technology and could eventually open the door to cheaper and more portable X-ray equipment, allowing medical imaging and cancer therapy to take place in university labs and hospitals.

Plotting the discovery of Higgs pairs on Elba

Precise measurements of the Higgs self-coupling and its effects on the Higgs potential will play a key role in testing the validity of the Standard Model (SM). Some 150 physicists discussed the required experimental and theoretical manoeuvres on the serene island of Elba from 11 to 17 May at the Higgs Pairs 2025 workshop.

The conference mixed updates on theoretical developments in Higgs-boson pair production, searches for new physics in the scalar sector, and the most recent results from Run 2 and Run 3 of the LHC. Among the highlights was the first Run 3 analysis released by ATLAS on the search for di-Higgs production in the bbγγ final state – a particularly sensitive channel for probing the Higgs self-coupling. This result builds on earlier Run 2 analyses and demonstrates significantly improved sensitivity, now comparable to the full Run 2 combination of all channels. These gains were driven by the use of new b-tagging algorithms, improved mass resolution through updated analysis techniques, and the availability of nearly twice the dataset.

Complementing this, CMS presented the first search for ttHH production – a rare process that would provide additional sensitivity to the Higgs self-coupling and Higgs–top interactions. Alongside this, ATLAS presented first experimental searches for triple Higgs boson production (HHH), one of the rarest processes predicted by the SM. Work on more traditional final states such as bbττ and bbbb is ongoing at both experiments, and continues to benefit from improved reconstruction techniques and larger datasets. 

Beyond current data, the workshop featured discussions of the latest combined projection study by ATLAS and CMS, prepared as part of the input to the upcoming European Strategy Update. It extrapolates results of the Run 2 analyses to expected conditions of the High-Luminosity LHC (HL-LHC), estimating future sensitivities to the Higgs self-coupling and di-Higgs cross-section in scenarios with vastly higher luminosity and upgraded detectors. Under these assumptions, the combined sensitivity of ATLAS and CMS to di-Higgs production is projected to reach a significance of 7.6σ, firmly establishing the process. 

These projections provide crucial input for analysis strategy planning and detector design for the next phase of operations at the HL-LHC. Beyond the HL-LHC, efforts are already underway to design experiments at future colliders that will enhance sensitivity to the production of Higgs pairs, and offer new insights into electroweak symmetry breaking.

New frontiers in science in the era of AI

At a time when artificial intelligence is more buzzword than substance in many corners of public discourse, New Frontiers in Science in the Era of AI arrives with a clear mission: to contextualise AI within the long arc of scientific thought and current research frontiers. This book is not another breathless ode to ChatGPT or deep learning, nor a dry compilation of technical papers. Instead, it’s a broad and ambitious survey, spanning particle physics, evolutionary biology, neuroscience and AI ethics, that seeks to make sense of how emerging technologies are reshaping not only the sciences but knowledge and society more broadly.

The book’s chapters, written by established researchers from diverse fields, aim to avoid jargon and draw in non-specialists without compromising depth. The book offers insight into how physics remains foundational across scientific domains, and considers the social, ethical and philosophical implications of AI-driven science.

The first section, “New Physics World”, will be the most familiar terrain for physicists. Ugo Moschella’s essay, “What Are Things Made of? The History of Particles from Thales to Higgs”, opens with a sweeping yet grounded narrative of how metaphysical questions have persisted alongside empirical discoveries. He draws a bold parallel between the ancient idea of mass emerging from a cosmic vortex and the Higgs mechanism, a poetic analogy that holds surprising resonance. Thales, who lived roughly from 624 to 545 BCE, proposed that water is the fundamental substance out of which all others are formed. Following his lead, Pythagoras and Empedocles added three more items to complete the list of the elements: earth, air and fire. Aristotle added a fifth element: the “aether”. The physical foundation of the standard cosmological model of the ancient world is then rooted in the Aristotelian conceptions of movement and gravity, argues Moschella. His essay lays the groundwork for later chapters that explore entanglement, computation and the transition from thought experiments to quantum technology and AI.

A broad and ambitious survey spanning particle physics, evolutionary biology, neuroscience and AI ethics

The second and third sections venture into evolutionary genetics, epigenetics (the study of heritable changes in gene expression) and neuroscience – areas more peripheral to physics, but timely nonetheless. Contributions by Eva Jablonka, an evolutionary theorist and geneticist from Tel Aviv University, and Telmo Pievani, a biologist from the University of Padua, explore the biological implications of gene editing, environmental inheritance and self-directed evolution, as well as the ever-blurring boundaries between what is considered “natural” versus “artificial”. The authors propose that the human ability to edit genes is itself an evolutionary agent – a novel and unsettling idea, as this would be an evolution driven by a will and not by chance. Neuroscientist Jason D Runyan reflects compellingly on free will in the age of AI, blending empirical work with philosophical questions. These chapters enrich the central inquiry of what it means to be a “knowing agent”: someone who acts on nature according to their will, influenced by biological, cognitive and social factors. For physicists, the lesson may be less about adopting specific methods and more about recognising how their own field’s assumptions – about determinism, emergence or complexity – are echoed and challenged in the life sciences.

Perspectives on AI

The fourth section, “Artificial Intelligence Perspectives”, most directly addresses the book’s central theme. Scientific depth and rigour are not evenly distributed across these chapters, but all are stimulating. Topics range from the role of open-source AI in student-led projects at CERN’s IdeaSquare to real-time astrophysical discovery. Michael Coughlin and colleagues’ chapter on accelerated AI in astrophysics stands out for its technical clarity and relevance – a solid entry point for physicists curious about AI beyond popular discourse. Absent is an in-depth treatment of current AI applications in high-energy physics, such as anomaly detection in LHC triggers or generative models for simulation. Given the book’s CERN affiliations, this omission is surprising and leaves out some of the most active intersections of AI and high-energy physics (HEP) research.

Even as AI expands our modelling capacity, the epistemic limits of human cognition may remain permanent

The final sections address cosmological mysteries and the epistemological limits of human cognition. David H Wolpert’s epilogue, “What Can We Know About That Which We Cannot Even Imagine?”, serves as a reminder that even as AI expands our modelling capacity, the epistemic limits of human cognition – including conceptual blind spots and unprovable truths – may remain permanent. This tension is not a contradiction but a sobering reflection on the intrinsic boundaries of scientific – and more widely human – knowledge.

This eclectic volume is best read as a reflective companion to one’s own work. For advanced students, postdocs and researchers open to thinking beyond disciplinary boundaries, the book is an enriching, if at times uneven, read.

To a professional scientist, the book occasionally romanticises interdisciplinary exchange without fully engaging with the real methodological difficulties of translating complex concepts between fields. Topics including the limitations of current large-language models, the reproducibility crisis in AI research and the ethical risks of data-driven surveillance would have benefited from deeper treatment. Ethical questions in HEP may be less prominent in the public eye, but they exist nonetheless: the environmental impact of large-scale facilities, the spending of substantial public money on mega-science projects, the potential dual-use of the technologies developed, the governance of massive international collaborations and data transparency. The book could have explored these more thoroughly.

A timely snapshot

Still, the book doesn’t pretend to be exhaustive. Its strength lies in curating diverse voices and offering a timely snapshot of science, as well as shedding light on ethical and philosophical questions associated with science that are less frequently discussed.

There is a vast knowledge gap in today’s society. Researchers often become so absorbed in their specific domains that they lose sight of their work’s broader philosophical and societal context and the need to explain it to the public. Meanwhile, public misunderstanding of science, and the resulting confusion between fact, theory and opinion, is growing. This gulf provides fertile ground for political manipulation and ideological extremism. New Frontiers in Science in the Era of AI has the immense merit of trying to bridge that gap. The editors and contributors deserve credit for producing a work of both scientific and societal relevance.

Quantum culture

Kanta Dihal

How has quantum mechanics influenced culture in the last 100 years?

Quantum physics offers an opportunity to make the impossible seem plausible. For instance, if your superhero dies dramatically but the actor is still on the payroll, you have a few options available. You could pretend the hero miraculously survived the calamity of the previous instalment. You could also pretend the events of the previous instalment never happened. And then there is Star Wars: “Somehow, Palpatine returned.”

These days, however, quantum physics tends to come to the rescue. Because quantum physics offers the wonderful option to maintain that all previous events really happened, and yet your hero is still alive… in a parallel universe. Much is down to the remarkable cultural impact of the many-worlds interpretation of quantum physics, which has been steadily growing in fame (or notoriety) since Hugh Everett introduced it in 1957.

Is quantum physics unique in helping fiction authors make the impossible seem possible?

Not really! Before the “quantum” handwave, there was “nuclear”: think of Doctor Manhattan from Watchmen, or Godzilla, as expressions of the utopian and dystopian expectations of that newly discovered branch of science. Before nuclear, there was electricity, with Frankenstein’s monster as perhaps its most important product. We can go all the way back to the invention of hydraulics in the ancient world, which led to an explosion of tales of liquid-operated automata – early forms of artificial intelligence – such as the bronze soldier Talos in ancient Greece. We have always used our latest discoveries to dream of a future in which our ancient tales of wonder could come true.

Is the many-worlds interpretation the most common theory used in science fiction inspired by quantum mechanics?

Many-worlds has become Marvel’s favourite trope. It allows them to expand on an increasingly entangled web of storylines that borrow from a range of remakes and reboots, as well as introducing gender and racial diversity into old stories. Marvel may have mainstreamed this interpretation, but the viewers of the average blockbuster may not realise exactly how niche it is, and how many alternatives there are. With many interpretations vying for acceptance, every once in a while a brave social scientist ventures to survey quantum physicists’ preferences. These studies tend to confirm the dominance of the Copenhagen interpretation, with its collapse of the wavefunction rather than the branching universes characteristic of the Everett interpretation. In a 2016 study, for instance, only 6% of quantum physicists claimed that Everett was their favourite interpretation. In 2018 I looked through a stack of popular quantum-physics books published between 1980 and 2017, and found that more than half of these books endorse the many-worlds interpretation. A non-physicist might be forgiven for thinking that quantum physicists are split between two equal-sized enemy camps of Copenhagenists and Everettians.

What makes the many-worlds interpretation so compelling?

Answering this brings us to a fundamental question that fiction has enjoyed exploring since humans first told each other stories: what if? “What if the Nazis won the Second World War?” is pretty much an entire genre by itself these days. Before that, there were alternate histories of the American Civil War and many other key historical events. This means that the many-worlds interpretation fits smoothly into an existing narrative genre. It suggests that these alternate histories may be real, that they are potentially accessible to us and simply happening in a different dimension.

Even the specific idea of branching alternative universes existed in fiction before Hugh Everett applied it to quantum mechanics. One famous example is the 1941 short story The Garden of Forking Paths by the Argentinian writer Jorge Luis Borges, in which a writer tries to create a novel in which everything that could happen, happens. His story anticipated the many-worlds interpretation so closely that Bryce DeWitt used an extract from it as the epigraph to his 1973 edited collection The Many-Worlds Interpretation of Quantum Mechanics. But the most uncanny example is, perhaps, Andre Norton’s science-fiction novel The Crossroads of Time, from 1956 – published when Everett was writing his thesis.

In her novel, a group of historians invents a “possibility worlds” theory of history. The protagonist, Blake Walker, discovers that this theory is true when he meets a group of men from a parallel universe who are on the hunt for a universe-travelling criminal. Travelling with them, Blake ends up in a world where Hitler won the Battle of Britain. Of course, in fiction, only worlds in which a significant change has taken place are of any real interest to the reader or viewer. (Blake also visits a world inhabited by metal dinosaurs.) The truly uncountable number of slightly different universes Everett’s theory implies are extremely difficult to get our heads around. Nonetheless, our storytelling mindsets have long primed us for a fascination with the many-worlds interpretation.

Have writers put other interpretations to good use?

For someone who really wants to put their physics degree to use in their spare time, I’d recommend the works of Greg Egan: although his novel Quarantine uses the controversial conscious collapse interpretation, he always ensures that the maths checks out. Egan’s attitude towards the scientific content of his novels is best summed up by a quote on his blog: “A few reviewers complained that they had trouble keeping straight [the science of his novel Incandescence]. This leaves me wondering if they’ve really never encountered a book that benefits from being read with a pad of paper and a pen beside it, or whether they’re just so hung up on the idea that only non-fiction should be accompanied by note-taking and diagram-scribbling that it never even occurred to them to do this.”

What other quantum concepts are widely used and abused?

We have Albert Einstein to thank for the extremely evocative description of quantum entanglement as “spooky action at a distance”. As with most scientific phenomena, a catchy nickname such as this one is extremely effective for getting a concept to stick in the popular imagination. While Einstein himself did not initially believe quantum entanglement could be a real phenomenon, as it would violate local causality, we now have both evidence and applications of entanglement in the real world, most notably in quantum cryptography. But in science fiction, the most common application of quantum entanglement is in faster-than-light communication. In her 1966 novel Rocannon’s World, Ursula K Le Guin describes a device called the “ansible”, which interstellar travellers use to instantaneously communicate with each other across vast distances. Her term was so influential that it now regularly appears in science fiction as a widely accepted name for a faster-than-light communications device, the same way we have adopted the word “robot” from the 1920 play R.U.R. by Karel Čapek.

Fiction may get the science wrong, but that is often because the story it tries to tell existed long before the science

How were cultural interpretations of entanglement influenced by the development of quantum theory?

It wasn’t until the 1970s that no-signalling theorems conclusively proved that entanglement correlations, while instantaneous, cannot be controlled or used to send messages. Explaining why is a lot more complex than communicating the notion that observing a particle here has an effect on a particle there. Once again, quantum physics seemingly provides just enough scientific justification to resolve an issue that has plagued science fiction ever since the speed of light was discovered: how can we travel through space, exploring galaxies, settling on distant planets, if we cannot communicate with each other? This same line of thought has sparked another entanglement-related invention in fiction: what if we can send not just messages but also people, or even entire spaceships, across faster-than-light distances using entanglement? Conveniently, quantum physicists had come up with another extremely evocative term that fit this idea perfectly: quantum teleportation. Real quantum teleportation only transfers information. But the idea of teleportation is so deeply embedded in our storytelling past that we can’t help extrapolating it. From stories of gods that could appear anywhere at will to tales of portals that lead to strange new worlds, we have always felt limited by the speeds of travel we have managed to achieve – and once again, the speed of light seems to be a hard limit that quantum teleportation might be able to get us around. In his 1999 novel Timeline, Michael Crichton sends a group of researchers back in time using quantum teleportation, and the videogame Half-Life 2 contains teleportation devices that similarly seem to work through quantum entanglement.

What quantum concepts have unexplored cultural potential?

Clearly, interpretations other than many worlds have a PR problem, so is anyone willing to write a chart topper based on the relational interpretation or QBism? More generally, I think that any question we do not yet have an answer to, or any theory that remains untestable, is a potential source for an excellent story. Richard Feynman famously said, “I think I can safely say that nobody understands quantum mechanics.” Ironically, it is precisely because of this that quantum physics has become such a widespread building block of science fiction: it is just hard enough to understand, just unresolved and unexplained enough to keep our hopes up that one day we might discover that interstellar communication or inter-universe travel might be possible. Few people would choose the realities of theorising over these ancient dreams. That said, the theorising may never have happened without the dreams. How many of your colleagues are intimately acquainted with the very science fiction they criticise for having unrealistic physics? We are creatures of habit and convenience held together by stories, physicists no less than everyone else. This is why we come up with catchy names for theories, and stories about dead-and-alive cats. Fiction may often get the science wrong, but that is often because the story it tries to tell existed long before the science.

A scientist in sales

Massimiliano Pindo

The boundary between industry and academia can feel like a chasm. Opportunity abounds for those willing to bridge the gap.

Massimiliano Pindo began his career working on silicon pixel detectors at the DELPHI experiment at the Large Electron–Positron Collider. While at CERN, Pindo developed analytical and technical skills that would later become crucial in his career. But despite his passion for research, doubts clouded his hopes for the future.

“I wanted to stay in academia,” he recalls. “But at that time, it was getting really difficult to get a permanent job.” Pindo moved from his childhood home in Milan to Geneva, before eventually moving back in with his parents while applying for his next research grant. “The golden days of academia where people got a fixed position immediately after a postdoc or PhD were over.”

The path forward seemed increasingly unstable, defined by short-term grants, constant travel and an inability to plan long-term. There was a steady stream of new grant applications, but permanent contracts were few and far between. With competition increasing, job stability seemed further and further out of reach. “You could make a decent living,” Pindo says, “but the real problem was you could not plan your life.”

Translatable skills

Faced with the unpredictability of academic work, Pindo transitioned into industry – a leap that eventually led him to his current role as marketing and sales director at Renishaw, France, a global engineering and scientific technology company. Pindo was confident that his technical expertise would provide a strong foundation for a job beyond academia, and indeed he found that “hard” skills such as analytical thinking, problem-solving and a deep understanding of technology, which he had honed at CERN alongside soft skills such as teamwork, languages and communication, translated well to his work in industry.

“When you’re a physicist, especially a particle physicist, you’re used to breaking down complex problems, selecting what is really meaningful amongst all the noise, and addressing these issues directly,” Pindo says. His experience in academia gave him the confidence that industry challenges would pale in comparison. “I was telling myself that in the academic world, you are dealing with things that, at least on paper, are more complex and difficult than what you find in industry.”

Initially, these technical skills helped Pindo become a device engineer for a hardware company, before making the switch to sales. The gradual transition from academia to something more hands-on allowed him to really understand the company’s product on a technical level, which made him a more desirable candidate when transitioning into marketing.

“When you are in B2B [business-to-business] mode and selling technical products, it’s always good to have somebody who has technical experience in the industry,” explains Pindo. “You have to have a technical understanding of what you’re selling, to better understand the problems customers are trying to solve.”

However, this experience also allowed him to recognise gaps in his knowledge. As he began gaining more responsibility in his new, more business-focused role, Pindo decided to go back to university and get an MBA. During the programme, he was able to familiarise himself with the worlds of human resources, business strategy and management – skills that aren’t typically the focus in a physics lab.

Pindo’s journey through industry hasn’t been a one-way ticket out of academia. Today, he still maintains a foothold in the academic world, teaching strategy as an affiliated professor at the Sorbonne. “In the end you never leave the places you love,” he says. “I got out through the door – now I’m getting back in through the window!”

Transitioning between industry and academia was not entirely seamless. Misconceptions loomed on both sides, and it took Pindo a while to find a balance between the two.

“There is a stereotype that scientists are people who can’t adapt to industrial environments – that they are too abstract, too theoretical,” Pindo explains. “People think scientists are always in the clouds, disconnected from reality. But that’s not true. The science we make is not the science of cartoons. Scientists can be people who plan and execute practical solutions.”

The misunderstanding, he says, goes both ways. “When I talk to alumni still in academia, many think that industry is a nightmare – boring, routine, uninteresting. But that’s also false,” Pindo says. “There’s this wall of suspicion. Academics look at industry and think, ‘What do they want? What’s the real goal? Are they just trying to make more money?’ There is no trust.”

Tight labour markets

For Pindo, this divide is frustrating and entirely unnecessary. Now with years of experience navigating both worlds, he envisions a more fluid connection between academia and industry – one that leverages the strengths of both. “Industry is currently facing tight labour markets for highly skilled talent, and academia doesn’t have access to the money and practical opportunities that industry can provide,” says Pindo. “Both sides need to work together.”

To bridge this gap, Pindo advocates a more open dialogue and a revolving door between the two fields – one that allows both academics and industry professionals to move fluidly back and forth, carrying their expertise across boundaries. Both sides have much to gain from shared knowledge and collaboration. One way to achieve this, he suggests, is through active participation in alumni networks and university events, which can nurture lasting relationships and mutual understanding. If more professionals embraced this mindset, it could help alleviate the very instability that once pushed him out of academia, creating a landscape where the boundaries between science and industry blur to the benefit of both.

“Everything depends on active listening. You always have to learn from the person in front of you, so give them the chance to speak. We have a better world to build, and that comes only from open dialogue and communication.”

Hadronic decays confirm long-lived Ωc0 baryon

LHCb figure 1

In 2018 and 2019, the LHCb collaboration published surprising measurements of the Ξc0 and Ωc0 baryon lifetimes, which were inconsistent with previous results and overturned the established hierarchy between the two. A new analysis of their hadronic decays now confirms this observation, promising insights into the dynamics of baryons.

The Λc+, Ξc+, Ξc0 and Ωc0 baryons – each composed of one charm and two lighter up, down or strange quarks – are the only ground-state singly charmed baryons that decay predominantly via the weak interaction. The main contribution to this process comes from the charm quark transitioning into a strange quark, with the other constituents acting as passive spectators. Consequently, at leading order, their lifetimes should be the same. Differences arise from higher-order effects, such as W-boson exchange between the charm and spectator quarks and quantum interference between identical particles, known as “Pauli interference”. Charm hadron lifetimes are more sensitive to these effects than those of beauty hadrons because the charm quark is lighter than the bottom quark, making charm baryons a promising testing ground.

Measurements of the Ξc0 and Ωc0 lifetimes prior to the start of the LHCb experiment resulted in the PDG averages shown in figure 1. The first LHCb analysis, using charm baryons produced in semi-leptonic decays of beauty baryons, was in tension with the established values, giving an Ωc0 lifetime four times larger than the previous average. The inconsistencies were later confirmed by another LHCb measurement, using an independent data set with charm baryons produced directly (prompt) in the pp collision (CERN Courier July/August 2021 p17). These results changed the ordering of the four singly charmed baryons when arranged according to their lifetimes, triggering a scientific discussion on how to treat higher-order effects in decay rate calculations.

Using the full Run 1 and 2 datasets, LHCb has now measured the Ξc0 and Ωc0 lifetimes with a third independent data sample, based on fully reconstructed Ξb− → Ξc0(→ pK−K−π+)π− and Ωb− → Ωc0(→ pK−K−π+)π− decays. The selection of these hadronic decay chains exploits the long lifetime of the beauty baryons, such that the selection efficiency is almost independent of the charm baryon decay time. To cancel out the small remaining acceptance effects, the measurement is normalised to the kinematically and topologically similar B− → D0(→ K+K−π+π−)π− channel, minimising the uncertainties with only a small additional correction from simulation.

The signal decays are separated from the remaining background by fits to the Ξc0π− and Ωc0π− invariant mass spectra, providing 8260 ± 100 Ξc0 and 355 ± 26 Ωc0 candidates. The decay time distributions are obtained with two independent methods: by determining the yield in each of a specific set of decay time intervals, and by employing a statistical technique that uses the covariance matrix from the fit to the mass spectra. The two methods give consistent results, confirming LHCb’s earlier measurements. Combining the three measurements from LHCb, while accounting for their correlated uncertainties, gives τ(Ξc0) = 150.7 ± 1.6 fs and τ(Ωc0) = 274.8 ± 10.5 fs. These new results will serve as experimental guidance on how to treat higher-order effects in weak baryon decays, particularly regarding the approach-dependent sign and magnitude of Pauli interference terms.
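As an aside for readers unfamiliar with such combinations, the standard recipe for averaging correlated measurements is a best linear unbiased estimator (BLUE): the weights follow from the inverse covariance matrix. The sketch below uses purely illustrative numbers, not the LHCb inputs, and a hypothetical fully correlated systematic to show how correlations enter.

```python
import numpy as np

def blue_combine(values, cov):
    """Best linear unbiased estimate of correlated measurements.

    Weights: w = C^-1 1 / (1^T C^-1 1); combined variance is
    1 / (1^T C^-1 1), where C is the full covariance matrix.
    """
    values = np.asarray(values, dtype=float)
    cinv = np.linalg.inv(np.asarray(cov, dtype=float))
    ones = np.ones(len(values))
    norm = ones @ cinv @ ones
    weights = cinv @ ones / norm
    return weights @ values, 1.0 / np.sqrt(norm)

# Three toy lifetime measurements (fs) -- illustrative values only --
# with independent statistical errors plus a common 1 fs systematic
# that fills the off-diagonal covariance terms:
vals = [148.0, 151.5, 152.0]
stat = np.diag([2.0**2, 2.5**2, 3.0**2])
cov = stat + 1.0**2 * np.ones((3, 3))

tau, err = blue_combine(vals, cov)
```

The correlated systematic sets a floor on the combined uncertainty: no matter how many measurements share it, the result cannot be more precise than the common term allows.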

Decoding the Higgs mechanism with vector bosons

CMS figure 1

The discovery of the Higgs boson at the LHC in 2012 provided strong experimental support for the Brout–Englert–Higgs mechanism of spontaneous electroweak symmetry breaking (EWSB) as predicted by the Standard Model. The EWSB explains how the W and Z bosons, the mediators of the weak interaction, acquire mass: their longitudinal polarisation states emerge from the Goldstone modes of the Higgs field, linking the mass generation of vector bosons directly to the dynamics of the process.

Yet, its ultimate origins remain unknown and the Standard Model may only offer an effective low-energy description of a more fundamental theory. Exploring this possibility requires precise tests of how EWSB operates, and vector boson scattering (VBS) provides a particularly sensitive probe. In VBS, two electroweak gauge bosons scatter off one another. The cross section remains finite at high energies only because there is an exact cancellation between the pure gauge-boson interactions and the Higgs-boson mediated contributions, an effect analogous to the role of the Z boson propagator in WW production at electron–positron colliders. Deviations from the expected behaviour could signal new dynamics, such as anomalous couplings, strong interactions in the Higgs sector or new particles at higher energy scales.

This result lays the groundwork for future searches for new physics hidden within the electroweak sector

VBS interactions are among the rarest observed so far at the LHC, with cross sections as low as one femtobarn. To disentangle them from the background, researchers rely on the distinctive experimental signature of two high-energy jets in the forward detector regions produced by the initial quarks that radiate the bosons, with minimal hadronic activity between them. Using the full data set from Run 2 of the LHC at a centre-of-mass energy of 13 TeV, the CMS collaboration carried out a comprehensive set of VBS measurements across several production modes: WW (with both same and opposite charges), WZ and ZZ, studied in five final states where both bosons decay leptonically and in two semi-leptonic configurations where one boson decays into leptons and the other into quarks. To enhance sensitivity further, the data from all the measurements have now been combined in a single joint fit, with a complete treatment of uncertainty correlations and a careful handling of events selected by more than one analysis. 

All modes, one analysis

To account for possible deviations from the expected predictions, each process is characterised by a signal strength parameter (μ), defined as the ratio of the measured production rate to the cross section predicted by the Standard Model. A value of μ near unity indicates consistency with the Standard Model, while significant deviations may suggest new physics. The results, summarised in figure 1, display good agreement with the Standard Model predictions: all measured signal strengths are consistent with unity within their respective uncertainties. A mild excess with respect to the leading-order theoretical predictions is observed across several channels, highlighting the need for more accurate modelling, in particular for the measurements that have reached a level of precision where systematic effects dominate. By presenting the first evidence for all charged VBS production modes from a single combined statistical analysis, this CMS result lays the groundwork for future searches for new physics hidden within the electroweak sector.
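The signal-strength definition can be made concrete with a toy counting experiment: subtract the expected background from the observed yield and divide by the Standard Model signal prediction. All numbers below are hypothetical, not the CMS results, and the simple Poisson error propagation stands in for the full profile-likelihood fit used in practice.

```python
import math

def signal_strength(n_obs, n_bkg, n_sig_sm):
    """Toy signal strength mu = (observed - background) / SM signal.

    n_obs: observed event count; n_bkg: expected background yield;
    n_sig_sm: SM-predicted signal yield. The uncertainty propagates
    only the Poisson fluctuation of the observed count.
    """
    mu = (n_obs - n_bkg) / n_sig_sm
    err = math.sqrt(n_obs) / n_sig_sm
    return mu, err

# Hypothetical channel: 230 events observed, 180 expected from
# background, 50 predicted signal events in the Standard Model.
mu, err = signal_strength(n_obs=230, n_bkg=180, n_sig_sm=50.0)
# mu = 1.0: this toy measurement matches the SM prediction exactly
```

A rare process like VBS sits in exactly this regime: a small signal on top of a larger background, so the Poisson term on the total count dominates the uncertainty on μ.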

Slovenia, Ireland and Chile tighten ties with CERN

Slovenia became CERN’s 25th Member State on 21 June, formalising a relationship of over 30 years. Full membership confers voting rights in the CERN Council and opportunities for Slovenian enterprises and citizens.

“Slovenia’s full membership in CERN is an exceptional recognition of our science and researchers,” said Igor Papič, Slovenia’s Minister of Higher Education, Science and Innovation. “Furthermore, it reaffirms and strengthens Slovenia’s reputation as a nation building its future on knowledge and science. Indeed, apart from its beautiful natural landscapes, knowledge is the only true natural wealth of our country. For this reason, we have allocated record financial resources to science, research and innovation. Moreover, we have enshrined the obligation to increase these funds annually in the Scientific Research and Innovation Activities Act.”

“On behalf of the CERN Council, I warmly welcome Slovenia as the newest Member State of CERN,” said Costas Fountas, president of the CERN Council. “Slovenia has a longstanding relationship with CERN, with continuous involvement of the Slovenian science community over many decades in the ATLAS experiment in particular.”

On 8 and 16 May, respectively, Ireland and Chile signed agreements to become Associate Member States of CERN, pending the completion of national ratification processes. They join Türkiye, Pakistan, Cyprus, Ukraine, India, Lithuania, Croatia, Latvia and Brazil as Associate Members – a status introduced by the CERN Council in 2010. Since then, the Organization has also concluded international cooperation agreements with Qatar, Sri Lanka, Nepal, Kazakhstan, the Philippines, Thailand, Paraguay, Bosnia and Herzegovina, Honduras, Bahrain and Uruguay.

Advances in very-high-energy astrophysics

Advances in Very High Energy Astrophysics: The Science Program of the Third Generation IACTs for Exploring Cosmic Gamma Rays

Imaging atmospheric Cherenkov telescopes (IACTs) are designed to detect very-high-energy gamma rays, enabling the study of a range of both galactic and extragalactic gamma-ray sources. By capturing Cherenkov light from gamma-ray-induced air showers, IACTs help trace the origins of cosmic rays and probe fundamental physics, including questions surrounding dark matter and Lorentz invariance. Since the first gamma-ray source detection by the Whipple telescope in 1989, the field has rapidly advanced through instruments like HESS, MAGIC and VERITAS. Building on these successes, the Cherenkov Telescope Array Observatory (CTAO) represents the next generation of IACTs, with greatly improved sensitivity and energy coverage. The northern CTAO site on La Palma is already collecting data, and major infrastructure development is now underway at the southern site in Chile, where telescope construction is set to begin soon.

With the start of CTAO telescope construction looming, Advances in Very High Energy Astrophysics, edited by Reshmi Mukherjee of Barnard College and Roberta Zanin of the University of Barcelona, is very timely. World-leading experts tackle the almost impossible task of summarising the progress made by the third-generation IACTs: HESS, MAGIC and VERITAS.

The range of topics covered is vast, spanning the last 20 years of progress in the areas of IACT instrumentation, data-analysis techniques, all aspects of high-energy astrophysics, cosmic-ray astrophysics and gamma-ray cosmology. The authors are necessarily selective, so the depth of coverage in each area is limited, but the essential concepts are properly introduced and the most important highlights captured. The primary focus of the book lies in discussions surrounding gamma-ray astronomy and high-energy physics, cosmic rays and ongoing research into dark matter.

It appears, however, that the individual chapters were all written independently of each other by different authors, leading to some duplications. Source classes and high-energy radiation mechanisms are introduced multiple times, sometimes with different terminology and notation in the different chapters, which could lead to confusion for novices in the field. But though internal coordination could have been improved, a positive aspect of this independence is that each chapter is self-contained and can be read on its own. I recommend the book to emerging researchers looking for a broad overview of this rapidly evolving field.

Hadrons in Porto Alegre

The 16th International Workshop on Hadron Physics (Hadrons 2025) welcomed 135 physicists to the Federal University of Rio Grande do Sul (UFRGS) in Porto Alegre, Brazil. Delayed by four months by a tragic flood that devastated the city, the triennial conference took place from 10 to 14 March and, despite adversity, maintained its long tradition as a forum for collaboration among Brazilian and international researchers at different stages of their careers.

The workshop’s scientific programme included field theoretical approaches to QCD, the behaviour of hadronic and quark matter in astrophysical contexts, hadronic structure and decays, lattice QCD calculations, recent experimental developments in relativistic heavy-ion collisions, and the interplay of strong and electroweak forces within the Standard Model.

Fernanda Steffens (University of Bonn) explained how deep-inelastic-scattering experiments and theoretical developments are revealing the internal structure of the proton. Kenji Fukushima (University of Tokyo) addressed the theoretical framework and phase structure of strongly interacting matter, with particular emphasis on the QCD phase diagram and its relevance to heavy-ion collisions and neutron stars. Chun Shen (Wayne State University) presented a comprehensive overview of the state-of-the-art techniques used to extract the transport properties of quark–gluon plasma from heavy-ion collision data, emphasising the role of Bayesian inference and machine learning in constraining theoretical models. Li-Sheng Geng (Beihang University) explored exotic hadrons through the lens of hadronic molecules, highlighting symmetry multiplets such as pentaquarks, the formation of multi-hadron states and the role of femtoscopy in studying unstable particle interactions.

This edition of Hadrons was dedicated to the memory of two individuals who left a profound mark on the Brazilian hadronic-physics community: Yogiro Hama, a distinguished senior researcher and educator whose decades-long contributions were foundational to the development of the field in Brazil, and Kau Marquez, an early-career physicist whose passion for science remained steadfast despite her courageous battle with spinal muscular atrophy. Both were remembered with deep admiration and respect, not only for their scientific dedication but also for their personal strength and impact on the community.

Its mission is to cultivate a vibrant and inclusive scientific environment

Since its creation in 1988, the Hadrons workshop has played a central role in developing Brazil’s scientific capacity in particle and nuclear physics. Its structure facilitates close interaction between master’s and doctoral students and senior researchers, enhancing both technical training and academic exchange. This model continues to strengthen the foundations of research and collaboration throughout the Brazilian scientific community.

This is the main event for the Brazilian particle- and nuclear-physics communities, reflecting a commitment to advancing research in this highly interactive field. By rotating the venue among multiple regions of Brazil, each edition renews its mission to cultivate a vibrant and inclusive scientific environment. This edition was closed by a public lecture on QCD by Tereza Mendes (University of São Paulo), who engaged local students with the foundational questions of strong-interaction physics.

The next edition of the Hadrons series will take place in Bahia in 2028.
