Space Oddities

Space Oddities takes readers on a journey through the mysteries of modern physics, from the smallest subatomic particles to the vast expanse of stars and space. Harry Cliff – an experimental particle physicist at Cambridge University – unravels some of the most perplexing anomalies challenging the Standard Model (SM), with behind-the-scenes scoops from eight different experiments. The most intriguing stories concern lepton universality and the magnetic moment of the muon.

Theory pins down the muon’s magnetic moment with extreme precision, and experiment has verified it to an astonishing 11 significant figures. Over the last few years, however, measurements have suggested a slight discrepancy – the devil lying in the 12th digit. In 2021, measurements at Fermilab disagreed with the theoretical prediction at the level of 4σ. Not enough to cause a “scientific earthquake”, as Cliff puts it, but enough to suggest that new physics might be at play.

Just as everything seemed to be edging towards a new discovery, Cliff introduces the “villains” of the piece. Groundbreaking lattice–QCD predictions from the Budapest–Marseille–Wuppertal collaboration were published on the same day as a new measurement from Fermilab. If correct, these would destroy the anomaly by contradicting the data-driven theory consensus. (“Yeah, bullshit,” said one experimentalist to Cliff when it was put to him that the timing wasn’t intended to steal the experiment’s thunder.) The situation is still unresolved, though many new theoretical predictions have been made and a new theoretical consensus is imminent (see “Do muons wobble faster than expected”). Regardless of the outcome, Cliff emphasises that this research will pave the way for future discoveries, and none of it should be taken for granted – even if the anomaly disappears.

“One of the challenging aspects of being part of a large international project is that your colleagues are both collaborators and competitors,” Cliff notes. “When it comes to analysing the data with the ultimate goal of making discoveries, each research group will fight to claim ownership of the most interesting topics.”

This spirit of spurring collaborator-competitors on to greater heights of precision is echoed throughout Cliff’s own experience of working in the LHCb collaboration, where he studies “lepton universality”. All three lepton flavours – electron, muon and tau – should interact almost identically, except for small differences due to their masses. However, over the past decade several experimental results suggested that this theory might not hold in B-meson decays, where muons seemed to be appearing less frequently than electrons. If confirmed, this would point to physics beyond the SM.

Having been involved himself in a complementary but less sensitive analysis of B-meson decay channels involving strange quarks, Cliff recalls the emotional rollercoaster experienced by some of the key protagonists: the “RK” team from Imperial College London. After a year of rigorous testing, RK unblinded a sanity check of their new computational toolkit: a reanalysis of the prior measurement that yielded a perfectly consistent RK value of 0.72 with an uncertainty of about 0.08, upholding a 3σ discrepancy. Now was the time to put the data collected since then through the same pasta machine: if it agreed, the tension between the SM and their overall measurement would cross the 5σ threshold. After an anxious wait while the numbers were crunched, the team received the results for the new data: 0.93 with an uncertainty of 0.09.

“Dreams of a major discovery evaporated in an instant,” recalls Cliff. “Anyone who saw the RK team in the CERN cafeteria that day could read the result from their faces.” The lead on the RK team, Mitesh Patel, told Cliff that they felt “emotionally train wrecked”.

One day we might make the right mistake and escape the claustrophobic clutches of the SM

With both results combined, the ratio averaged out to 0.85 ± 0.06, just shy of 3σ away from unity. While the experimentalists were deflated, Cliff notes that for theorists this result may have been more exciting than the initial anomaly, as it was easier to explain using new particles or forces. “It was as if we were spying the footprints of a great, unknown beast as it crashed about in a dark jungle,” writes Cliff.
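The quoted combination can be sanity-checked with a naive inverse-variance weighted average. This is a sketch only: the published value comes from a full likelihood fit with correlated and asymmetric uncertainties, so it differs somewhat from this back-of-the-envelope number.

```python
import math

def weighted_average(values, sigmas):
    """Inverse-variance weighted average of independent measurements."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))
    return mean, sigma

# The two RK values quoted in the text: 0.72 +/- 0.08 and 0.93 +/- 0.09
mean, sigma = weighted_average([0.72, 0.93], [0.08, 0.09])
print(f"RK = {mean:.2f} +/- {sigma:.2f}")  # naive combination: ~0.81 +/- 0.06
```

The uncertainty of the combination shrinks below either input uncertainty, which is why adding the second dataset could in principle have pushed the tension past 5σ.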

Space Oddities is a great defence of irrepressible experimentation. Even “failed” anomalies are far from useless: if they evaporate, the effort required to investigate them pushes the boundaries of experimental precision, enhances collaboration between scientists across the world, and refines theoretical frameworks. Through retellings and interviews, Cliff helps the public experience the excitement of near breakthroughs, the heartbreak of failed experiments, and the dynamic interactions between theoretical and experimental physicists. Thwarting myths that physicists are cold, calculating figures working in isolation, Cliff sheds light on a community driven by curiosity, ambition and (healthy) competition. His book is a story of hope that one day we might make the right mistake and escape the claustrophobic clutches of the SM.

“I’ve learned so much from my mistakes,” read a poster above Cliff’s undergraduate tutor’s desk. “I think I’ll make another.”

Probing the quark–gluon plasma in Nagasaki

The 12th edition of the International Conference on Hard and Electromagnetic Probes attracted 346 physicists to Nagasaki, Japan, from 22 to 27 September 2024. Delegates discussed the recent experimental and theoretical findings on perturbative probes of the quark–gluon plasma (QGP) – a hot and deconfined state of matter formed in ultrarelativistic heavy-ion collisions.

The four main LHC experiments played a prominent role at the conference, presenting a large set of newly published results from studies performed on data collected during LHC Run 2, as well as several new preliminary results based on the new data samples from Run 3.

Jet modifications

A number of significant results on the modification of jets in heavy-ion collisions were presented. Splitting functions characterising the evolution of parton showers are expected to be modified in the presence of the QGP, providing experimental access to the medium properties. A more differential look at these modifications was presented through a correlated measurement of the shared momentum fraction and opening angle of the first splitting satisfying the “soft drop” condition in jets. Additionally, energy–energy correlators have recently emerged as promising observables where the properties of jet modification in the medium might be imprinted at different scales on the observable.

The first measurements of the two-particle energy–energy correlators in p–Pb and Pb–Pb collisions were presented, showing modifications in both the small- and large-angle correlations for both systems compared to pp collisions. A long-sought-after effect of energy exchanges between the jet and the medium is a correlated response of the medium in the jet direction. For the first time, measurements of hadron–boson correlations in events containing photons or Z bosons showed a clear depletion of the bulk medium in the direction of the Z boson, providing direct evidence of a medium response correlated to the propagating back-to-back jet. In pp collisions, the first direct measurement of the dead cone of beauty quarks, using novel machine-learning methods to reconstruct the beauty hadron from partial decay information, was also shown.

Several new results from studies of particle production in ultraperipheral heavy-ion collisions were discussed. These studies allow us to investigate the possible onset of gluon saturation at low Bjorken-x values. In this context, new results of charm photoproduction, with measurements of incoherent and coherent J/ψ mesons, as well as of D0 mesons, were released. Photonuclear production cross-sections of di-jets, covering a large interval of photon energies to scan over different regions of Bjorken-x, were also presented. These measurements pave the way for setting constraints on the gluon component of nuclear parton distribution functions at low Bjorken-x values, over a wide Q2 range, in the absence of significant final-state effects.

New experiments will explore higher-density regions of the QCD–matter phase diagram

During the last few years, a significant enhancement of charm- and beauty-baryon production in proton–proton collisions was observed, compared to measurements in e+e− and ep collisions. These observations have challenged the assumption of the universality of heavy-quark fragmentation across different collision systems. Several intriguing measurements on this topic were released at the conference. In addition to an extended set of charm meson-to-meson and baryon-to-meson production-yield ratios, the first measurements of the production of Σc0,++(2520) relative to Σc0,++(2455) at the LHC, obtained by exploiting the new Run 3 data samples, were discussed. New insights on the structure of the exotic χc1(3872) state and its hadronisation mechanism were garnered by measuring the ratio of its production yield to that of ψ(2S) mesons in hadronic collisions.

Additionally, strange-to-non-strange production-yield ratios for charm and beauty mesons as a function of the collision multiplicity were released, pointing toward an enhanced strangeness production in a higher colour-density environment. Several theoretical approaches implementing modified hadronisation mechanisms with respect to in-vacuum fragmentation have proven to be able to reproduce at least part of the measurements, but a comprehensive description of the heavy-quark hadronisation, in particular for the baryonic sector, is still to be reached.

A glimpse into the future of the experimental opportunities in this field was also provided. A new and intriguing set of physics observables for a complete characterisation of the QGP with hard probes will become accessible with the planned upgrades of the ALICE, ATLAS, CMS and LHCb detectors, both during the next long LHC shutdown and in the more distant future. New experiments at CERN, such as NA60+, or in other facilities like the Electron–Ion Collider in the US and J-PARC-HI in Japan, will explore higher-density regions of the QCD–matter phase diagram.

The next edition of this conference series is scheduled to be held in Nashville, US, from 1 to 5 June 2026.

Encounters with artists

Why should scientists care about art?

Throughout my experiences in the laboratory, I have seen how art is an important part of a scientist’s life. By being connected with art, scientists recognise that their activities are deeply embedded in contemporary culture. Science is culture. Through art and dialogues with artists, people realise how important science is for society and for culture in general. Science is an important cultural pillar in our society, and these interactions bring scientists meaning.

Are science and art two separate cultures?

Today, if you ask anyone: “What is nature?” they describe everything in scientific terms. The way you describe things, the mysteries of your research: you are actually answering the questions that are present in everyone’s life. In this case, scientists have a sense of responsibility. I think art helps to open this dialogue from science into society.

Do scientists have a responsibility to communicate their research?

All of us have a social responsibility in everything we produce. Ideas don’t belong to anyone, so it’s a collective endeavour. I think that scientists don’t have the responsibility to communicate the research themselves, but that their research cannot be isolated from society. I think it’s a very joyful experience to see that someone cares about what you do.

Why should artists care about science?

If you go to any academic institution, there’s always a scientific component, very often also a technological one. A scientific aspect of your life is always present. This is happening because we’re all on the same course. It’s a consequence of this presence of science in our culture. Artists have an important role in our society, and they help to spark conversations that are important to everyone. Sometimes it might seem as though they are coming from a very individual lens, but in fact they have a very large reach and impact. Not immediately, not something that you can count with data, but there is definitely an impact. Artists open these channels for communicating and thinking about a particular aspect of science, which is difficult to see from a scientific perspective. Because in any discipline, it’s amazing to see your activity from the eyes of others.

Creativity and curiosity are the parameters and competencies that make up artists and scientists

A few years back we did a little survey, and most of the scientists thought that by spending time with artists, they took a step back to think about their research from a different lens, and this changed their perspective. They thought of this as a very positive experience. So I think art is not only about communicating to the public, but about exploring the personal synergies of art and science. This is why artists are so important.

Do experimental and theoretical physicists have different attitudes towards art?

Typically, we think that theorists are much more open to artists, but I don’t agree. In my experiences at CERN, I found many engineers and experimental physicists being highly theoretical. Both value artistic perspectives and their ability to consider questions and scientific ideas in an unconventional way. Experimental physicists would emphasise engagement with instruments and data, while theoretical physicists would focus on conceptual abstraction.

By being with artists, many experimentalists feel that they have the opportunity to talk about things beyond their research. For example, we often talk about the “frontiers of knowledge”. When asked about this, experimentalists or theoretical physicists might tell us about something other than particle physics – like neuroscience, or the brain and consciousness. A scientist is a scientist. They are very curious about everything.

Do these interactions help to blur the distinction between art and science?

Well, here I’m a bit radical because I know that creativity is something we define. Creativity and curiosity are the parameters and competencies that make up artists and scientists. But to become a scientist or an artist you need years of training – it’s not that you can become one just because you are a curious and creative person.

Chroma VII work of art

Not many people can chat about particle physics, but scientists very often chat with artists. I saw artists speaking for hours with scientists about the Higgs field. When you see two people speaking about the same thing, but with different registers, knowledge and background, it’s a precious moment.

When facilitating these discussions between physicists and artists, we don’t speak only about physics, but about everything that worries them. Through that, grows a sort of intimacy that often becomes something else: a friendship. This is the point at which a scientist stops being an information point for an artist and becomes someone who deals with big questions alongside an artist – who is also a very knowledgeable and curious person. This is a process rich in contrast, and you get many interesting surprises out of these interactions.

But even in this moment, they are still artists and scientists. They don’t become this blurred figure that can do anything.

Can scientific discovery exist without art?

That’s a very tricky question. I think that art is a component of science, therefore science cannot exist without art – without the qualities that the artist and scientist have in common. To advance science, you have to create a question that needs to be answered experimentally.

Did discoveries in quantum mechanics affect the arts?

Everything is subjected to quantum mechanics. Maybe what it changed was an attitude towards uncertainty: what we see and what we think is there. There was an increased sense of doubt and general uncertainty in the arts.

Do art and science evolve together or separately?

I think there have been moments of convergence – you can clearly see it in any of the avant-garde movements. The same applies to literature; for example, modernist writers showed a keen interest in science. Poets such as T S Eliot approached poetry with a clear resonance of the first scientific revolutions of the century. There are references to the contributions of Faraday, Maxwell and Planck. You can tell these artists and poets were informed and eager to follow what science was revealing about the world.

You can also note the influence of science in music, as physicists gained a better understanding of the physical aspects of sound and matter. Physics became less about viewing the world through a lens, and instead focused on the invisible: the vibrations of matter, electricity, the innermost components of materials. At the end of the 19th century and into the 20th, these examples crop up constantly. It’s not just representing the world as you see it through a particular lens, but being involved in the phenomena of the world and these uncensored realities.

From the 1950s to the 1970s you can see these connections in every single moment. Science is very present in the work of artists, but my feeling is that we don’t have enough literature about it. We really need to conduct more research on this connection between humanities and science.

What are your favourite examples of art influencing science?

Feynman diagrams are one example. Feynman was amazing – a prodigy. Many people before him tried to represent things that escaped our intuition visually and failed. We also have the Pauli Archives here at CERN. Pauli was not the most popular father of quantum mechanics, but he was determined to not only understand mathematical equations but to visualise them, and share them with his friends and colleagues. This sort of endeavour goes beyond just writing – it is about the possibility of creating a tangible experience. I think scientists do that all the time by building machines, and then by trying to understand these machines statistically. I see that in the laboratory constantly, and it’s very revealing because usually people might think of these statistics as something no one cares about – that the visuals are clumsy and nerdy. But they’re not.

Even Leonardo da Vinci was known as a scientist and an artist, but his anatomical sketches were not discovered until hundreds of years after his other works. Newton was also paranoid about expressing his true scientific theories because of the social standards and politics of the time. His views were unorthodox, and he did not want to ruin his prestigious reputation.

Today’s culture also influences how we interpret history. We often think of Aristotle as a philosopher, yet he is also recognised for contributions to natural history. The same with Democritus, whose ideas laid foundations for scientific thought.

So I think that opening laboratories to artists is very revealing about the influence of today’s culture on science.

When did natural philosophy branch out into art and science?

I believe it was during the development of the scientific method: observation, analysis and the evolution of objectivity. The departure point was definitely when we developed a need to be objective. It took centuries to get where we are now, but I think there is a clear division: a line with philosophy, natural philosophy and natural history on one side, and modern science on the other. Today, I think art and science have different purposes. They convene at different moments, but there is always this detour. Some artists are very scientific minded, and some others are more abstract, but they are both bound to speculate massively.

It’s really good news for everyone that labs want to include non-scientists

For example, at our Arts at CERN programme we have had artists who were interested in niche scientific aspects. Erich Berger, an artist from Finland, was interested in designing a detector, and scientists whom he met kept telling him that he would need to calibrate the detector. The scientist and the artist here had different goals. For the scientist, the most important thing is that the detector has precision in the greatest complexity. And for the artist, it’s not. It’s about the process of creation, not the analysis.

Do you think that science is purely an objective medium while art is a subjective one?

No. It’s difficult to define subjectivity and objectivity. But art can be very objective. Artists create artefacts to convey their intended message. It’s not that these creations are standing alone without purpose. No, we are beyond that. Now art seeks meaning that is, in this context, grounded in scientific and technological expertise.

How do you see the future of art and science evolving?

There are financial threats to both disciplines. We are still in this moment where things look a bit bleak. But I think our programme is pioneering, because many scientific labs are developing their own arts programmes inspired by the example of Arts at CERN. This is really great, because unless you are in a laboratory, you don’t see what doing science is really about. We usually read science in the newspapers or listen to it on a podcast – everything is very much oriented to the communication of science, but making science is something very specific. It’s really good news for everyone that laboratories want to include non-scientists. Arts at CERN works mostly with visual artists, but you could imagine filmmakers, philosophers, those from the humanities, poets or almost anyone at all, depending on the model that one wants to create in the lab.

Breaking new ground in flavour universality

LHCb figure 1

A new result from the LHCb collaboration supports the hypothesis that the rare decays B± → K±e+e− and B± → K±μ+μ− occur at the same rate, further tightening constraints on the magnitude of lepton flavour universality (LFU) violation in rare B decays. The new measurement is the most precise to date in the high-q² region and the first of its kind at a hadron collider.

LFU is an accidental symmetry of the Standard Model (SM). Under LFU, each generation of lepton ℓ± (electron, muon and tau lepton) is equally likely to interact with the W boson in decay processes such as B± → K±ℓ+ℓ−. This symmetry leads to the prediction that the ratio of branching fractions for these decay channels should be unity, except for kinematic effects due to the different masses of the charged leptons. The most straightforward ratio to measure is that between the muon and electron decay modes, known as RK. Any significant deviation from RK = 1 could only be explained by the existence of new physics (NP) particles that preferentially couple to one lepton generation over another, violating LFU.
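In symbols, the ratio described above (integrated over a given range of dilepton invariant mass squared) is, schematically:

```latex
R_K = \frac{\mathcal{B}(B^\pm \to K^\pm \mu^+ \mu^-)}
           {\mathcal{B}(B^\pm \to K^\pm e^+ e^-)}
\overset{\text{SM}}{\simeq} 1
```

where \mathcal{B} denotes a branching fraction; in the SM the ratio is unity up to small corrections from the charged-lepton masses.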

B± → K±ℓ+ℓ− decays are a powerful probe for virtual NP particles. These decays involve an underlying b-to-s quark transition – an example of a flavour-changing neutral current (FCNC). FCNC transitions are extremely rare in the SM, as they occur only through higher-order Feynman diagrams. This makes them particularly sensitive to contributions from NP particles, which could significantly alter the characteristics of the decays. In this case, the mass of the NP particles could be much larger than can be produced directly at the LHC. “Indirect” searches for NP, such as measuring the precisely predicted ratio RK, can probe mass scales beyond the reach of direct-production searches with current experimental resources.

The new measurement is the most precise to date in the high-q² region

In the decay process B± → K±ℓ+ℓ−, the final-state leptons can also originate from an intermediate resonant state, such as a J/ψ or ψ(2S). These resonant channels occur through tree-level Feynman diagrams. Their contributions significantly outnumber the non-resonant FCNC processes and are not expected to be affected by NP. RK is therefore measured in ranges of dilepton invariant mass squared (q²) that exclude these resonances, to preserve sensitivity to potential NP effects in FCNC processes.

The new result from the LHCb collaboration measures RK in the high-q² region, above the ψ(2S) resonance. The high-q² data has a different composition of backgrounds compared to the low-q² data, leading to different strategies for their rejection and modelling, and different systematic effects. With RK expected to be unity in all domains in the SM, low-q² and high-q² measurements offer powerfully complementary constraints on the magnitude of LFU-violating NP in rare B decays.

The new measurement of RK agrees with the SM prediction of unity and is the most precise to date in the high-q² region (figure 1). It complements a refined analysis below the J/ψ resonance published by LHCb in 2023, which also reported RK consistent with unity. Both results use the complete proton–proton collision data collected by LHCb from 2011 to 2018. They lay the groundwork for even more precise measurements with data from Run 3 and beyond.

A new record for precision on B-meson lifetimes

ATLAS figure 1

As direct searches for physics beyond the Standard Model continue to push frontiers at the LHC, the b-hadron physics sector remains a crucial source of insight for testing established theoretical models.

The ATLAS collaboration recently published a new measurement of the B0 lifetime using B0 → J/ψK*0 decays from the entire Run-2 dataset it has recorded at 13 TeV. The result improves the precision of previous world-leading measurements by the CMS and LHCb collaborations by a factor of two.

Studies of b-hadron lifetimes probe our understanding of the weak interaction. The lifetimes of b-hadrons can be systematically computed within the heavy-quark expansion (HQE) framework, where b-hadron observables are expressed as a perturbative expansion in inverse powers of the b-quark mass.

ATLAS measures the “effective” B0 lifetime, which represents the average decay time incorporating effects from mixing and CP contributions, as τ(B0) = 1.5053 ± 0.0012 (stat.) ± 0.0035 (syst.) ps. The result is consistent with previous measurements published by ATLAS and other experiments, as summarised in figure 1. It also aligns with theoretical predictions from HQE and lattice QCD, as well as with the experimental world average.

The analysis benefitted from the large Run-2 dataset and a refined trigger selection, enabling the collection of an extensive sample of 2.5 million B0 → J/ψK*0 decays. Events with a J/ψ meson decaying into two muons with sufficient transverse momentum are cleanly identified in the ATLAS Muon Spectrometer by the first-level hardware trigger. In the next-level software trigger, which exploits the full detector information, these muons are combined with two tracks measured by the Inner Detector, ensuring they originate from the same vertex.

The B0-meson lifetime is determined through a two-dimensional unbinned maximum-likelihood fit, utilising the measured B0-candidate mass and decay time, and accounting for both signal and background components. The limited hadronic particle-identification capability of ATLAS requires careful modelling of the significant backgrounds from other processes that produce J/ψ mesons. The sensitivity of the fit is increased by estimating the uncertainty of the decay-time measurement provided by the ATLAS tracking and vertexing algorithms on a per-candidate basis. The resulting lifetime measurement is limited by systematic uncertainties, with the largest contributions arising from the correlation between B0 mass and lifetime, and ambiguities in modelling the mass distribution. 

ATLAS combined its measurement with the average decay width (Γs) of the light and heavy Bs-meson mass eigenstates, also measured by ATLAS, to determine the ratio of decay widths as Γd/Γs = 0.9905 ± 0.0022 (stat.) ± 0.0036 (syst.) ± 0.0057 (ext.). The result is consistent with unity and provides a stringent test of QCD predictions, which also support a value near unity.
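The stat./syst./ext. components quoted above can be combined in quadrature to judge the overall consistency with unity (a sketch, assuming the components are independent):

```python
import math

def total_uncertainty(*components):
    """Combine independent uncertainty components in quadrature."""
    return math.sqrt(sum(c**2 for c in components))

ratio = 0.9905
sigma = total_uncertainty(0.0022, 0.0036, 0.0057)  # stat., syst., ext.
pull = (1.0 - ratio) / sigma
print(f"Gamma_d/Gamma_s = {ratio} +/- {sigma:.4f} ({pull:.1f} sigma from unity)")
```

The deviation from unity comes out well below 2σ, consistent with the QCD expectation that the two decay widths are nearly equal.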

Beyond Bohr and Einstein

When I was an undergraduate physics student in the mid-1980s, I fell in love with the philosophy of quantum mechanics. I devoured biographies of the greats of early-20th-century atomic physics – physicists like Bohr, Heisenberg, Schrödinger, Pauli, Dirac, Fermi and Born. To me, as I was struggling with the formalism of quantum mechanics, there seemed to be something so exciting, magical even, about that era, particularly those wonder years of the mid-1920s when its mathematical framework was being developed and the secrets of the quantum world were revealing themselves.

I went on to do a PhD in nuclear reaction theory, which meant I spent most of my time working through mathematical derivations, becoming familiar with S-matrices, Green’s functions and scattering amplitudes, scribbling pages of angular-momentum algebra and coding in Fortran 77. And I loved that stuff. There certainly seemed to be little time for worrying about what was really going on inside atomic nuclei. Indeed, I was learning that even the notion of something “really going on” was a vague one. My generation of theoretical physicists were still being very firmly told to “shut up and calculate”, as many adherents of the Copenhagen school of quantum mechanics were keen to advocate. To be fair, so much progress has been made over the past century, in nuclear and particle physics, quantum optics, condensed-matter physics and quantum chemistry, that philosophical issues were seen as an unnecessary distraction. I recall one senior colleague, frustrated by my abiding interest in interpretational matters, admonishing me with: “Jim, an electron is an electron is an electron. Stop trying to say more about it.” And there certainly seemed to be very little in the textbooks I was reading about unresolved issues arising from such topics as the EPR (Einstein–Podolsky–Rosen) paradox and the measurement problem, let alone any analysis of the work of Hugh Everett and David Bohm, who were regarded as mavericks. The Copenhagen hegemony ruled supreme.

What I wasn’t aware of until later in my career was that a community of physicists had indeed continued to worry and think about such matters. These physicists were doing more than just debating and philosophising – they were slowly advancing our understanding of the quantum world. Experimentalists such as Alain Aspect, John Clauser and Anton Zeilinger were devising ingenious experiments in quantum optics – all three were awarded the Nobel Prize in 2022 for their tests of John Bell’s famous inequality, which says a lot about how long it has taken us to acknowledge their contribution. Meanwhile, theorists such as Wojciech Zurek, Erich Joos, Dieter Zeh, Abner Shimony and Asher Peres, to name just a few, were formalising ideas on entanglement and decoherence theory. It is certainly high time that quantum-mechanics textbooks – even advanced undergraduate ones – contained these new insights.

Quantum Drama

All of which brings me to Quantum Drama, a new popular-science book and collaboration between the physicist and science writer Jim Baggott and the late historian of science John L Heilbron. In terms of level, the book is at the higher end of the popular-science market and, as such, will probably be of most interest to, for example, readers of CERN Courier. If I have a criticism of the book it is that its level is not consistent. For it tries to be all things. On occasion, it has wonderful biographical detail, often of less well-known but highly deserving characters. It is also full of wit and new insights. But then sometimes it can get mired in technical detail, such as in the lengthy descriptions of the different Bell tests, which I imagine only professional physicists are likely to fully appreciate.

Having said that, the book is certainly timely. This year the world celebrates the centenary of quantum mechanics, marking the momentous papers of Heisenberg and Schrödinger on matrix and wave mechanics, published in 1925 and 1926, respectively. Progress in quantum information theory and in the development of new quantum technologies is also gathering pace, with the promise of quantum computers, quantum sensing and quantum encryption getting ever closer. All this provides an opportunity for the philosophy of quantum mechanics to finally emerge from the shadows into mainstream debate again.

A new narrative

So, what makes Quantum Drama stand out from other books that retell the story of quantum mechanics? Most historical accounts tend to focus only on the golden age between 1900 and 1927, which came to an end at the Solvay Conference in Brussels and those well-documented few days when Einstein and Bohr debated what it all means. While these two giants of 20th-century physics make the front cover of the book, Quantum Drama takes the story on beyond that famous conference. Other accounts, both popular and scholarly, tend to push the narrative that Bohr won the argument, leaving generations of physicists with the idea that the interpretational issues had been resolved – apart, that is, from the odd dissenting voice from the likes of Everett or Bohm, who tried, unsuccessfully it was argued, to put a spanner in the Copenhagen works. All the real progress in quantum physics after 1927, or so we were told, was in the development of quantum field theories such as QED and QCD, the excitement of high-energy physics and the birth of the Standard Model, with the likes of Murray Gell-Mann and Steven Weinberg replacing Heisenberg and Schrödinger at centre stage. Quantum Drama takes up the story after 1927, showing that there has been a lively, exciting and ongoing dispute over what it all means, long after the deaths of those two giants of physics. In fact, the period up to Solvay 1927 is dealt with entirely in Act I of the book. The subtitle puts it well: From the Bohr–Einstein Debate to the Riddle of Entanglement.

The Bohr–Einstein debate is still very much alive and kicking

All in all, Quantum Drama delivers something remarkable, for it shines a light on all the muddle, complexity and confusion surrounding a century of debate about the meaning of quantum mechanics and the famous “Copenhagen spirit”, treating the subject with thoroughness and genuine scholarship, and showing that the Bohr–Einstein debate is still very much alive and kicking.

Guido Barbiellini 1936–2024

Guido Barbiellini

Guido Barbiellini Amidei, who passed away on 15 November 2024, made fundamental contributions to both particle physics and astrophysics.

In 1959 Guido earned a degree in physics from Rome University with a thesis on electron bremsstrahlung in monocrystals under Giordano Diambrini, a skilled experimentalist and excellent teacher. Another key mentor was Marcello Conversi, spokesperson for one of the detectors at the Adone electron–positron collider at INFN Frascati, where Guido became a staff member and developed the first luminometer based on small-angle electron–positron scattering – a technique still used today. Together with Shuji Orito, he also built the first double-tagging system for studying gamma-ray collisions.

Guido later spent several years at CERN, collaborating with Carlo Rubbia, first on the study of K-meson decays at the Proton Synchrotron and then on small-angle proton–proton scattering at the Intersecting Storage Rings. In 1974 he proposed an experiment in a field that was new to him: neutrino–electron scattering, a fundamental but extremely rare phenomenon known from a handful of events seen in Gargamelle. To distinguish electromagnetic showers from hadronic ones, the CHARM collaboration built a “light” calorimeter made of 150 tonnes of Carrara marble. From 1979 to 1983, 200 electron–neutrino scattering events were recorded.

In 1980 Guido remarked to his friend Ugo Amaldi: “Why don’t we start our own collaboration for LEP instead of joining others?” This suggestion sparked the genesis of the DELPHI collaboration, in which Guido played a pivotal role in defining its scientific objectives and overseeing the construction of the barrel electromagnetic calorimeter. He also contributed significantly to the design of the luminosity monitors. Above all, Guido was a constant driving force within the experiment, offering innovative ideas for fundamental physics during the transition to LEP’s higher-energy phase, and engaging tirelessly with both young students and senior colleagues.

Guido’s insatiable scientific curiosity also extended to CP symmetry violation. In 1989 he co-organised a workshop, with Konrad Kleinknecht and Walter Hoogland, exploring the possibility of an electron–positron ϕ-factory to study CP violation in neutral kaon decays. Two of his papers, with Claudio Santoni, laid the groundwork for constructing the DAΦNE collider in Frascati.

The year 1987 was a turning point for Guido. Firstly, he became a professor at the University of Trieste. Secondly, the detection of neutrinos produced by Supernova 1987A inspired a letter, published in Nature in collaboration with Giuseppe Cocconi, in which it was established that neutrinos have a charge smaller than 10⁻¹⁷ elementary charges. Thirdly, Guido presented a new idea to mount silicon detectors (which he had encountered through work done in DELPHI by Bernard Hyams and Peter Weilhammer) on the International Space Station or a spacecraft to detect cosmic rays and their showers, which led to a seminal paper.

At the beginning of the 1990s, an international collaboration for a large NASA space mission focused on gamma-ray astrophysics (initially named GLAST) began to form, led by SLAC scientists. Guido was among the first proponents and later was the national representative of many INFN groups. The mission, later renamed Fermi, was launched in 2008 and continues to produce significant insights in topics ranging from neutron stars and black holes to dark-matter annihilation.

Beyond GLAST, Guido was captivated by the application of silicon sensors to a new programme of small space missions initiated by the Italian Space Agency. The AGILE gamma-ray astrophysics mission, for which Guido was co-principal investigator, was conceived and approved during this period. Launched in 2007, AGILE made numerous discoveries over nearly 17 years, including identifying the origin of hadronic cosmic rays in supernova remnants and discovering novel, rapid particle acceleration phenomena in the Crab Nebula.

Guido’s passion for physics made him inexhaustible. He always brought fresh insights and thoughtful judgments, fostering a collaborative environment that enriched all the projects he took part in. He was not only a brilliant physicist but also a true gentleman of calm and mild manners, widely appreciated as a teacher and as director of INFN Trieste. Intellectually free and always smiling, he conveyed determination and commitment with grace and a profound dedication to nurturing young talents. He will be deeply missed.

Meinhard Regler 1941–2024

Meinhard Regler

Meinhard Regler, an expert in detector development and software analysis, passed away on 22 September 2024 at the age of 83.

Born and raised in Vienna, Meinhard studied physics at the Technical University Vienna (TUW) and completed his master’s thesis on deuteron acceleration in a linac at CERN. In 1966 he joined the newly founded Institute of High Energy Physics (HEPHY) of the Austrian Academy of Sciences. He settled in Geneva to participate in a counter experiment at the CERN Proton Synchrotron, and in 1970 obtained his PhD with distinction from TUW.

In 1970 Meinhard became a staff member in CERN’s data-handling division. He joined the Split Field Magnet experiment at the Intersecting Storage Rings and, together with HEPHY, contributed specially designed multi-wire proportional chambers. Early on, he realised the importance of rigorous statistical methods for track and vertex reconstruction in complex detectors, resulting in several seminal papers.

In 1975 Meinhard returned to Vienna as leader of HEPHY’s experimental division. From 1993 until his retirement at the end of 2006 he was deputy director and responsible for the detector development and software analysis groups. As a faculty member of TUW he created a series of specialised lectures and practical courses, which shaped a generation of particle physicists. In 1978 Meinhard and Georges Charpak founded the Wire Chamber Conference, now known as the Vienna Conference on Instrumentation (VCI).

Meinhard continued his participation in experiments at CERN, including WA6, UA1 and the European Hybrid Spectrometer. After joining the DELPHI experiment at LEP, he realised the emerging potential of semiconductor tracking devices and established this technology at HEPHY. First applied at DELPHI’s Very Forward Tracker, this expertise was successfully continued with important contributions to the CMS tracker at LHC, the Belle vertex detector at KEKB and several others.

Meinhard was the author or co-author of several hundred scientific papers. His and his group’s contributions to track and vertex reconstruction are summarised in the standard textbook Data Analysis Techniques for High-Energy Physics, published by Cambridge University Press and translated into Russian and Chinese.

All that would suffice for a lifetime achievement, but not so for Meinhard. Inspired by the fall of the Iron Curtain, he envisaged the creation of an international centre of excellence in the Vienna region. Initially planned as a spallation neutron source, the project eventually transmuted into a facility for cancer therapy by proton and carbon-ion beams, called MedAustron. Financed by the province of Lower Austria and the hosting city of Wiener Neustadt, and with crucial scientific and engineering support from CERN and Austrian institutes, clinical treatment started in 2016.

Meinhard received several prizes and was awarded the highest scientific decoration of Austria

Meinhard was invited as a lecturer to many international conferences and post-graduate schools worldwide. He chaired the VCI series, organised several accelerator schools and conferences in Austria, and served on the boards of the European Physical Society’s international group on accelerators. For his tireless scientific efforts, and in particular the realisation of MedAustron, Meinhard received several prizes and was awarded the highest scientific decoration of Austria – the Honorary Cross for Science and Arts, First Class.

He was also a co-founder and long-term president of a non-profit organisation in support of mentally handicapped people. His character was incorruptible, strictly committed to truth and honesty, and responsive to loyalty, independent thinking and constructive criticism.

In Meinhard Regler we have lost an enthusiastic scientist, visionary innovator, talented organiser, gifted teacher, great humanist and good friend. His legacy will forever stay with us.

Iosif Khriplovich 1937–2024

Renowned Soviet/Russian theorist Iosif Khriplovich passed away on 26 September 2024, aged 87. Born in 1937 in Ukraine to a Jewish family, he graduated from Kiev University and moved to the newly built Akademgorodok in Siberia. From 1959 to 2014 he was a prominent member of the theory department at the Budker Institute of Nuclear Physics. He combined his research with teaching at Novosibirsk University, where he also held a professorship from 1983 to 2009. In 2014 he moved to St. Petersburg to take up a professorial position at Petersburg University, and was a corresponding member of the Russian Academy of Sciences from 2000.

In a paper published in 1969, Khriplovich was the first to discover the phenomenon of anti-screening in the SU(2) Yang–Mills theory by calculating the first loop correction to the charge renormalisation. This immediately translates into the crucial first coefficient (–22/3) of the Gell-Mann–Low function and asymptotic freedom of the theory.

Regrettably, Khriplovich did not follow this interpretation of his result even after the key SLAC experiment on deep inelastic scattering and its subsequent partonic interpretation by Feynman. The honour of the discovery of asymptotic freedom in QCD went to the three authors of papers published in 1973, who seemingly did not know of Khriplovich’s calculations.

In the early 1970s, Khriplovich’s interests turned to fundamental questions on the way towards the Standard Model. One was whether the electroweak theory is described by the Weinberg–Salam model, with neutral currents interacting via Z bosons, or the Georgi–Glashow model without them. While neutrino scattering on nucleons was soon confirmed, the electron interaction with nucleons remained unchecked. One practical way to find out was to use atomic spectroscopy to look for any mixing between states of opposite parity. Actively entering this area, Khriplovich and his students worked out quantitative predictions for the rotation of laser polarisation due to the weak interaction between electrons and nucleons. Their predictions were triumphantly confirmed in experiments, first by Barkov and Zolotorev at the Budker Institute. The same parity-violating interaction was later observed at SLAC in 1978, proving Z-exchange and the Weinberg–Salam model beyond any doubt. In 1973, together with Arkady Vainshtein, Khriplovich also derived the first solid limit on the mass of the charm quark, which was unexpectedly discovered the following year.

He became engaged in Yang–Mills theories at a time when very few people were interested in them

The work of Khriplovich and his group significantly advanced the theory of many-electron atoms and contributed to the subsequent studies of the violation of fundamental symmetries in processes involving elementary particles, atoms, molecules and atomic nuclei. His students and later close collaborators, such as Victor Flambaum, Oleg Sushkov and Maxim Pospelov, grew as strong physicists who made important contributions to various subfields of theoretical physics. He was awarded the Silver Dirac Medal by the University of New South Wales (Sydney) and the Pomeranchuk Prize by the Institute of Theoretical and Experimental Physics (Moscow).

Yulik, as he was affectionately known, had his own style in physics. He was feisty and focused on issues where he could become a trailblazer, unafraid to cut relations with scientists of any rank if he felt their behaviour did not match his high ethical standards. This is why he became engaged in Yang–Mills theories at a time when very few people were interested in them. Yet, Yulik was always graceful and respectful in his interactions with others, and smiling, as we would like to remember him.

Strategy symposium shapes up

Registration is now open for the Open Symposium of the 2026 update to the European Strategy for Particle Physics (ESPP). It will take place from 23 to 27 June at Lido di Venezia in Italy, and will see scientists from around the world debate the inputs to the ESPP (see “A call to engage”).

The symposium will begin by surveying the implementation of the last strategy process, whose recommendations were approved by the CERN Council in June 2020. In-depth working-group discussions on all areas of physics and technology will follow.

The rest of the week will see plenary sessions on the different physics and technology areas, starting with the various proposals for possible large accelerator projects at CERN, and the status and plans of other regions of the world. Open questions, and how they can be addressed by the proposed projects, will be presented in rapporteur talks, followed by longer discussion blocks in which the full community can engage. On the final day, members of the European Strategy Group will summarise the national inputs to the ESPP and other overarching topics.
