
Muons cooled and accelerated in Japan

In a world first, a research group working at the J-PARC laboratory in Tokai, Japan, has cooled and accelerated a beam of antimatter muons (µ+). Though muon cooling was first demonstrated by the Muon Ionisation Cooling Experiment in the UK in 2020 (CERN Courier March/April 2020 p7), this is the first time that the short-lived cousins of the electron have been accelerated after cooling – an essential step for applications in particle physics.

The cooling method is ingenious – and completely different to ionisation cooling, where muons are focused in absorbers to reduce their transverse momentum. Instead, µ+ are slowed to 0.002% of the speed of light in a thin silica-aerogel target, capturing atomic electrons to form muonium, an atom-like compound of an antimatter muon and an electron. Experimenters then ionise the muonium using a laser to create a near monochromatic beam that is reaccelerated in radiofrequency (RF) cavities. The work builds on the acceleration of negative muonium ions – an antimatter muon bonded to two electrons – which the team demonstrated in 2017 (CERN Courier July/August 2018 p8).

Though the analysis is still to be finalised, with results due to be published soon, the cooling and acceleration effect is unmistakable. In accelerator physics, cooling is traditionally quantified by a reduction in beam emittance – an otherwise conserved quantity that reflects the volume occupied by the beam in the abstract space of orthogonal displacements and momenta. Estimates indicate a beam cooling effect of more than an order of magnitude, with the beam then accelerated from 25 meV to 100 keV. The main challenge is transmission: at present, only one antimatter muon emerges from the RF cavities for every 10 million that impact the aerogel. Muon decay is also a challenge given that the muonium is nearly stationary in the laboratory frame, with time dilation barely extending the muon’s 2.2 μs lifetime. Roughly a third of the µ+ decay before exiting the J-PARC apparatus.
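The kinematics quoted here can be checked with a back-of-envelope relativistic calculation. A minimal sketch, where the only input not stated in the text is the muon rest energy of about 105.7 MeV:

```python
import math

MUON_REST_ENERGY_EV = 105.658e6  # muon rest energy in eV

def beta(kinetic_energy_ev: float) -> float:
    """Speed as a fraction of c for a muon with the given kinetic energy."""
    gamma = 1.0 + kinetic_energy_ev / MUON_REST_ENERGY_EV
    return math.sqrt(1.0 - 1.0 / gamma**2)

# Thermalised muonium at ~25 meV: roughly 0.002% of the speed of light.
print(f"beta at 25 meV:  {beta(0.025):.1e}")
# After reacceleration to 100 keV: a few per cent of the speed of light.
print(f"beta at 100 keV: {beta(100e3):.3f}")
```

The 25 meV input reproduces the 0.002%-of-c figure quoted for the thermalised muonium.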

The first application of this technology will be the muon g-2/EDM experiment at J-PARC, where data taking is due to start in 2028. The experiment will add valuable data points to measurements thought to have exceptional sensitivity to new physics (CERN Courier May/June 2021 p25). In the case of the anomalous magnetic moment (g-2) of the muon, theoretical showdowns later this year may either dissipate or reinforce intriguing hints of beyond-the-Standard-Model physics from the Muon g-2 experiment at Fermilab, potentially adding strong motivation to an independent test.


“Although our current focus is the muon g-2/EDM experiment, we are open to any possible applications of this technology in the future,” says spokesperson Tsutomu Mibe of KEK. “We are communicating with experts to understand if our technology is of any use in a muon collider, but note that our method cannot be adapted for negative muons.”

While proposals for a µ+µ+ or µ+e− collider exist, a µ+µ− collider remains the most strongly motivated machine. “Much of the physics interest in e+e− and µ+µ− colliders comes from the annihilations of the initial particles into a photon and/or a Z boson, or a Higgs boson in the case of µ+µ−,” says John Ellis of CERN/KCL. “These possibilities are absent for a µ+e− or µ+µ+ collider, making them less interesting in my opinion.” From an accelerator-physics perspective, it remains to be demonstrated that the technique can deliver the beam intensity needed for an energy-frontier collider – not least while keeping the emittance low.

“We are very impressed with the progress of our colleagues at J-PARC and congratulate them on their success,” says International Muon Collider study leader Daniel Schulte of CERN. “This will profit the development of muon-beam technology and use. We are in contact to understand how we can collaborate.”

The next 10 years in astroparticle theory


Astroparticle physics connects the extremely small with the extremely large. At the interface of particle physics, cosmology and astronomy, the field ties particles and interactions to the hot Big Bang cosmological model. This synergy allows us to go far beyond the limitations of terrestrial probes in our quest to understand nature at its most fundamental level. A typical example is neutrino masses, where constraints from cosmological observations of large-scale structure formation far exceed current bounds from terrestrial experiments. Astroparticle theory (APT) has accelerated quickly in the past 10 years. And this looks certain to continue in the next 10.

Today, neutrino masses, dark matter and the baryon asymmetry of the universe are the only evidence we have of physics beyond the Standard Model (BSM) of particle physics. Astroparticle theorists study how to extend the theory towards a new Standard Model – and the cosmological consequences of doing so.

New insights

For a long time, work on dark matter focused on TeV-scale models parallel to searches at the LHC and in ultra-low-noise detectors. The scope has now broadened to a much larger range of masses and models, from ultralight dark matter and axions to sub-GeV dark matter and WIMPs. Theoretical developments have gone hand-in-hand with new experimental opportunities. In the next 10 years, much larger detectors are planned for WIMP searches aiming towards the neutrino floor. Pioneering experimental efforts, even borrowing techniques from atomic and condensed-matter physics, test dark matter with much lower masses, providing new insights into what dark matter may be made of.


Neutrinos provide a complementary window on BSM physics. It is just over 25 years since the discovery of neutrino oscillation provided evidence that neutrinos have mass – a fact that cannot be accounted for in the SM (CERN Courier May/June 2024 p29). But the origin of neutrino masses remains a mystery. In the coming decade, neutrinoless double-beta decay experiments and new large experiments, such as JUNO, DUNE (see “A gold mine for neutrino physics“) and Hyper-Kamiokande, will provide a much clearer picture, determining the mass ordering and potentially revealing the neutrino’s nature (Dirac or Majorana) and whether it violates CP symmetry. These results may, via leptogenesis, be related to the origin of the matter–antimatter asymmetry of the universe.

Recently, there has been renewed interest in models with scales accessible to current particle-physics experiments. These models will be probed using the powerful beams and capable detectors of the current and future experimental neutrino programme, and by collider-based searches for heavy neutral leptons with MeV-to-TeV masses.

Overall, while the multi-TeV scale should continue to be a key focus for both particle and astroparticle physics experiments, I strongly welcome the theoretical and experimental efforts to broaden the reach in mass scales to efficiently hunt for any hint of what the new physics BSM may be.

Silvia Pascoli

Astroparticle physics also studies the particles that arrive on Earth from all around our universe. They come from extreme astrophysical environments, such as supernovae and active galactic nuclei, where they may be generated and accelerated to the highest energies. Thanks to their detection we can study the processes that fuel these astrophysical objects and gain an insight into their evolution (see “In defiance of cosmic-ray power laws“).

The discovery of gravitational waves (GWs) in 2015 has shed new light on this field. Together with gamma rays, cosmic rays and the high-energy neutrinos detected at IceCube, the field of multi-messenger astronomy is in full bloom. In the coming years it will get a boost from the results of new, large experiments such as KM3NeT, the Einstein Telescope, LISA and the Cherenkov Telescope Array – as well as many new theoretical developments, such as advanced particle-theory techniques for GW predictions.

In the field of GWs, last year’s results from pulsar timing arrays indicate the presence of a stochastic background of GWs. What is its origin? Is it of astrophysical nature or does it come from some dramatic event in the early universe, such as a strong first-order phase transition? In this latter case, we would be getting a glimpse of the universe when it was just born, opening up a new perspective on fundamental particles and interactions. Could it be that we have seen a new GeV-scale dark sector at work? It is too early to tell. But this is very exciting.

LHC physicists spill the beans in Boston

Dedicated solely to LHC physics, the LHCP conference is a vital gathering for experts in the field. The 12th edition was no exception, attracting 450 physicists to Northeastern University in Boston from 3 to 7 June. Participants discussed recent results, data taking at a significantly increased instantaneous luminosity in Run 3, and progress on detector upgrades planned for the high-luminosity LHC (HL-LHC).

The study of the Higgs boson remains central to the LHC programme. ATLAS reported a new result on Standard Model (SM) Higgs-boson production with decays to tau leptons, achieving the most precise single-channel measurement of the vector-boson-fusion production mode to date. Determining the production modes of the Higgs boson precisely may shed light on the existence of new physics that would be observed as deviations from the SM predictions.

Beyond single Higgs production, the di-Higgs production (HH) search is one of the most exciting and fundamental topics for LHC physics in the coming years as it directly probes the Higgs potential (see “Homing in on the Higgs self-interaction“). ATLAS has combined results for HH production in multiple final states, providing the best-expected sensitivity to the HH production cross-section and Higgs-boson self-coupling, constraining κλ (the ratio of the Higgs self-coupling to its SM value) to the range −1.2 < κλ < 7.2.

The search for beyond-the-SM (BSM) physics to explain the many unresolved questions about our universe is being conducted with innovative ideas and methods. CMS has presented new searches involving signatures with two tau leptons, examining the hypotheses of an excited tau lepton and a heavy neutral spin-1 gauge boson (Z′) produced via Drell–Yan and, for the first time, via vector boson fusion. These results set stringent constraints on BSM models with enhanced couplings to third-generation fermions.

Other new-physics theoretical models propose additional BSM Higgs bosons. ATLAS presented a search for such particles being produced in association with top quarks, setting limits on their cross-section that significantly improve upon previous ATLAS results. Additional BSM Higgs bosons could explain puzzles such as dark matter, neutrino oscillations and the observed matter–antimatter asymmetry in the universe.

The dark side

Some BSM models imply that dark-matter particles could arise as composite mesons or baryons of a new strongly coupled theory that extends the SM. ATLAS investigated this dark sector through searches for high-multiplicity hadronic final states, providing the first direct collider constraints on this model, complementing results from direct dark-matter-detection experiments.

CMS have used low-pileup inelastic proton–proton collisions to measure event-shape variables related to the overall distribution of charged particles. These measurements showed the particle distribution to be more isotropic than predicted by theoretical models.


The LHC experiments also presented multiple analyses of proton–lead (p–Pb) and pp collisions, exploring the potential production of quark–gluon plasma (QGP) – a hot and dense phase of deconfined quarks and gluons, present in the early universe, that is routinely studied in heavy-ion Pb–Pb collisions at the LHC. Whether it can also be created in these smaller collision systems remains an open question.

ALICE reported a high-precision measurement of the elliptic flow of anti-helium-3 in QGP using the first Run-3 Pb–Pb run. The much larger data sample compared to the previous Run 2 measurement allowed ALICE to distinguish production models for these rarely produced particles for the first time. ALICE also reported the first measurement of an impact-parameter-dependent angular anisotropy in the decay of coherently photo-produced ρ0 mesons in ultra-peripheral Pb–Pb collisions. In these collisions, quantum interference effects cause a decay asymmetry that is inversely proportional to the impact parameter.

CMS reported its first measurement of the complete set of optimised CP-averaged observables from the process B0 → K*0μ+μ−. These measurements are significant because they could reveal indirect signs of new physics or subtle effects induced by low-energy strong interactions. By matching the current best experimental precision, CMS contributes to the ongoing investigation of this process.

LHCb presented measurements of the local and non-local contributions across the full invariant-mass spectrum of B0 → K*0μ+μ−, tests of lepton flavour universality in semileptonic b decays, and mixing and CP violation in D0 → Kπ decays.


From a theoretical perspective, progress in precision calculations has exceeded expectations. Many processes are now known to next-to-next-to-leading order or even next-to-next-to-next-to-leading order (N3LO) accuracy. The first parton distribution functions approximating N3LO accuracy have been released and reported at LHCP, and modern parton showers have set new standards in perturbative accuracy.

In addition to these advances, several new ideas and observables are being proposed. Jet substructure, for instance, is becoming a precision science and valuable tool due to its excellent theoretical properties. Effective field theory (EFT) methods are continuously refined and automated, serving as crucial bridges to new theories as many ultraviolet theories share the same EFT operators. Synergies between flavour physics, electroweak effects and high-transverse-momentum processes at colliders are particularly evident within this framework. The use of the LHC as a photon collider showcases the extraordinary versatility of LHC experiments and their synergy with theoretical advancements.

Discovery machine

The HL-LHC upgrade was thoroughly discussed, with several speakers highlighting the importance and uniqueness of its physics programme. This includes fundamental insights into the Higgs potential, vector-boson scattering, and precise measurements of the Higgs boson and other SM parameters. Thanks to the tireless efforts of the four collaborations to improve the performance of their experiments, the LHC already rivals historic lepton colliders for electroweak precision in many channels, despite the cleaner signatures of lepton collisions. The HL-LHC will be capable of providing extraordinarily precise measurements while also serving as a discovery machine for many years to come.

The future of the field was discussed in a well-attended panel session, which emphasised exploring the full potential of the HL-LHC and engaging younger generations. Preserving the unique expertise and knowledge cultivated within the CERN community is imperative. Next year’s LHCP conference will be held at National Taiwan University in Taipei from 5 to 10 June.

Sustainable accelerator project underway

Particle accelerators have become essential instruments to improve our health, the environment, our safety and our high-tech abilities, as well as unlocking new, fundamental insights into physics, chemistry and biology, and generally enabling scientific breakthroughs that will improve our lives. Accelerating particles to higher energies will always require a large amount of energy. In a society where energy sustainability is critical, keeping energy consumption as low as is reasonably possible is an unavoidable challenge for both research infrastructures (RIs) and industry, which collectively operate more than 40,000 accelerators.

Going green

Based on state-of-the-art technology, the portfolio of current and future accelerator-driven RIs in Europe could develop to consume up to 1% of Germany’s annual electricity demand. With the ambition to maintain the attractiveness and competitiveness of European RIs, and enable Europe’s Green Deal, the Innovate for Sustainable Accelerating Systems (iSAS) project has been approved by Horizon Europe. Its aim is to establish an enhanced collaboration in the field to broaden, expedite and amplify the development and impact of novel energy-saving technologies to accelerate particles.

In general terms, a particle accelerator has a system to create the particles to be accelerated, a system preparing beams with these particles, an accelerating system that effectively accelerates the particle beams, a magnet system to steer the beam, an experimental facility using the particles, and finally a beam dump. In linear accelerating structures, most of the electrical power taken from the grid to operate the accelerator is used by the accelerating system itself.

The core of an accelerating system is a series of cavities that can deliver a high-gradient electric field. For many modern accelerators, these cavities are superconducting and therefore cryogenically cooled to about 2 K. They are powered with radio frequency (RF) power generators to deliver the field at a specific frequency and accordingly to provide energy to the particle beams as they traverse. These superconducting RF (SRF) systems are the enabling technology for frontier accelerators, but are energy-intensive devices where only a fraction of the power extracted from the grid is effectively transmitted to the accelerated particles. In addition, the beam energy is radiated by recirculating beams and ultimately dumped and lost. As an example, the European XFEL’s superconducting RF system uses 5–6 MW for 0.1 MW of average beam power, leading to a power conversion of less than 3%.
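The sub-3% conversion figure follows directly from the numbers quoted for the European XFEL. A trivial check (the function name is ours, for illustration only):

```python
def wall_plug_efficiency(grid_power_mw: float, beam_power_mw: float) -> float:
    """Fraction of the power drawn from the grid that ends up in the beam."""
    return beam_power_mw / grid_power_mw

# European XFEL figures quoted in the text: 5-6 MW of RF-system power
# for 0.1 MW of average beam power, i.e. a conversion below 3%.
for grid_mw in (5.0, 6.0):
    print(f"{wall_plug_efficiency(grid_mw, 0.1):.1%}")
```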

The objective of iSAS is to innovate those technologies that have been identified as being a common core of SRF accelerating systems and that have the largest leverage for energy savings with a view to minimising the intrinsic energy consumption in all phases of operation. In the landscape of accelerator-driven RIs, solutions are being developed to reuse the waste heat produced, develop energy-efficient magnets and RF power generators, and operate facilities on opportunistic schedules when energy is available on the grid. The iSAS project has a complementary focus on the energy efficiency of the SRF accelerating technologies themselves. This will contribute to the vital transition to sustain the tremendous 20th-century applications of accelerator technology in an energy-conscious 21st century.

Interconnected technologies

Based on a recently established European R&D roadmap for accelerator technology and on a collaboration between leading European research institutions and industry, several interconnected technologies will be developed, prototyped and tested, each enabling significant energy savings on its own in accelerating particles. The collection of energy-saving technologies will be developed with a portfolio of forthcoming applications in mind, and to explore energy-saving improvements in accelerator-driven RIs. Building on these developments, the new technologies will be coherently integrated into the parametric design of a new accelerating system – a linac SRF cryomodule – optimised to achieve high beam power in accelerators with an energy consumption that is as low as reasonably possible. This new cryomodule design will enable Europe to develop and build future energy-sustainable accelerators and particle colliders.


On 15 and 16 April, the iSAS kick-off meeting was held at IJCLab (Orsay, France) with around 100 participants. Each of the working groups enthusiastically presented their R&D plans and, in all cases, concrete work has begun. To save energy in the RF power systems, novel fast-reacting tuners are being developed to compensate rapidly for detuning of the cavity’s frequency caused by mechanical vibrations, along with methods to integrate them into smart digital control systems. To save energy in the cryogenics, and building on the ongoing Horizon Europe I.FAST project, superconducting cavities with thin films of Nb3Sn are being further developed to operate with high performance at 4.2 K instead of 2 K, thereby reducing the grid power needed to operate the cryogenic system: removing a given heat load at 4.2 K requires roughly three times less cooling power than at 2 K. Finally, to save energy from the accelerated particle beam itself, the technology of energy-recovery linacs (ERLs) is being improved to operate efficiently with high-current beams by developing novel higher-order-mode dampers that significantly reduce heat loads in the cavities.
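The cryogenic saving can be illustrated with the ideal (Carnot) limit on the compressor work needed to extract heat at a given temperature. This sketch reproduces only the ideal part of the gain; the factor of roughly three also reflects the lower plant efficiency of real 2 K superfluid-helium systems, and the 300 K room temperature is our assumption:

```python
def carnot_work_per_watt(t_cold_k: float, t_warm_k: float = 300.0) -> float:
    """Ideal (Carnot) compressor work, in W, to remove 1 W of heat at t_cold_k."""
    return (t_warm_k - t_cold_k) / t_cold_k

w_2k = carnot_work_per_watt(2.0)    # ~149 W of grid power per watt of heat
w_42k = carnot_work_per_watt(4.2)   # ~70 W of grid power per watt of heat
print(f"ideal-limit gain from 2 K to 4.2 K: x{w_2k / w_42k:.1f}")
```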


To address the engineering challenges related to the integration of the new energy-saving technologies, an existing ESS cryovessel will be equipped with new cavities and novel dampers, and the resulting linac SRF cryomodule will be tested in operation in the PERLE accelerator at IJCLab (Orsay, France). PERLE is a growing international collaboration to demonstrate the performance of ERLs with high-power beams that would enable applications in future particle colliders. Its first phase is being implemented at IJCLab with the objective to have initial beams in 2028.

The timescale to innovate, prototype and test new accelerator technologies is inherently long, in some cases longer than the typical duration of R&D projects. It is therefore essential to continue to collaborate and enhance the R&D process so that energy-sustainable technologies can be implemented without delay, to avoid hampering the scientific and industrial progress enabled by accelerators. Accordingly, iSAS plans co-development with industrial partners to jointly achieve a technology readiness level that will be sufficient to enter the large-scale production phase of these new technologies.

Empowering industry

While the readiness of several energy-saving technologies will be prepared towards industrialisation with impact on current RIs, iSAS is also a pathfinder for sustainable future SRF particle accelerators and colliders. Through inter- and multidisciplinary research that delivers and combines various technologies, it is the long-term ambition of iSAS to reduce the energy footprint of SRF accelerators in future RIs by half, and even more when the systems are integrated in ERLs. Accordingly, iSAS will help maintain Europe’s leadership for breakthroughs in fundamental sciences and enable high-energy collider technology to go beyond the current frontiers of energy and intensity in an energy-sustainable way. In parallel, the new sustainable technologies will empower and stimulate European industry to conceive a portfolio of new applications and take a leading role in, for example, the semiconductor, particle therapy, security and environmental sectors.

Intrigue in charm hadronisation

ALICE figure 1

Quantum chromodynamics (QCD) is one of the pillars of the Standard Model of particle physics, but much remains to be understood about its emergent behaviours, and theoretical calculations often disagree. A new result from the ALICE collaboration has now added fresh intrigue to interpretations of hadronisation – the process by which quarks and gluons become confined inside colour-neutral groupings such as baryons and mesons.

The production of heavy charm and beauty quarks in proton–proton collisions at the LHC is a rather fast process (~7 × 10−24 s) that can be described by perturbative QCD calculations. On the other hand, the transformation of heavy quarks into hadrons requires substantially more time (~3 × 10−23 s). This separation of time scales has motivated the idea that the hadronisation of heavy quarks is independent of the colliding system and collision energy. However, the production of baryons carrying a heavy quark in proton–proton collisions at the LHC has been found to be enhanced compared to more elementary e+e− collisions. This surprising finding seems to invalidate the concept of universal hadronisation of heavy quarks, which is an important basis for calculations of particle production in QCD.

A new dimension

Heavy-flavour baryons carrying charm and strange quarks add a new dimension to these measurements, though such measurements are challenging because of low production rates. Due to the short lifetime of charm baryons (typically a fraction of a picosecond), they are usually observed through the detection of their decay products. The probability that they decay into a particular set of daughter particles, known as the branching ratio (BR), is poorly known for many of the strange-charm baryons. Precise knowledge of the branching ratio is crucial for interpreting the production measurements of these baryons.

Recently, the ALICE collaboration has measured the production of Ωc0 (css) baryons via the semileptonic decay channel Ωc0 → Ω−e+νe (and its charge-conjugate modes) as a function of transverse momentum (pT) in proton–proton collisions at 13 TeV at midrapidity (|y| < 0.8). The Ωc0 candidates are built by pairing an electron or positron candidate track with an Ω− baryon candidate using a Kalman-filter vertexing algorithm. The Ω− candidates are reconstructed via the cascading decay chain Ω− → ΛK−, followed by the decay Λ → pπ−. The missing momentum of the neutrino was corrected for using an unfolding technique. Figure 1 shows the invariant-mass distribution of the Ωc0 candidates.

ALICE figure 2

Figure 2 compiles measurements of the decay by CLEO, Belle and now ALICE. Due to the lack of an absolute BR, results are quoted relative to the BR of Ωc0 → Ω−π+. Combined with the earlier measurement of Ωc0 → Ω−π+, the relative probability of the two decay modes is obtained: BR(Ωc0 → Ω−e+νe)/BR(Ωc0 → Ω−π+) = 1.12 ± 0.22 (stat.) ± 0.27 (syst.). The Belle and CLEO collaborations have measured this ratio to be 1.98 ± 0.13 (stat.) ± 0.08 (syst.) and 2.4 ± 1.1 (stat.) ± 0.2 (syst.), respectively. Model predictions using the light-front approach and light-cone sum rules give values of 1.1 ± 0.2 and 0.71, respectively. Another approach calculates the decay modes and probabilities of charmed-baryon decays based on SU(3)f flavour symmetry in the quark model, resulting in a computed branching-fraction ratio of 1.35.
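The tension between the ALICE and Belle values can be verified by combining statistical and systematic uncertainties in quadrature. A quick sketch using the numbers above:

```python
import math

def tension_sigma(v1: float, err1: float, v2: float, err2: float) -> float:
    """Significance of the difference between two independent measurements."""
    return abs(v1 - v2) / math.hypot(err1, err2)

# stat. and syst. uncertainties combined in quadrature for each experiment
alice_val, alice_err = 1.12, math.hypot(0.22, 0.27)
belle_val, belle_err = 1.98, math.hypot(0.13, 0.08)
print(f"{tension_sigma(alice_val, alice_err, belle_val, belle_err):.1f} sigma")
```

This reproduces the roughly 2σ-level difference between the two results.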

The ALICE result is consistent with theory calculations and is 2.3σ lower than the more precise value reported by the Belle collaboration. The present measurement provides constraints on the decay probabilities of the Ωc0 baryons. It demonstrates that such measurements are now possible at the LHC with a precision similar to that at e+e colliders.

With the ongoing Run 3 at the LHC and thanks to the recent upgrades, ALICE is on the way to collecting a data sample that is about a thousand times larger for these types of analyses, which will enable more precise measurements of other decay modes. Thanks to these data, we expect to resolve the question of universal hadronisation in the near future.

Zooming in on leptonic W decays

ATLAS figure 1

In the Standard Model of particle physics, the three charged lepton flavours couple to the electroweak gauge bosons W and Z with the same strength – an idea known as lepton flavour universality (LFU). This implies that differences in the rates of processes involving W or Z bosons together with electrons, muons and tau leptons should arise only from differences in the leptons’ masses. Experimental results agree with LFU at the 0.1–0.2% level in the decays of tau leptons, kaons and pions, but hints of deviations have been seen in B-meson decays, for example in the combination of measurements of B → D(*)τν and B → D(*)μν decays at the BaBar, Belle and LHCb experiments.

The W and Z bosons are so heavy that the probabilities for them to decay to electrons, muons and tau leptons are expected to be equal to very high precision, if LFU holds. This implies that the ratios of these probabilities such as R(μ/e), which compares W → μν and W → eν, and R(τ/μ), which compares W → τν and W → μν, should be unity. Experiments at the LEP electron–positron collider measured a surprisingly large value of R(τ/μ) = 1.070 ± 0.026, but a more precise measurement from the ATLAS collaboration at the LHC found R(τ/μ) = 0.992 ± 0.013, in agreement with LFU. This measurement made use of the large sample of top-quark pair events produced at ATLAS during Run 2 of the LHC from 2015 to 2018. These top-quark events can be cleanly selected, with each event containing two W bosons and two b-quarks produced from the decays of the top quarks.

In a new measurement, ATLAS has turned its attention to the comparison of W decays to muons and electrons, via the ratio R(μ/e). The collaboration again used top-quark pair events as a clean and copious source of W bosons. Counting the number of events with one electron from W → eν, one muon from W → μν, and one or two b-tagged jets provides the cleanest way to measure the rate of top-quark pair production. But this rate can also be measured from the number of top-quark pair events with two electrons or two muons. If R(μ/e) = 1 and W → eν and W → μν decays occur with equal probability, the rates of such ee and μμ events should be the same, after correcting for detector efficiencies. Any difference would suggest a violation of LFU.

Some measurement uncertainties have similar effects on the ee and μμ final states, so they largely cancel in the ratio R(μ/e). However, electrons and muons behave in very different ways in the ATLAS detector, giving different detection efficiencies with differing and uncorrelated uncertainties that do not cancel in the ratio. To reduce the sensitivity of the measured R(μ/e) to these effects, the double ratio R(μ/e)/√R(μμ/ee) was measured first, where R(μμ/ee) corresponds to the comparison of Z → μμ and Z → ee decay probabilities, determined from the same dataset. The final R(μ/e) was then obtained by making use of the very precise measurement of R(μμ/ee) from the LEP experiments and the SLD experiment at SLAC, which has an uncertainty of only 0.0028. This latter ratio acts as a calibration of the relative detection efficiencies of electrons and muons in ATLAS, reducing the associated uncertainties in R(μ/e).
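The calibration step amounts to simple error propagation: the measured double ratio D = R(μ/e)/√R(μμ/ee) is multiplied by √R(μμ/ee) from LEP/SLD, whose already small uncertainty enters at only half weight because of the square root. A sketch, in which the input values of D and the external R(μμ/ee) central value are illustrative (only the 0.0028 uncertainty is from the text):

```python
import math

def recover_r_mu_e(double_ratio: float, dr_err: float,
                   r_z: float = 1.0009, r_z_err: float = 0.0028):
    """R(mu/e) = D * sqrt(R_Z), with relative uncertainties combined in
    quadrature; the square root halves the weight of the R_Z uncertainty."""
    value = double_ratio * math.sqrt(r_z)
    rel_err = math.hypot(dr_err / double_ratio, 0.5 * r_z_err / r_z)
    return value, value * rel_err

# Illustrative double-ratio input: the external R_Z contributes only
# ~0.0014 in relative uncertainty, so a precise D stays precise.
value, err = recover_r_mu_e(0.9991, 0.0043)
print(f"R(mu/e) = {value:.4f} +/- {err:.4f}")
```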

The final result from this new ATLAS analysis is R(μ/e) = 0.9995 ± 0.0045, perfectly compatible with unity. The measurement is compared to previous results from LHC and LEP experiments (see figure 1). Thanks to the large data sample and careful control of all systematic uncertainties, it improves on the uncertainty of 0.006 from all previous measurements combined. At least in W decays, LFU survives intact.

CMS studies single-top production

CMS figure 1

As the most massive known elementary particle, the top quark is a focus for precision measurements and searches for new phenomena. At the LHC, top quarks are copiously produced in pairs via quantum chromodynamic (QCD) interactions and, to a much lesser extent, singly through the electroweak force. Precisely measuring the single-top cross section provides a stringent test of the electroweak sector of the Standard Model (SM) of particle physics.

In September 2022, only four months after the start of Run 3, the CMS collaboration released the first measurement using data at the new collision energy of 13.6 TeV: the production cross section of a top quark together with its antiparticle (tt̄). The collaboration can now also report a measurement of the production of a single top quark in association with a W boson (tW), based on the full dataset recorded in 2022. As well as testing the electroweak sector, constraining tW allows it to be better disentangled from the dominant tt̄ process – a channel where precision measurements improve our knowledge of higher-order corrections in perturbative QCD.

CMS figure 2

tW is a challenging measurement as it is 10 times less likely than tt production but has almost the same detection signature. This analysis selects events where both the top quark and the W boson ultimately decay to leptons. The signal therefore consists of two leptons (electrons or muons), a jet initiated by a bottom quark, and possibly extra jets coming from additional radiation. No single observable can discriminate the signal from the background, so a random forest (RF) is employed in events that contain either one or two jets, one of which comes from a bottom quark. The RF is an ensemble of decision trees that together distinguish the tW signal from the tt background. The output of the RF, for events with one jet identified as coming from a bottom quark, is shown in figure 1. The higher the RF discriminant, the higher the relative proportion of signal events.
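The idea of an ensemble discriminant can be illustrated with a deliberately minimal toy: an ensemble of decision stumps, each trained on a bootstrap resample of events with a few synthetic Gaussian "kinematic" features. This is a sketch of the random-forest concept only, not the CMS analysis or its actual input variables:

```python
import random

random.seed(1)

def make_event(is_signal):
    # signal ("tW-like") and background ("tt-like") events drawn
    # from overlapping Gaussian feature distributions (illustrative)
    mu = 1.0 if is_signal else 0.0
    return [random.gauss(mu, 1.0) for _ in range(3)], is_signal

data = [make_event(i % 2 == 0) for i in range(2000)]

def train_stump(sample):
    # pick the feature/threshold pair that best separates the classes
    best = None
    for f in range(3):
        for cut in [-0.5, 0.0, 0.5, 1.0]:
            correct = sum((x[f] > cut) == y for x, y in sample)
            if best is None or correct > best[0]:
                best = (correct, f, cut)
    _, f, cut = best
    return lambda x: x[f] > cut

# each "tree" sees a different bootstrap sample of the training data
forest = []
for _ in range(25):
    boot = [random.choice(data) for _ in range(len(data))]
    forest.append(train_stump(boot))

def discriminant(x):
    # fraction of trees voting "signal": higher means more tW-like
    return sum(t(x) for t in forest) / len(forest)

sig = [discriminant(x) for x, y in data if y]
bkg = [discriminant(x) for x, y in data if not y]
mean_sig = sum(sig) / len(sig)
mean_bkg = sum(bkg) / len(bkg)
```

As in figure 1, signal events cluster at high discriminant values and background at low ones; the real analysis uses full decision trees over reconstructed kinematic observables rather than simple stumps.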

To achieve a higher precision, an extra handle is used to control the tt background: information from events with two b-quark jets. Such events are more likely to come from the decay of a tt pair. The measurement yields a precise value for the tW cross section. Figure 2 shows tW cross-section measurements by CMS at different centre-of-mass energies, including the new measurement in proton–proton collisions at 13.6 TeV. All measurements are consistent with state-of-the-art theory calculations. The first tW measurement at the new LHC energy frontier uses only part of the data but is already as precise as the earlier measurement, which used the entire Run 2 sample at 13 TeV. Exploiting the full Run 3 data sample will push the precision frontier forward and provide an even more stringent SM probe in the top quark sector.

LHCb squeezes D-meson mixing

LHCb figure 1

The weak force, unlike the other fundamental forces, has a distinctive feature: its interactions differ slightly between quarks and antiquarks. This phenomenon, known as CP violation, allows for an asymmetry in the likelihood of a process occurring with matter compared to its antimatter counterpart, which is an essential requirement to explain the large dominance of matter in the universe. However, the size of CP violation predicted by the Standard Model (SM), and in accordance with experimental measurements so far, is not large enough to explain this cosmological imbalance. This is why physicists are actively searching for new sources of CP violation and striving to improve our understanding of the known ones. The phenomenology offered by the quantum-mechanical oscillations of neutral mesons into their antimatter counterparts, the antimesons, provides a particularly rich experimental ground for such studies.

The LHCb collaboration recently measured, with unprecedented precision, a set of parameters that determine the matter–antimatter oscillation of the neutral D0 meson into its antimeson, the D̄0. This enables the search for the predicted, but hitherto unobserved, CP violation in this oscillation.

D0 mesons are composed of a charm quark and an up antiquark. Their oscillations are extremely slow, with an oscillation period over a thousand times longer than their lifetime. As a result, only very few D0 mesons transform before they decay. Oscillations are therefore identified as extremely small changes in the flavour mixture – matter or antimatter – as a function of the time at which the D0 or the D̄0 decays.

In LHCb’s analysis, the initial matter–antimatter flavour of the neutral meson is experimentally inferred from the charge of the accompanying pion in the CP-conserving decay chains D*(2010)+ → D0π+ and D*(2010)− → D̄0π−. The mixing effect (or oscillation) then appears as a decay-time dependence of the ratio, R, of the number of “suppressed” and “favoured” decay processes of the neutral meson. The suppressed decays can occur with or without a net oscillation of the D0 meson, while the favoured decays are largely dominated by the direct process. In the absence of mixing, this ratio is predicted to be constant as a function of the D0 decay time while, in the case of mixing, it approximately follows a parabolic behaviour, increasing with time. Figure 1 shows the ratio R, including data for both matter (R+ for D0 → K+π−) and antimatter (R− for D̄0 → K−π+) processes, and corresponding model predictions. The variation depends not only on the oscillation parameters but also on the various observables of CP violation, which differentiate between matter and antimatter.
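The parabolic behaviour follows from the standard small-mixing expansion of the suppressed-to-favoured ratio as a function of decay time t in units of the D0 lifetime τ. The sketch below uses round illustrative parameter values, not the LHCb results:

```python
import math

# Standard small-mixing expansion for D0 -> K pi decays:
#   R(t/tau) ~ R_D + sqrt(R_D) * y' * (t/tau) + (x'^2 + y'^2)/4 * (t/tau)^2
# Parameter values are illustrative round numbers only.
R_D = 3.4e-3   # ratio of suppressed to favoured decay rates at t = 0
x_p = 4e-3     # mixing parameter x'
y_p = 5e-3     # mixing parameter y'

def R(t_over_tau):
    return (R_D
            + math.sqrt(R_D) * y_p * t_over_tau
            + (x_p**2 + y_p**2) / 4 * t_over_tau**2)

# With x' = y' = 0 the ratio is flat at R_D; with mixing it grows
# approximately quadratically with decay time, as in figure 1.
```

Fitting the separate time dependence of R+ and R− then gives access to both the mixing parameters and the CP-violating differences between them.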

This analysis is the most precise measurement of these parameters to date, improving the uncertainty on both mixing and CP-violating observables by a factor of 1.6 compared to the previous best result, also by LHCb. This improvement is largely due to an unprecedentedly large sample of about 1.6 million suppressed decays and 421 million favoured decays collected during Run 2, making LHCb unique in probing up-type quark transitions. The results confirm the matter–antimatter oscillation of the D0 meson and show no evidence of CP violation in the oscillation.

These findings call for future analyses of this and other decays of the D0 meson using data from the third and fourth runs of the LHC, exploiting the potential of the currently operating detector upgrade (Upgrade I). The detector upgrade proposed for the fifth and sixth runs of the LHC (Upgrade II) would provide a sample six times larger, yielding the precision needed to definitively test the predictions of the SM.

XFELs join hunt for axion-like particles

Bounds on axion–photon coupling

A first-of-its-kind experiment performed at the European X-Ray Free-Electron Laser (European XFEL) in Hamburg, Germany, has placed new constraints on axion-like particles in a mass range that is relatively unconstrained by laboratory searches. While similar searches have been performed at advanced storage-ring-based synchrotron X-ray sources, the new study exploits the higher brightness of the European XFEL’s beams to improve the sensitivity of axion searches in the 10⁻³–10⁴ eV mass range.

The axion is predicted to arise from the breaking of Peccei–Quinn symmetry, proposed in the mid-1970s to explain the observed absence of CP violation in strong interactions. Indeed, axion-like particles (ALPs) appear in any quantum field theory with a spontaneously broken global symmetry and arise naturally in many models based on string theory. They are also a promising candidate for dark matter. As such, ALPs are the target of a growing number and variety of experiments worldwide. While not yet able to reach the sensitivity of astrophysical experiments, lab-based searches are less model-dependent as they enable direct control of the axion production process.

Most laboratory searches for axions exploit the Primakoff effect: photons in the presence of a strong external electric field convert into axions, which then convert back into photons after passing through an opaque wall. This “light shining through a wall” technique has been employed in experiments with optical lasers and external magnetic fields, such as ALPS (and now ALPS II) at DESY and OSQAR at CERN. Stringent bounds on heavy axions have also been placed by the CERN Axion Solar Telescope, which looked for the conversion of photons to axions in the strong magnetic field of an LHC dipole magnet pointed at the Sun, and constraints have been set by accelerator experiments such as Belle II at KEK and NA64 at CERN.

The use of X-rays can increase the detection sensitivity by exploiting the strong electric fields (up to 10¹¹ V m⁻¹, which corresponds to magnetic field strengths of order 1 kT) present in crystalline materials. Gianluca Gregori of the University of Oxford and co-workers used the European XFEL’s HED/HiBEF instrument, in which axion production and photon regeneration are expected to take place via the electric field within a pair of germanium crystals. Orienting the crystals such that their lattice planes are parallel to one another leads to a coherent effect analogous to Bragg scattering, while the much shorter duration and higher brightness of photon pulses from the European XFEL compared to previous synchrotron X-ray experiments allows for a more accurate discrimination of the signal against background.

Using three days of beam time, the team was able to improve on previous lab-based searches at several discrete axion masses. For masses greater than about 200 eV, the team claims to have surpassed the sensitivity of bounds from all previous searches for lab-generated axions except those at NA64. Further improvements in sensitivity – for example by enabling a higher X-ray flux and bunch number, and by cooling the first crystal to extend the data-acquisition time – are possible, says the team, perhaps bringing the estimated bounds close to the expectation for QCD axions to be dark matter.

“This study shows the power of XFELs, alongside their principal role in more applied domains, to probe fundamental physics mysteries,” says Gregori. “This experiment required a difficult interpretation of a non-standard measurement, and it is hoped that further work will improve on these first limits.”

Detectors in Particle Physics: A Modern Introduction

Detectors in Particle Physics: A Modern Introduction

Progress in elementary particle physics is driven by the development of radiation-detection technologies. From early photographic emulsions to the gargantuan modern systems that are deployed at particle accelerators and astrophysics experiments, radiation detectors use extraordinary means to disclose the nature and fundamental interactions of elementary particles.

In Detectors in Particle Physics, Georg Viehhauser and Tony Weidberg offer an accessible and comprehensive introduction to this intricate world. Addressed to graduate students in particle and nuclear physics, and more advanced researchers, this book provides the knowledge needed to understand and appreciate these indispensable tools. Building on their personal contributions to the conception, construction and operation of major detector systems at the DELPHI and ATLAS detectors at CERN, the authors review basic physics principles to enable the reader to grasp the fundamental operating mechanisms of gaseous, liquid and semiconductor detectors, as well as systems for particle identification and calorimetry.

In addition to exploring core concepts in detector physics, another objective of the book is to introduce the reader to case studies of applications in particle physics and astrophysics. From the Large Hadron Collider to neutrino experiments, the University of Oxford-based authors connect theoretical physics to practical applications and present real-world examples of modern detectors, bridging the gap between theory and experimentation. The book describes key practical aspects of particle detectors, including electronics, alignment, calibration and simulation. These practical insights enhance the reader’s understanding of how detectors operate in experiments, and each chapter includes practical exercises to help further the reader’s understanding of the subject.

Detectors in Particle Physics offers a unique blend of theoretical foundations and practical considerations. Whether you’re fascinated by the mysteries of the universe or planning a career in experimental physics, Viehhauser and Weidberg’s book will undoubtedly prove a valuable resource.
