Dedicated solely to LHC physics, the LHCP conference is a vital gathering for experts in the field. The 12th edition was no exception, attracting 450 physicists to Northeastern University in Boston from 3 to 7 June. Participants discussed recent results, data taking at a significantly increased instantaneous luminosity in Run 3, and progress on detector upgrades planned for the high-luminosity LHC (HL-LHC).
The study of the Higgs boson remains central to the LHC programme. ATLAS reported a new result on Standard Model (SM) Higgs-boson production with decays to tau leptons, achieving the most precise single-channel measurement of the vector-boson-fusion production mode to date. Determining the production modes of the Higgs boson precisely may shed light on the existence of new physics that would be observed as deviations from the SM predictions.
Beyond single Higgs production, the di-Higgs production (HH) search is one of the most exciting and fundamental topics for LHC physics in the coming years as it directly probes the Higgs potential (see “Homing in on the Higgs self-interaction”). ATLAS has combined results for HH production in multiple final states, providing the best expected sensitivity to the HH production cross-section and Higgs-boson self-coupling, and constraining κλ (the Higgs self-coupling with respect to the SM value) to the range –1.2 < κλ < 7.2.
The search for beyond-the-SM (BSM) physics to explain the many unresolved questions about our universe is being conducted with innovative ideas and methods. CMS has presented new searches involving signatures with two tau leptons, examining the hypotheses of an excited tau lepton and a heavy neutral spin-1 gauge boson (Z′) produced via Drell-Yan and, for the first time, via vector boson fusion. These results set stringent constraints on BSM models with enhanced couplings to third-generation fermions.
Other new-physics theoretical models propose additional BSM Higgs bosons. ATLAS presented a search for such particles being produced in association with top quarks, setting limits on their cross-section that significantly improve upon previous ATLAS results. Additional BSM Higgs bosons could explain puzzles such as dark matter, neutrino oscillations and the observed matter–antimatter asymmetry in the universe.
The dark side
Some BSM models imply that dark-matter particles could arise as composite mesons or baryons of a new strongly-coupled theory that is an extension of the SM. ATLAS investigated this dark sector through searches for high-multiplicity hadronic final states, providing the first direct collider constraints on this model to complement direct dark-matter-detection experimental results.
CMS has used low-pileup inelastic proton–proton collisions to measure event-shape variables related to the overall distribution of charged particles. These measurements showed the particle distribution to be more isotropic than predicted by theoretical models.
The LHC experiments also presented multiple analyses of proton–lead (p–Pb) and pp collisions, exploring the potential production of quark–gluon plasma (QGP) – a hot and dense phase of deconfined quarks and gluons that filled the early universe and is routinely studied in heavy-ion Pb–Pb collisions at the LHC. Whether it can also be created in smaller collision systems remains an open question.
ALICE reported a high-precision measurement of the elliptic flow of anti-helium-3 in the QGP using data from the first Pb–Pb run of Run 3. The much larger data sample compared to the previous Run 2 measurement allowed ALICE to distinguish between production models for these rarely produced particles for the first time. ALICE also reported the first measurement of an impact-parameter-dependent angular anisotropy in the decay of coherently photo-produced ρ0 mesons in ultra-peripheral Pb–Pb collisions. In these collisions, quantum interference effects cause a decay asymmetry that is inversely proportional to the impact parameter.
CMS reported its first measurement of the complete set of optimised CP-averaged observables from the process B0→ K*0μ+μ–. These measurements are significant because they could reveal indirect signs of new physics or subtle effects induced by low-energy strong interactions. By matching the current best experimental precision, CMS contributes to the ongoing investigation of this process.
LHCb presented measurements of the local and non-local contributions across the full invariant-mass spectrum of B0→ K*0μ+μ–, tests of lepton flavour universality in semileptonic b decays, and mixing and CP violation in D → Kπ decays.
From a theoretical perspective, progress in precision calculations has exceeded expectations. Many processes are now known to next-to-next-to-leading order or even next-to-next-to-next-to-leading order (N3LO) accuracy. The first parton distribution functions approximating N3LO accuracy have been released and reported at LHCP, and modern parton showers have set new standards in perturbative accuracy.
In addition to these advances, several new ideas and observables are being proposed. Jet substructure, for instance, is becoming a precision science and valuable tool due to its excellent theoretical properties. Effective field theory (EFT) methods are continuously refined and automated, serving as crucial bridges to new theories as many ultraviolet theories share the same EFT operators. Synergies between flavour physics, electroweak effects and high-transverse-momentum processes at colliders are particularly evident within this framework. The use of the LHC as a photon collider showcases the extraordinary versatility of LHC experiments and their synergy with theoretical advancements.
Discovery machine
The HL-LHC upgrade was thoroughly discussed, with several speakers highlighting the importance and uniqueness of its physics programme. This includes fundamental insights into the Higgs potential, vector-boson scattering, and precise measurements of the Higgs boson and other SM parameters. Thanks to the tireless efforts of the four collaborations to improve their detectors’ performance, the LHC already rivals historic lepton colliders for electroweak precision in many channels, despite the cleaner signatures of lepton collisions. The HL-LHC will be capable of providing extraordinarily precise measurements while also serving as a discovery machine for many years to come.
The future of the field was discussed in a well-attended panel session, which emphasised exploring the full potential of the HL-LHC and engaging younger generations. Preserving the unique expertise and knowledge cultivated within the CERN community is imperative. Next year’s LHCP conference will be held at National Taiwan University in Taipei from 5 to 10 June.
Quantum chromodynamics (QCD) is one of the pillars of the Standard Model of particle physics, but much remains to be understood about its emergent behaviours, and theoretical calculations often disagree. A new result from the ALICE collaboration has now added fresh intrigue to interpretations of hadronisation – the process by which quarks and gluons become confined inside colour-neutral groupings such as baryons and mesons.
The production of heavy charm and beauty quarks in proton–proton collisions at the LHC is a rather fast process (~7 × 10–24 s) and is amenable to perturbative QCD calculations. The transformation of heavy quarks into hadrons, on the other hand, requires substantially more time (~3 × 10–23 s). This separation of time scales has motivated the idea that the hadronisation process of heavy quarks is independent of the colliding system and collision energy. However, the production of baryons carrying a heavy quark in proton–proton collisions at the LHC has been found to be enhanced compared to more elementary e+e– collisions. This surprising finding seems to invalidate the concept of universal hadronisation of heavy quarks, which is an important basis for calculations of particle production in QCD.
A new dimension
Heavy-flavour baryons carrying charm and strange quarks add a new dimension to these measurements. Such measurements are challenging because they suffer from low production rates. Due to the short lifetime of charm baryons (typically a fraction of a picosecond), they are usually observed through the detection of their decay products. The probability that they decay into a particular set of daughter particles, known as the branching ratio (BR), is poorly known for many of the strange-charm baryons. Precise knowledge of the branching ratio is crucial for interpreting the production measurements of these baryons.
Recently, the ALICE collaboration measured the production of Ωc0 (css) baryons via the semileptonic decay channel Ωc0→ Ω–e+νe (and its charge-conjugate mode) as a function of transverse momentum (pT) in proton–proton collisions at 13 TeV at midrapidity (|y| < 0.8). The Ωc0 candidates are built by pairing an electron or positron candidate track with an Ω baryon candidate using a Kalman-filter vertexing algorithm. The Ω candidates are reconstructed via the cascading decay chain Ω–→ ΛK–, followed by the decay Λ→ pπ–. The momentum carried away by the undetected neutrino was corrected for using an unfolding technique. Figure 1 shows the invariant-mass distribution of the Ωc0 candidates.
Figure 2 compiles measurements of the decay by CLEO, Belle and now ALICE. Due to the lack of an absolute BR, results are quoted relative to the BR of Ωc0→ Ω–π+. Combined with the earlier measurement of Ωc0→ Ω–π+, the relative probability of the two decay modes is obtained: BR(Ωc0→ Ω–e+νe)/BR(Ωc0→ Ω–π+) = 1.12 ± 0.22 (stat.) ± 0.27 (syst.). The Belle and CLEO collaborations have measured this ratio to be 1.98 ± 0.13 (stat.) ± 0.08 (syst.) and 2.4 ± 1.1 (stat.) ± 0.2 (syst.). Model predictions using the light-front approach and light-cone sum rules predict values of 1.1 ± 0.2 and 0.71, respectively. Another approach calculates decay modes and probabilities of charmed-baryon decays based on SU(3)f flavour symmetry in the quark model, resulting in a computed branching fraction ratio of 1.35.
The ALICE result is consistent with theory calculations and is 2.3σ lower than the more precise value reported by the Belle collaboration. The present measurement provides constraints on the decay probabilities of the Ωc0 baryons. It demonstrates that such measurements are now possible at the LHC with a precision similar to that at e+e– colliders.
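The quoted 2.3σ tension can be reproduced by adding statistical and systematic uncertainties in quadrature and comparing the two central values — a simplified cross-check that assumes uncorrelated Gaussian uncertainties:

```python
import math

def total(stat, syst):
    # Statistical and systematic uncertainties added in quadrature
    return math.hypot(stat, syst)

# BR(Omega_c0 -> Omega- e+ nu_e) / BR(Omega_c0 -> Omega- pi+)
alice, alice_err = 1.12, total(0.22, 0.27)
belle, belle_err = 1.98, total(0.13, 0.08)

# Significance of the difference between the two measurements
pull = abs(alice - belle) / math.hypot(alice_err, belle_err)
print(f"tension: {pull:.1f} sigma")  # tension: 2.3 sigma
```

The ALICE uncertainty is dominated by its systematic component, which is why the combined tension stays modest despite the large gap between the central values.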
With the ongoing Run 3 at the LHC and thanks to the recent upgrades, ALICE is on the way to collecting a data sample that is about a thousand times larger for these types of analyses, which will enable more precise measurements of other decay modes. Thanks to these data, we expect to resolve the question of universal hadronisation in the near future.
In the Standard Model of particle physics, the three charged lepton flavours couple to the electroweak gauge bosons W and Z with the same strength – an idea known as lepton flavour universality (LFU). This implies that differences in the rates of processes involving W or Z bosons together with electrons, muons and tau leptons should arise only from differences in the leptons’ masses. Experimental results agree with LFU at the 0.1–0.2% level in the decays of tau leptons, kaons and pions, but hints of deviations have been seen in B-meson decays, for example in the combination of measurements of B → D(*)τν and B → D(*)μν decays at the BaBar, Belle and LHCb experiments.
The W and Z bosons are so heavy that the probabilities for them to decay to electrons, muons and tau leptons are expected to be equal to very high precision, if LFU holds. This implies that the ratios of these probabilities such as R(μ/e), which compares W → μν and W → eν, and R(τ/μ), which compares W → τν and W → μν, should be unity. Experiments at the LEP electron–positron collider measured a surprisingly large value of R(τ/μ) = 1.070 ± 0.026, but a more precise measurement from the ATLAS collaboration at the LHC found R(τ/μ) = 0.992 ± 0.013, in agreement with LFU. This measurement made use of the large sample of top-quark pair events produced at ATLAS during Run 2 of the LHC from 2015 to 2018. These top-quark events can be cleanly selected, with each event containing two W bosons and two b-quarks produced from the decays of the top quarks.
In a new measurement, ATLAS has turned its attention to the comparison of W decays to muons and electrons, via the ratio R(μ/e). The collaboration again used top-quark pair events as a clean and copious source of W bosons. Counting the number of events with one electron from W → eν, one muon from W → μν, and one or two b-tagged jets provides the cleanest way to measure the rate of top-quark pair production. But this rate can also be measured from the number of top-quark pair events with two electrons or two muons. If R(μ/e) = 1, i.e. W → eν and W → μν decays occur with equal probability, the rates of such ee and μμ events should be the same, after correcting for detector efficiencies. Any difference would suggest a violation of LFU.
Some measurement uncertainties have similar effects on the ee and μμ final states, so they largely cancel in the ratio R(μ/e). However, electrons and muons behave in very different ways in the ATLAS detector, giving different detection efficiencies with differing and uncorrelated uncertainties that do not cancel in the ratio. To reduce the sensitivity of the measured R(μ/e) to these effects, the double ratio R(μ/e)/√R(μμ/ee) was measured first, where R(μμ/ee) corresponds to the comparison of Z → μμ and Z → ee decay probabilities, determined from the same dataset. The final R(μ/e) was then obtained by making use of the very precise measurement of R(μμ/ee) from the LEP experiments and the SLD experiment at SLAC, which has an uncertainty of only 0.0028. This latter ratio acts as a calibration of the relative detection efficiencies of electrons and muons in ATLAS, reducing the associated uncertainties in R(μ/e).
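To see why the calibration helps, note that the final ratio is recovered as R(μ/e) = D × √R(μμ/ee), where D is the measured double ratio, so the 0.0028 uncertainty on the external LEP/SLD value enters with only half weight. A minimal error-propagation sketch (the ATLAS-side uncertainty below is an illustrative, hypothetical number):

```python
import math

def r_mu_e(D, sigma_D, R_Z=1.0, sigma_Z=0.0028):
    """R(mu/e) = D * sqrt(R_Z), where D is the measured double
    ratio and R_Z = R(mumu/ee) is taken from LEP/SLD."""
    R = D * math.sqrt(R_Z)
    # Relative uncertainties add in quadrature; the square root
    # halves the weight of the external Z-ratio uncertainty.
    rel = math.hypot(sigma_D / D, sigma_Z / (2 * R_Z))
    return R, R * rel

# With a hypothetical double-ratio uncertainty of 0.0043, the 0.0028
# calibration uncertainty contributes only ~0.0014 in quadrature.
R, sigma = r_mu_e(D=0.9995, sigma_D=0.0043)
```

Under these toy inputs the total uncertainty comes out near 0.0045, illustrating how little the external calibration inflates the final error.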
The final result from this new ATLAS analysis is R(μ/e) = 0.9995 ± 0.0045, perfectly compatible with unity. The measurement is compared to previous results from LHC and LEP experiments (see figure 1). Thanks to the large data sample and careful control of all systematic uncertainties, it improves on the uncertainty of 0.006 from all previous measurements combined. At least in W decays, LFU survives intact.
Being the most massive known elementary particle, top quarks are a focus for precision measurements and searches for new phenomena. At the LHC, they are copiously produced in pairs via quantum chromodynamic (QCD) interactions, and, to a much lesser extent, in single modes through the electroweak force. Precisely measuring the single-top cross section provides a stringent test for the electroweak sector of the Standard Model (SM) of particle physics.
In September 2022, only four months after the start of Run 3, the CMS collaboration released the first measurement using data at the new collision energy of 13.6 TeV: the production cross section of a top quark together with its antiparticle (tt). The collaboration can now also report a measurement of the production of a single top quark in association with a W boson (tW), based on the full dataset recorded in 2022. As well as testing the electroweak sector, constraining tW allows it to be better disentangled from the dominant tt process – a channel where precision improves our knowledge of higher orders of accuracy in perturbative QCD.
tW is a challenging measurement as its production is 10 times less likely than tt production yet has almost the same detection signature. This analysis selects events where both the top quark and the W boson ultimately decay to leptons. The signal therefore consists of two leptons (electrons or muons), a jet initiated by a bottom quark, and possibly extra jets coming from additional radiation. No single observable can discriminate the signal from the background, so a random forest (RF) is employed in events that contain either one or two jets, one of which comes from a bottom quark. The RF is a collection of decision trees that collaborate to distinguish the tW signal from the tt background. The output of the RF, for events with one jet identified as coming from a bottom quark, is shown in figure 1. The higher the RF discriminant, the higher the relative proportion of signal events.
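As an illustration of the idea — not the CMS implementation — a random forest can be sketched as many shallow decision trees, each trained on a bootstrap resample of the events, whose averaged vote forms the discriminant (all data and cut values below are toy numbers):

```python
import random
import statistics

random.seed(0)

# Toy events with one made-up kinematic feature; the "signal" (tW-like)
# distribution is shifted relative to the "background" (tt-like) one.
background = [random.gauss(0.0, 1.0) for _ in range(2000)]
signal = [random.gauss(1.0, 1.0) for _ in range(2000)]
data = [(x, 0) for x in background] + [(x, 1) for x in signal]

def train_stump(sample):
    """Depth-1 'tree': pick the cut that best separates the classes."""
    best_cut, best_acc = 0.0, 0.0
    for cut in [i / 10 - 2 for i in range(41)]:  # scan cuts in [-2, 2]
        acc = sum((x > cut) == bool(label) for x, label in sample) / len(sample)
        if acc > best_acc:
            best_cut, best_acc = cut, acc
    return best_cut

# Random forest in miniature: many stumps, each fit to a bootstrap resample.
forest = [train_stump(random.choices(data, k=len(data))) for _ in range(50)]

def discriminant(x):
    """Fraction of trees voting 'signal' -- higher means more tW-like."""
    return statistics.mean(x > cut for cut in forest)
```

A real analysis uses many correlated event variables and deep trees, but the principle is the same: events with a high averaged vote populate the signal-enriched end of the discriminant distribution.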
To achieve higher precision, an extra handle is used to control the tt background: information from events with two b-quark jets. Such events are more likely to come from the decay of a tt pair. The measurement yields a precise value for the tW cross section. Figure 2 shows tW cross-section measurements by CMS at different centre-of-mass energies, including the new measurement in proton–proton collisions at 13.6 TeV. All measurements are consistent with state-of-the-art theory calculations. The first tW measurement at the new LHC energy frontier uses only part of the data but is already as precise as the earlier measurement, which used the entire Run 2 sample at 13 TeV. Exploiting the full Run 3 data sample will push the precision frontier forward and provide an even more stringent SM probe in the top-quark sector.
The weak force, unlike other fundamental forces, has a distinctive feature: its interactions slightly differ when involving quarks or antiquarks. This phenomenon, known as CP violation, allows for an asymmetry in the likelihood of a process occurring with matter compared to its antimatter counterpart, which is an essential requirement to explain the large dominance of matter in the universe. However, the size of CP violation predicted by the Standard Model (SM), and in accordance with experimental measurements so far, is not large enough to explain this cosmological imbalance. This is why physicists are actively searching for new sources of CP violation and striving to improve our understanding of the known ones. The phenomenology offered by the quantum-mechanical oscillations of neutral mesons into their antimatter counterparts, the antimesons, provides a particularly rich experimental ground for such studies.
The LHCb collaboration recently measured a set of parameters that determine the matter–antimatter oscillation of the neutral D0 meson into the D0 antimeson with unprecedented precision. This enables the search for the predicted, but hitherto unobserved, CP violation in this oscillation.
D0 mesons are composed of a charm quark and an up antiquark. Their oscillations are extremely slow, with an oscillation period over a thousand times longer than their lifetimes. As a result, only a very few D0 mesons transform before they decay. Oscillations are therefore identified as extremely small changes in the flavour mixture – matter or antimatter – as a function of the time at which the D0 or the D0 decays.
In LHCb’s analysis, the initial matter–antimatter flavour of the neutral meson is experimentally inferred from the charge of the accompanying pion in the CP-conserving decay chains D*(2010)+→ D0π+ and D*(2010)–→ D0π–. The mixing effect (or oscillation) then appears as a decay-time dependence of the ratio, R, of the number of “suppressed” and “favoured” decay processes of the neutral meson. The suppressed decays can occur with or without a net oscillation of the D0 meson, while the favoured decays are largely dominated by the direct process. In the absence of mixing, this ratio is predicted to be constant as a function of the D0 decay time while, in the case of mixing, it approximately follows a parabolic behaviour, increasing with time. Figure 1 shows the ratio R, including data for both matter (R+ for D0→ K+π–) and antimatter (R– for D0→ K–π+) processes, and corresponding model predictions. The variation depends not only on the oscillation parameters but also on the various observables of CP violation, which differentiate between matter and antimatter.
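The parabolic behaviour described above follows from the standard small-mixing expansion of the suppressed-to-favoured ratio in terms of the mixing parameters x′ and y′ (a textbook approximation quoted here for orientation, not a formula taken from the LHCb result):

```latex
R^{\pm}(t) \;\approx\; R_D^{\pm}
  \;+\; \sqrt{R_D^{\pm}}\, y'^{\pm}\, \frac{t}{\tau}
  \;+\; \frac{(x'^{\pm})^{2} + (y'^{\pm})^{2}}{4}\left(\frac{t}{\tau}\right)^{2}
```

Here R_D± is the ratio at zero decay time, τ is the D0 lifetime, and any difference between the + (matter) and – (antimatter) parameters would signal CP violation in the mixing.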
This analysis is the most precise measurement of these parameters to date, improving the uncertainty on both mixing and CP-violating observables by a factor of 1.6 compared to the previous best result, also by LHCb. This improvement is largely due to an unprecedentedly large sample of about 1.6 million suppressed decays and 421 million favoured decays collected during Run 2, making LHCb unique in probing up-type quark transitions. The results confirm the matter–antimatter oscillation of the D0 meson and show no evidence of CP violation in the oscillation.
These findings call for future analyses of this and other decays of the D0 meson using data from the third and fourth run of the LHC, exploiting the potential of the currently operating detector upgrade (Upgrade I). The detector upgrade proposed for the fifth and sixth runs of the LHC (Upgrade II) would provide a six-times-bigger sample, yielding the precision needed to definitively test the predictions of the SM.
The 15th edition of Electromagnetic Interactions with Nucleons and Nuclei (EINN) attracted 100 delegates to Paphos in Cyprus from 31 October to 4 November 2023. EINN covers theoretical and experimental developments in hadron physics, including the partonic structure of nucleons and hadron spectroscopy, the muon magnetic moment, dark-matter searches, the electroweak structure of light nuclei, new experimental facilities and physics searches, lattice QCD, the integration of machine-learning methodologies in QCD and the potential of quantum computing in QCD.
A highlight of the conference was the evening plenary poster session. Luis Alberto Rodriguez Chacon (The Cyprus Institute), Cornelis Mommers (Mainz University) and Sotiris Pitelis (Mainz) were recognised with the prestigious EPS poster prize, and presented their work on the calculation of the gluon momentum fraction in mesons through lattice QCD simulations, exotic atoms, and the X17 discovery potential from γD → e+e–pn with neutron tagging. This edition of EINN also hosted topical workshops on the QCD analysis of nucleon structure and experimental opportunities at the Electron-Ion Collider. Preceding the conference, a two-day meeting on careers in photonuclear physics was tailored to be a platform for PhD students and postdoctoral researchers to establish professional networks.
With QCD taking a central role in contemporary physics research worldwide, the EINN conference is poised to maintain its crucial role as an international forum for the field.
In the past two decades, it has become clear that three-quark baryons and quark–antiquark mesons cannot describe the full spectrum of hadrons. Dozens of exotic states have been observed in the charm sector alone. These states are interpreted either as compact objects with four or five valence quarks or as hadron molecules; however, their inner structures remain uncertain due to the complexity of calculations in quantum chromodynamics (QCD) and the lack of direct experimental measurements of the residual strong interaction between charm and light hadrons. A new femtoscopy measurement by the ALICE collaboration challenges theoretical expectations and the current understanding of QCD.
Femtoscopy is a well-established method for studying the strong interactions between hadrons. Experimentally, this is achieved by studying particle pairs with small relative momentum. In high-energy collisions of protons at the LHC, the distance between such hadrons at the time of production is about one femtometre, which is within the range of the strong nuclear force. From the momentum correlations of particle pairs, one extracts the scattering length, a0, which quantifies the final-state strong interaction between the two hadrons. By studying the momentum correlations of emitted particle pairs, it is possible to access the final-state interactions of even short-lived hadrons such as D mesons.
The scattering lengths are significantly smaller than the theoretical predictions
The ALICE collaboration has now, for the first time, measured the interaction of open-charm mesons (D+ and D*+) with charged pions and kaons for all charge combinations. The momentum correlation functions of each system were measured in proton–proton collisions at the LHC at a centre-of-mass energy of 13 TeV. As predicted by heavy-quark spin symmetry, the scattering lengths of Dπ and D*π agree with each other, but they are found to be significantly smaller than the theoretical predictions (figure 1). This implies that the interaction between these mesons can be fully explained by the Coulomb force, with the contribution from strong interactions being negligible within experimental precision. The small measured values of the scattering length challenge our understanding of the residual strong force of heavy-flavour hadrons in the non-perturbative limit of QCD.
These results also have an important impact on the study of the quark–gluon plasma (QGP) – a deconfined state of matter created in ultra-relativistic heavy-ion collisions. The rescattering of D mesons with the other hadrons (mostly pions and kaons) created in such collisions was thought to modify the D-meson spectra, in addition to the modification expected from the QGP formation. The present ALICE measurement demonstrates, however, that the effect of rescattering is expected to be very small.
More precise and systematic studies of charm–hadron interactions will be carried out with the upgraded ALICE detector in the upcoming years.
Steven Weinberg was a logical freight train – for many, the greatest theorist of the second half of the 20th century. It is timely to reflect on his legacy, the scientific component of which is laid out in a new collection of his publications selected by theoretical physicist Michael Duff (Imperial College).
Six chapters cover Weinberg’s most consequential contributions to effective field theory, the Standard Model, symmetries, gravity, cosmology and short-form popular science writing. I can’t identify any notable omissions and I doubt many others would, though some may raise an eyebrow at the exclusion of his paper deriving the Lee–Weinberg bound. Duff brings each chapter to life with first-hand anecdotes and details that will delight those of us most greatly separated from historical events. I am relatively young, and only had one meaningful interaction with Steven Weinberg. Though my contemporaries and I inhabit a scientific world whose core concepts are interwoven with, if not formed by, Steven Weinberg’s scientific legacy, unlike Michael Duff we are poorly qualified to comment historically on the ecosystem in which this legacy grew, or on aspects of his personality. This makes Duff’s commentary particularly valuable to younger readers.
I can envisage three distinct audiences for this new collection. The first is the lay theorist – those who are widely enough read to recognise the depth of Weinberg’s impact in theoretical physics and would like to know more. Such readers will find Duff’s introductions to be insightful and entertaining – helpful preparation for the more technical aspects of the papers, though expertise is required to fully grapple with many of them. There are also a few hand-picked non-technical articles one would otherwise not encounter without some serious investigative effort, including accessible articles on quantum field theory, effective field theory and life in the multiverse, in addition to the dedicated section on popular articles. These will delight any theory aficionado.
The second audience is practising theorists. If you’re going to invest in a printed collection of publications, then Weinberg is an obvious protagonist. Particle theorists consult his articles so often that they may as well have them close at hand. This collection contains those most often revisited and ought to be useful in this respect. Duff’s introductions also expose technical interconnections between the articles that might otherwise be missed.
The third audience I have in mind are beginning graduate students in particle theory, cosmology and beyond. It would not be a mistake to put this collection on recommended reading lists. In due course, most students should read many of these papers multiple times, so why not get on with it from the get-go? The section on effective field theories (EFTs) contains many valuable key ideas and perspectives. Plenty of those core concepts are still commonly encountered more by osmosis than with any rigour, and this can lead to confused notions around the general approach of EFT. Perhaps an incomplete introduction to EFT could be avoided for graduate students by cutting straight to the fundamentals contained here? The cosmology section also reveals many important modern concepts alongside lucid and fearless wrestling with big questions. The papers on gravity detail techniques that are frequently encountered in any first foray into modern amplitudology, as well as strategies to infer general lessons in quantum field theory from symmetries and self-consistency alone.
In my view, however, the most important section for beginning graduate students is that on the construction of the Standard Model (SM). It may be said that a collective amnesia has emerged regarding the scientific spirit that drove its development. The SM was built by model builders. I don’t say this facetiously. They made educated guesses about the structure of the “ultraviolet” (microscopic) world based on the “infrared” (long-distance) breadcrumbs embedded within low-energy experimental observations. Decades after this swashbuckling era came to an end, there is a growing tendency to view the SM as something rigid, providentially bestowed and permanent. The academic bravery and risk-taking that was required to take the necessary leaps forward then, and which may be required now, is no better demonstrated than in “A Model of Leptons”. All young theorists should read this paper multiple times. It exemplifies that Steven Weinberg was not only an unstoppable force of logic but also a plucky risk-taker. It’s inspirational that its final paragraph, which laid out the structure of nature at the electroweak scale, ends with doubt and speculation: “And if this model is renormalisable, then what happens when we extend it to include the couplings of A and B to the hadrons?” By working their way through this collection, graduate students may be inspired to similar levels of ambition and jeopardy.
Amongst the greatest scientists of the last century
In the weeks that followed the passing of Steven Weinberg, I sensed amongst a number of colleagues of all generations some moods that I could have anticipated: the loss not only of a bona fide truth-seeker, but also of a leader – frequently the leader. I also perceived a feeling that transcended the scientific realm alone, of someone whose creative genius ought to be recognised amongst the greatest scientists, musicians and artists of the last century. How can we productively reflect on that? I imagine we would all do well not only to learn of Weinberg’s important individual scientific insights, but also to attempt to absorb his overall methodology in identifying interesting questions, in breaking new trails in fundamental physics, and in pursuing logic and clarity wherever they may take you. This collection is not a bad place to start.
Since their first direct detection in 2015, gravitational waves (GWs) have become pivotal in our quest to understand the universe. The ultra-high-frequency (UHF) band offers a window to discover new physics beyond the Standard Model (CERN Courier March/April 2022 p22). Unleashing this potential requires theoretical work to investigate possible GW sources and experiments with far greater sensitivities than those achieved today.
A workshop at CERN from 4 to 8 December 2023 leveraged impressive experimental progress in a range of fields. Attended by nearly 100 international scientists – a noteworthy increase from the 40 experts who attended the first workshop at ICTP Trieste in 2019 – the workshop showcased the field’s expanded research interest and collaborative efforts. Concretely, about 10 novel detector concepts have been developed since the first workshop.
One can look for GWs in a few different ways: observing changes in the space between detector components, exciting vibrations in detectors, and converting GWs into electromagnetic radiation in strong magnetic fields. Substantial progress has been made in all three experimental directions.
Levitating concepts
The leading concepts for the first approach involve optically levitated sensors such as high-aspect-ratio sodium–yttrium–fluoride prisms, and semi-levitated sensors such as thin silicon or silicon–nitride nanomembranes in long optical resonators. These technologies are currently under study by various groups in the Levitated Sensor Detectors collaboration and at DESY.
For the second approach, the main focus is on millimetre-scale quartz cavities similar to those used in precision clocks. A network of such detectors, known as GOLDEN, is being planned, involving collaborations among UC Davis, University College London and Northwestern University. Superconducting radio-frequency cavities also present a promising technology. A joint effort between Fermilab and DESY is leveraging the existing MAGO prototype to gain insights and design further optimised cavities.
Regarding the third approach, a prominent example is optical high-precision interferometry, combined with a series of accelerator dipole magnets similar to those used in the light-shining-through-a-wall axion-search experiment, ALPS II (Any Light Particle Search II) or the axion helioscope CAST and its planned successor IAXO. In fact, ALPS II is anticipated to commence a dedicated GW search in 2028. Additionally, other notable concepts inspired by axion dark-matter searches involve toroidal magnets, exemplified by experiments like ABRACADABRA, or solenoidal magnets such as BASE or MADMAX.
All three approaches stand to benefit from burgeoning advances in quantum sensing, which promise to enhance sensitivities by orders of magnitude. In this landscape, axion dark-matter searches and UHF GW detection are poised to work in close collaboration, leveraging quantum sensing to achieve unprecedented results. Concepts that demonstrate synergies with axion-physics searches are crucial at this stage, and can be facilitated by incremental investments. Such collaboration builds awareness within the scientific community and presents UHF searches as an additional, compelling science case for their construction.
Cross-disciplinary research is also crucial to understand cosmological sources and constraints on UHF GWs. For the former, our understanding of primordial black holes has significantly matured, transitioning from preliminary estimates to a robust framework. Additional sources, such as parabolic encounters and exotic compact objects, are also gaining clarity. For the latter, the workshop highlighted how strong magnetic fields in the universe, such as those in extragalactic voids and planetary magnetospheres, can help set limits on the conversion between electromagnetic and gravitational waves.
Despite much progress, the sensitivity needed to detect UHF GWs remains a visionary goal, requiring the constant pursuit of inventive new ideas. To aid this, the community is taking steps to be more inclusive. The living review produced after the first workshop (arXiv:2011.12414) will be revised to be more accessible for people outside our community, breaking down detector concepts into fundamental building blocks for easier understanding. Plans are also underway to establish a comprehensive research repository and standardise data formats. These initiatives are crucial for fostering a culture of open innovation and expanding the potential for future breakthroughs in UHF GW research. Finally, a new, fully customisable and flexible GW plotter including the UHF frequency range is being developed to benefit the entire GW community.
The journey towards detecting UHF GWs is just beginning. While current sensitivities are not yet sufficient, the community’s commitment to developing innovative ideas is unwavering. With the collective efforts of a dedicated scientific community, the next leap in gravitational-wave research is on the horizon. Limits exist to be surpassed!
After all these years, neutrinos remain extraordinary – and somewhat deceptive. The experimental success of the three-massive-neutrino paradigm over the past 25 years makes it easy to forget that massive neutrinos are not part of the Standard Model (SM) of particle physics.
The problem lies with how neutrinos acquire mass. Nonzero neutrino masses are not possible without the existence of new fundamental fields, beyond those that are part of the SM. And we know virtually nothing about the particles associated with them. They could be bosons or fermions, light or heavy, charged or neutral, and experimentally accessible or hopelessly out of reach.
This is the neutrino mass puzzle. At its heart is the particle’s uniquely elusive nature, which is both the source of the problem and the main challenge in resolving it.
Mysterious and elusive
Despite outnumbering other known massive particles in the universe by 10 orders of magnitude, neutrinos are the least understood of the matter particles. Unlike electrons, they do not participate in electromagnetic interactions. Unlike quarks, they do not participate in the strong interactions that bind protons and neutrons together. Neutrinos participate only in aptly named weak interactions. Out of the trillions of neutrinos that the Sun beams through you each second, only a handful will interact with your body during your lifetime.
Neutrino physics has therefore had a rather tortuous and slow history. The existence of neutrinos was postulated in 1930 but only confirmed in the 1950s. The hypothesis that there are different types of neutrinos was first raised in the 1940s but only confirmed in the 1960s. And the third neutrino type, postulated when the tau lepton was discovered in the 1970s, was only directly observed in the year 2000. Nonetheless, over the years neutrino experiments have played a decisive role in the development of the most successful theory in modern physics: the SM. And at the turn of the 21st century, neutrino experiments revealed that there is something missing in its description of particle physics.
Neutrinos are fermions with spin one-half that interact with the charged leptons (the electron, muon and tau lepton) and the particles that mediate the weak interactions (the W and Z bosons). There are three neutrino types, or flavours: electron-type (νe), muon-type (νμ) and tau-type (ντ), and each interacts exclusively with its namesake charged lepton. One of the predictions of the SM is that neutrino masses are exactly zero, but a little over 25 years ago, neutrino experiments revealed that this is not exactly true. Neutrinos have tiny but undeniably nonzero masses.
Mixing it up
The search for neutrino masses is almost as old as Pauli’s 93-year-old postulate that neutrinos exist. They were ultimately discovered around the turn of the millennium through the observation of neutrino flavour oscillations. It turns out that we can produce one of the neutrino flavours (for example νμ) and later detect it as a different flavour (for example νe) so long as we are willing to wait for the neutrino flavour to change. The probability associated with this phenomenon oscillates in spacetime with a characteristic distance that is inversely proportional to the differences of the squares of the neutrino masses. Given the tininess of neutrino masses and mass splittings, these distances are frequently measured in hundreds of kilometres in particle-physics experiments.
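In the simplest two-flavour vacuum approximation (a standard textbook expression, not quoted in the article), the flavour-change probability and its inverse dependence on the mass-squared difference can be written as

```latex
P(\nu_\mu \to \nu_e) \simeq \sin^2 2\theta \,
  \sin^2\!\left(\frac{\Delta m^2 L}{4E}\right)
\simeq \sin^2 2\theta \,
  \sin^2\!\left(1.27\,\frac{\Delta m^2[\mathrm{eV}^2]\, L[\mathrm{km}]}{E[\mathrm{GeV}]}\right),
```

where θ is a mixing angle, L the distance travelled and E the neutrino energy. The characteristic oscillation length, L_osc = 4πE/Δm², makes explicit why tiny mass splittings translate into baselines of hundreds of kilometres for accelerator-energy neutrinos.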
Neutrino oscillations also require the leptons to mix. This means that the neutrino flavour states are not particles with a well defined mass but are quantum superpositions of different neutrino states with well defined masses. The three mass eigenstates are related to the three flavour eigenstates via a three-dimensional mixing matrix, which is usually parameterised in terms of mixing angles and complex phases.
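The mixing matrix referred to above is conventionally written in its standard (PDG-style) parameterisation; the explicit form below is a common convention rather than something taken from the article:

```latex
\begin{pmatrix} \nu_e \\ \nu_\mu \\ \nu_\tau \end{pmatrix}
= U \begin{pmatrix} \nu_1 \\ \nu_2 \\ \nu_3 \end{pmatrix},
\quad
U =
\begin{pmatrix} 1 & 0 & 0 \\ 0 & c_{23} & s_{23} \\ 0 & -s_{23} & c_{23} \end{pmatrix}
\begin{pmatrix} c_{13} & 0 & s_{13}e^{-i\delta} \\ 0 & 1 & 0 \\ -s_{13}e^{i\delta} & 0 & c_{13} \end{pmatrix}
\begin{pmatrix} c_{12} & s_{12} & 0 \\ -s_{12} & c_{12} & 0 \\ 0 & 0 & 1 \end{pmatrix},
```

where c_ij = cos θ_ij, s_ij = sin θ_ij, and δ is a CP-violating phase (if neutrinos are Majorana fermions, two further phases appear, but they do not affect oscillations).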
In the last few decades, precision measurements of neutrinos produced in the Sun, in the atmosphere, in nuclear reactors and in particle accelerators in different parts of the world have measured the mixing parameters at the several-percent level. Assuming the mixing matrix is unitary, all but one have been shown to be nonzero. The measurements have revealed that the three neutrino mass eigenvalues are separated by two different mass-squared differences: a small one of order 10⁻⁴ eV² and a large one of order 10⁻³ eV². Data therefore reveal that at least two of the neutrino masses are different from zero. At least one of the neutrino masses is above 0.05 eV, and the second lightest is at least 0.008 eV. While neutrino oscillation experiments cannot measure the neutrino masses directly, precise measurements of beta-decay spectra and constraints from the large-scale structure of the universe offer complementary upper limits. The nonzero neutrino masses are constrained to be less than roughly 0.1 eV.
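The quoted lower bounds follow directly from the splittings: even a massless lightest neutrino forces the other two masses up to the square roots of the mass-squared differences. A minimal numerical sketch, using representative global-fit splittings (the exact values are assumptions; the article quotes only orders of magnitude):

```python
import math

# Representative mass-squared splittings (assumed global-fit values)
dm2_solar = 7.4e-5        # eV^2, the smaller ("solar") splitting
dm2_atmospheric = 2.5e-3  # eV^2, the larger ("atmospheric") splitting

# Even if the lightest neutrino were exactly massless, the splittings
# impose lower bounds on the other two mass eigenvalues:
m_heaviest_min = math.sqrt(dm2_atmospheric)  # ~0.050 eV
m_middle_min = math.sqrt(dm2_solar)          # ~0.0086 eV

print(f"heaviest mass    >= {m_heaviest_min:.3f} eV")
print(f"second lightest  >= {m_middle_min:.4f} eV")
```

These reproduce the 0.05 eV and 0.008 eV bounds quoted in the text.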
These masses are tiny when compared to the masses of all the other particles (see “Chasm” figure). The mass of the lightest charged fermion, the electron, is of order 10⁶ eV. The mass of the heaviest fermion, the top quark, is of order 10¹¹ eV, as are the masses of the W, Z and Higgs bosons. These particle masses are all at least seven orders of magnitude heavier than those of the neutrinos. No one knows why neutrino masses are dramatically smaller than those of all other massive particles.
The Standard Model and mass
To understand why the SM predicts neutrino masses to be zero, it is necessary to appreciate that particle masses are complicated in this theory. The reason is as follows. The SM is a quantum field theory. Interactions between the fields are strictly governed by their properties: spin, various “local” charges, which are conserved in interactions, and – for fermions like the neutrinos, charged leptons and quarks – another quantum number called chirality.
In quantum field theories, mass is the interaction between a right-chiral and a different left-chiral field. A naive picture is that the mass-interaction constantly converts left-chiral states into right-chiral ones (and vice versa) and the end result is a particle with a nonzero mass. It turns out, however, that for all known fermions, the left-chiral and right-chiral fermions have different charges. The immediate consequence of this is that you can’t turn one into the other without violating the conservation of some charge so none of the fermions are allowed to have mass: the SM naively predicts that all fermion masses are zero!
The Higgs field was invented to fix this shortcoming. It is charged in such a way that certain right-chiral and left-chiral fermions are allowed to interact with one another and with the Higgs field, which, uniquely among all known fields, is thought to have been turned on everywhere since the phase transition that triggered electroweak symmetry breaking very early in the history of the universe. In other words, so long as the vacuum configuration of the Higgs field is not trivial, fermions acquire a mass thanks to these interactions.
This is not only a great idea; it is also at least mostly correct, as spectacularly confirmed by the discovery of the Higgs boson a little over a decade ago. It has many verifiable consequences. One is that the strength with which the Higgs boson couples to different particles is proportional to the particle’s mass – the Higgs couples far more strongly to the top quark and the W and Z bosons than to the electron or the light quarks. Another consequence is that all masses are proportional to the value of the Higgs field in the vacuum (10¹¹ eV) and, in the SM, we naively expect all particle masses to be similar.
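The mechanism described above can be summarised in one line (a standard textbook form, not drawn from the article): a Yukawa interaction of strength y_f couples the left- and right-chiral pieces of a fermion f through the Higgs field H, and becomes a mass once the Higgs acquires its vacuum value v:

```latex
\mathcal{L} \supset -\,y_f\,\bar{f}_L H f_R + \text{h.c.}
\quad\Longrightarrow\quad
m_f = \frac{y_f\, v}{\sqrt{2}},
```

from which both stated consequences follow at once: the Higgs-boson coupling to f is proportional to m_f, and every mass scales with the single vacuum value v.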
Neutrino masses are predicted to be zero because, in the SM, there are no right-chiral neutrino fields and hence none for the left-chiral neutrinos – the ones we know about – to “pair up” with. Neutrino masses therefore require the existence of new fields, and hence new particles, beyond those in the SM.
Wanted: new fields
The list of candidate new fields is long and diverse. For example, the new fields that allow for nonzero neutrino masses could be fermions or bosons; they could be neutral or charged under SM interactions, and they could be related to a new mass scale other than the vacuum value of the SM Higgs field (10¹¹ eV), which could be either much smaller or much larger. Finally, while these new fields might be “easy” to discover with the current and near-future generation of experiments, they might equally turn out to be impossible to probe directly in any particle-physics experiment in the foreseeable future.
Though there are too many possibilities to list, they can be classified into three very broad categories: neutrinos acquire mass by interacting with the same Higgs field that gives mass to the charged fermions; by interacting with a similar Higgs field with different properties; or through a different mechanism entirely.
At first glance, the simplest idea is to postulate the existence of right-chiral neutrino fields and further assume they interact with the Higgs field and the left-chiral neutrinos, just like right-chiral and left-chiral charged leptons and quarks. There is, however, something special about right-chiral neutrino fields: they are completely neutral relative to all local SM charges. Returning to the rules of quantum field theory, completely neutral chiral fermions are allowed to interact “amongst themselves” independent of whether there are other right-chiral or left-chiral fields around. This means the right-chiral neutrino fields should come along with a different mass that is independent of the vacuum value of the Higgs field of 10¹¹ eV.
To prevent this from happening, the right-chiral neutrinos must possess some kind of conserved charge that is shared with the left-chiral neutrinos. If this scenario is realised, there is some new, unknown fundamental conserved charge out there. This hypothetical new charge is called lepton number: electrons, muons, tau leptons and neutrinos are assigned charge plus one, while positrons, antimuons, tau antileptons and antineutrinos have charge minus one. A prediction of this scenario is that the neutrino and the antineutrino are different particles since they have different lepton numbers. In more technical terms, the neutrinos are massive Dirac fermions, like the charged leptons and the quarks. In this scenario, there are new particles associated with the right-chiral neutrino field, and a new conservation law in nature.
Accidental conservation
As of today, there is no experimental evidence that lepton number is not conserved, and readers may question whether this really is a new conservation law. In the SM, however, the conservation of lepton number is merely “accidental” – once all other symmetries and constraints are taken into account, the theory happens to possess this symmetry. But lepton number conservation is no longer an accidental symmetry when right-chiral neutrinos are added, and these chargeless and apparently undetectable particles should have completely different properties if it is not imposed.
If lepton number conservation is imposed as a new symmetry of nature, making neutrinos pure Dirac fermions, there appears to be no observable consequence other than nonzero neutrino masses. Given the tiny neutrino masses, the strength of the interaction between the Higgs boson and the neutrinos is predicted to be at least seven orders of magnitude smaller than all other Higgs couplings to fermions. Various ideas have been proposed to explain this remarkable chasm between the strength of the neutrino’s interaction with the Higgs field relative to that of all other fermions. They involve a plurality of theoretical concepts including extra dimensions of space, mirror copies of our universe and dark sectors.
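The "seven orders of magnitude" can be checked with back-of-the-envelope numbers. A minimal sketch, comparing the neutrino Yukawa coupling in the Dirac scenario with the electron's (the neutrino mass below is an assumption near the quoted lower bound on the heaviest mass):

```python
# Each Yukawa coupling is roughly the fermion mass divided by the
# Higgs vacuum value relevant for fermion masses (~1.74e11 eV)
v = 1.74e11          # eV, Higgs vacuum value
m_electron = 5.11e5  # eV, electron mass
m_neutrino = 0.05    # eV, assumed heaviest-neutrino scale

y_electron = m_electron / v  # ~3e-6
y_neutrino = m_neutrino / v  # ~3e-13

ratio = y_electron / y_neutrino
print(f"y_e / y_nu ~ {ratio:.1e}")  # about 1e7: seven orders of magnitude
```

Even relative to the electron, the lightest charged fermion, the neutrino's coupling to the Higgs sits some seven orders of magnitude lower.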
A second possibility is that there are more Higgs fields in nature and that the neutrinos acquire a mass by interacting with a Higgs field that is different from the one that gives a mass to the charged fermions. Since the neutrino mass is proportional to the vacuum value of a different Higgs field, the fact that the neutrino masses are so small is easy to tolerate: they are simply proportional to a different mass scale that could be much smaller than 10¹¹ eV. Here, there are no right-chiral neutrino fields and the neutrino masses are interactions of the left-chiral neutrino fields amongst themselves. This is possible because, while the neutrinos possess weak-force charge they have no electric charge. In the presence of the nontrivial vacuum of the Higgs fields, the weak-force charge is effectively not conserved and these interactions may be allowed. The fact that the Higgs particle discovered at the LHC – associated with the SM Higgs field – does not allow for this possibility is a consequence of its charges. Different Higgs fields can have different weak-force charges and end up doing different things. In this scenario, the neutrino and the antineutrino are, in fact, the same particle. In more technical terms: the neutrinos are massive Majorana fermions.
One way to think about this is as follows: the mass interaction transforms left-chiral objects into right-chiral objects. For electrons, for example, the mass converts left-chiral electrons into right-chiral electrons. It turns out that the antiparticle of a left-chiral object is right-chiral and vice versa, and it is tempting to ask whether a mass interaction could convert a left-chiral electron into a right-chiral positron. The answer is no: electrons and positrons are different objects and converting one into the other would violate the conservation of electric charge. But this is no barrier for the neutrino, and we can contemplate the possibility of converting a left-chiral neutrino into its right-chiral antiparticle without violating any known law of physics. If this hypothesis is correct, the hypothetical lepton-number charge, discussed earlier, cannot be conserved. This hypothesis is experimentally neither confirmed nor contradicted but could soon be confirmed with the observation of neutrinoless double-beta decays – nuclear decays which can only occur if lepton-number symmetry is violated. There is an ongoing worldwide campaign to search for the neutrinoless double-beta decay of various nuclei.
A new source of mass
In the third category, there is a source of mass different from the vacuum value of the Higgs field, and the neutrino masses are an amalgam of the vacuum value of the Higgs field and this new source of mass. A very low new mass scale might be discovered in oscillation experiments, while consequences of heavier ones may be detected in other types of particle-physics experiments, including measurements of beta and meson decays, charged-lepton properties, or the hunt for new particles at high-energy colliders. Searches for neutrinoless double-beta decay can reveal different sources for lepton-number violation, while ultraheavy particles can leave indelible footprints in the structure of the universe through cosmic collisions. The new physics responsible for nonzero neutrino masses might also be related to grand-unified theories or the origin of the matter–antimatter asymmetry of the universe, through a process referred to as leptogenesis. The range of possibilities spans 22 orders of magnitude (see “eV to ZeV” figure).
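One widely studied realisation of this third category is the seesaw pattern, in which the observed neutrino mass is suppressed by a heavy new scale M: roughly m_ν ~ (yv)²/M. A minimal order-of-magnitude sketch under that assumption (the seesaw form, the order-one Yukawa and the 0.05 eV mass are all illustrative inputs, not figures from the article):

```python
# Seesaw estimate: m_nu ~ (y*v)^2 / M  =>  M ~ (y*v)^2 / m_nu
v = 1.74e11   # eV, Higgs vacuum value
y = 1.0       # order-one Yukawa coupling (assumption)
m_nu = 0.05   # eV, assumed heaviest-neutrino mass

M = (y * v) ** 2 / m_nu       # eV; ~6e23 eV, i.e. ~6e14 GeV
print(f"M   ~ {M:.1e} eV")
print(f"M/v ~ {M / v:.1e}")   # ~3e12 times the Higgs vacuum value
```

With these inputs the new scale lands near 10¹² times the Higgs vacuum value – the upper end of the range discussed below – though smaller Yukawas pull it down toward experimentally accessible energies.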
Challenging scenarios
Since the origin of the neutrino masses here is qualitatively different from that of all other particles, the values of the neutrino masses are expected to be qualitatively different. Experimentally, we know that neutrino masses are much smaller than all charged-fermion masses, so many physicists believe that the tiny neutrino masses are strong indirect evidence for a source of mass beyond the vacuum value of the Higgs field. In most of these scenarios, the neutrinos are also massive Majorana fermions. The challenge here is that if a new mass scale exists in fundamental physics, we know close to nothing about it. It could be within direct reach of particle-physics experiments, or it could be astronomically high, perhaps as large as 10¹² times the vacuum value of the SM’s Higgs field.
How do we hope to learn more? We need more experimental input. There are many outstanding questions that can only be answered with oscillation experiments. These could provide evidence for new neutrino-like particles or new neutrino interactions and properties. Meanwhile, searching for neutrinoless double-beta decay is the most promising avenue to experimentally reveal whether neutrinos are Majorana or Dirac fermions. Other activities include high-energy collider searches for new Higgs bosons that like to talk to neutrinos and new heavy neutrino-like particles that could be related to the mechanism of neutrino mass generation. Charged-lepton probes, including measurements of the anomalous magnetic moment of muons and searches for lepton-flavour violation, may provide invaluable clues, while surveys of the cosmic microwave background and the distribution of galaxies could also reveal footprints of the neutrino masses in the structure of the universe.
We still know very little about the new physics uncovered by neutrino oscillations. Only a diverse experimental programme will reveal the nature of the new physics behind the neutrino mass puzzle.