ATLAS and CMS find first evidence for H → Zγ

The discovery of the Higgs boson in 2012 unleashed a detailed programme of measurements by ATLAS and CMS, which have confirmed that its couplings are consistent with those predicted by the Standard Model (SM). However, several Higgs-boson decay channels have such small predicted branching fractions that they have not yet been observed. Because they involve higher-order loops, these channels also provide indirect probes of possible physics beyond the SM. ATLAS and CMS have now teamed up to report the first evidence of the decay H → Zγ, presenting the combined result at the Large Hadron Collider Physics conference in Belgrade in May. 

The SM predicts that approximately 0.15% of Higgs bosons produced at the LHC will decay in this way, but some theories beyond the SM predict a different decay rate. Examples include models where the Higgs boson is a neutral scalar of different origin, or a composite state. Different branching fractions are also expected for models with additional colourless charged scalars, leptons or vector bosons that couple to the Higgs boson, due to their contributions via loop corrections. 

“Each particle has a special relationship with the Higgs boson, making the search for rare Higgs decays a high priority,” says ATLAS physics coordinator Pamela Ferrari. “Through a meticulous combination of the individual results of ATLAS and CMS, we have made a step forward towards unravelling yet another riddle of the Higgs boson.”

We have made a step forward towards unravelling yet another riddle of the Higgs boson

Previously, ATLAS and CMS independently conducted extensive searches for H → Zγ. Both used the decay of a Z boson into pairs of electrons or muons, which occurs in about 6.6% of cases, to identify H → Zγ events. In these searches, the collision events associated with this decay would appear as a narrow peak in the Zγ invariant-mass distribution over a smooth background. 

In the new study, ATLAS and CMS combined the data collected during the second run of the LHC in 2015–2018 to significantly increase the statistical precision and reach of their searches. This collaborative effort yielded the first evidence of the Higgs boson decaying into a Z boson and a photon, with a statistical significance of 3.4σ. The measured signal rate relative to the SM prediction is 2.2 ± 0.7, in agreement with the theoretical expectation from the SM.
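The quoted numbers can be tied together with some back-of-the-envelope arithmetic. The sketch below is my own illustration, using only the figures in the text and assuming a symmetric Gaussian uncertainty on the signal strength: it computes the effective branching fraction for the fully reconstructed decay chain and the size of the deviation of the measured rate from the SM value of 1.

```python
# Back-of-the-envelope arithmetic using only the figures quoted in the text.
br_h_to_zgamma = 0.0015   # SM prediction: ~0.15% of Higgs bosons decay to Zγ
br_z_to_ll = 0.066        # Z → ee or μμ in about 6.6% of cases

# Effective branching fraction for the fully reconstructed chain H → Zγ → ℓℓγ
br_effective = br_h_to_zgamma * br_z_to_ll
print(f"effective branching fraction ≈ {br_effective:.1e}")  # ≈ 1e-4

# Deviation of the measured signal strength (2.2 ± 0.7) from the SM (μ = 1),
# treating the quoted uncertainty as symmetric and Gaussian (an assumption)
mu, sigma_mu = 2.2, 0.7
pull = (mu - 1.0) / sigma_mu
print(f"deviation from SM ≈ {pull:.1f}σ")  # ≈ 1.7σ, i.e. compatible with the SM
```

This makes clear why Run 2 statistics were needed: only about one in ten thousand Higgs bosons produces the fully reconstructed final state.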

“The existence of new particles could have very significant effects on rare Higgs decay modes,” says CMS physics coordinator Florencia Canelli. “This study is a powerful test of the Standard Model. With the ongoing third run of the LHC and the future High-Luminosity LHC, we will be able to improve the precision of this test and probe ever rarer Higgs decays.”

LHCb sets record precision on CP violation

Comparison of sin2β measurements

At a CERN seminar on 13 June, the LHCb collaboration presented the world’s most precise measurements of two key parameters relating to CP violation. Based on the full LHCb dataset collected during LHC Runs 1 and 2, the first concerns the observable sin2β while the second concerns the CP-violating phase φs – both of which are highly sensitive to potential new-physics contributions. 

CP violation was first observed in 1964 in kaon mixing, and confirmed among B mesons in 2001 by the e+e− B-factory experiments BaBar and Belle. These experiments enabled the first measurements of sin2β and provided a vital confirmation of the Standard Model (SM). In the SM, CP violation arises from a complex phase in the Cabibbo–Kobayashi–Maskawa (CKM) mixing matrix, whose unitarity defines a triangle in the complex plane: one side is defined to have unit length, while the other two sides and the three angles must be inferred from measurements of certain hadron decays. If the measurements were not to provide a consistent description of the triangle, it would hint that something is amiss in the SM. 
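Concretely, the triangle referred to here follows from the unitarity of the CKM matrix; written out (a standard textbook relation, not something quoted in the article):

```latex
V_{ud}V_{ub}^{*} + V_{cd}V_{cb}^{*} + V_{td}V_{tb}^{*} = 0,
\qquad
\beta \equiv \arg\!\left(-\,\frac{V_{cd}V_{cb}^{*}}{V_{td}V_{tb}^{*}}\right).
```

Dividing the relation through by $V_{cd}V_{cb}^{*}$ fixes one side of the triangle to unit length, which is the convention described above.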

The measurement of sin2β, which determines the angle β of the unitarity triangle, is more difficult at a hadron collider than at an e+e− collider. However, the large data samples available at the LHC and the optimised design of the LHCb experiment have enabled a measurement twice as precise as the previous best result, from Belle. The LHCb team used decays of B0 mesons to J/ψ K0S, which can proceed either directly or after the B0 first oscillates into its antimatter partner. The interference between the amplitudes for the two decay paths results in a time-dependent asymmetry between the decay-time distributions of the B0 and the B̄0. The amplitude of this oscillation, and thus the magnitude of the CP violation present, yields a measurement of sin2β, for which LHCb finds a value of 0.716 ± 0.013 ± 0.008, in agreement with predictions.
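In the usual formalism, the measured quantity is the coefficient of the mixing-induced oscillation. Neglecting the small direct-CP-violation term (a simplification on my part), the time-dependent asymmetry takes the textbook form:

```latex
A_{CP}(t)
= \frac{\Gamma_{\bar B^{0}\to J/\psi K^{0}_{S}}(t) - \Gamma_{B^{0}\to J/\psi K^{0}_{S}}(t)}
       {\Gamma_{\bar B^{0}\to J/\psi K^{0}_{S}}(t) + \Gamma_{B^{0}\to J/\psi K^{0}_{S}}(t)}
\;\simeq\; \sin 2\beta \,\sin(\Delta m_{d}\, t),
```

where $\Delta m_{d}$ is the B0–B̄0 oscillation frequency and $t$ the decay time, so the amplitude of the measured oscillation is directly sin2β.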

Based on an analysis of B0s → J/ψ K+K− decays, LHCb also presented the world’s best measurement of the CP-violating phase φs, which plays a similar role in B0s-meson decays as sin2β does in B0 decays. As for B0 mesons, a B0s may decay directly or oscillate into a B̄0s and then decay. CP violation causes these decays to proceed at slightly different rates, manifesting itself as a non-zero value of φs arising from the interference between mixing and decay. The predicted value of φs is about –0.037 rad, but new-physics effects, even if small, could change its value significantly.

A detailed study of the angular distribution of B0S decay products using the Run 1 and 2 data samples enabled LHCb to measure this decay-time-dependent CP asymmetry φs = -0.039 ± 0.022 ± 0.006 rad. Representing the most precise single measurement to date, it is consistent with previous measurements and with the SM expectation. The precision measurement of φs is one of LHCb’s most important goals, said co-presenter Vukan Jevtic (TU Dortmund): “Together with sin2β, the new LHCb result marks an important advance in the quest to understand the nature and origin of CP violation.” 

With both results currently limited by statistics, the collaboration is looking forward to data from the current and future LHC runs. “In Run 3 LHCb will collect a larger data sample taking advantage of the new upgraded LHCb detector,” concluded co-presenter Peilian Li (CERN). “This will allow even higher precision and therefore the possibility to detect, through these key quantities, the manifestation of new-physics effects.”

CERN’s neutrino odyssey

The first candidate leptonic neutral-current event

The neutrino had barely been known for two years when CERN’s illustrious neutrino programme got under way. As early as 1958, the 600 MeV Synchrocyclotron enabled the first observation of the decay of a charged pion into an electron and a neutrino – a key piece in the puzzle of weak interactions. Dedicated neutrino-beam experiments began a couple of years later, when the Proton Synchrotron (PS) entered operation, rivalled by activities at Brookhaven’s higher-energy Alternating Gradient Synchrotron in the US. Producing the neutrino beam was relatively straightforward: make a proton beam from the PS hit an internal target to produce pions and kaons, let them fly some distance during which they can decay to neutrinos, then use iron shielding to filter out the remaining hadrons, such that only neutrinos and muons remain. Ensuring that a new generation of particle detectors would enable the study of neutrino-beam interactions proved a tougher challenge. 

CERN began with two small, 1 m-long heavy-liquid bubble chambers using proton beams that struck an internal target inside the PS, hoping to see at least one neutrino event per day. The rate was nowhere near that: the target configuration had made the beams about 10 times less intense than expected, and in 1961 CERN’s nascent neutrino programme came to a halt. “It was a big disappointment,” recalls Don Cundy, who was a young scientist at CERN at the time. “Then, several months later, Brookhaven did the same experiment but this time they put the target in the right place, and they discovered that there were two neutrinos – the muon neutrino (νµ) and the electron neutrino (νe) – a great discovery for which Lederman, Schwartz and Steinberger received the Nobel prize some 25 years later.” 

Despite this setback, CERN Director-General Victor Weisskopf, along with his Director of Research Gilberto Bernardini and the CERN team, decided to embark on an even more ambitious setup. Employing Simon van der Meer’s recently proposed “magnetic horn” – a high-current, pulsed focusing device placed around the target – and placing the target in an external beam pipe increased the neutrino flux by about two orders of magnitude. In 1963 this opened a new series of neutrino experiments at CERN. They began with a heavy-liquid bubble chamber containing around 500 kg of freon and a spark-chamber detector weighing several tonnes, for which first results were presented at a conference in Siena that year. The bubble-chamber results were particularly impressive, recalls Cundy: “Even though the number of events was of the order of a few hundred, you could do a lot of physics: measure the elastic form factor of the nucleon, single pion production, the total cross section, search for intermediate weak bosons and give limits on neutral-current processes.” It was at that conference that André Lagarrigue of Orsay urged that bubble chambers were the way forward for neutrino physics, and proposed building the biggest chamber possible: Gargamelle, named after a giantess in Rabelais’s Renaissance tales.

Magnetic horn

Construction in France of the much larger Gargamelle chamber, 4.8 m long and containing 18 tonnes of freon, was quick, and by the end of 1970 the detector was receiving a beam of muon neutrinos from the PS. The Gargamelle collaboration consisted of researchers from seven European institutes: Aachen, Brussels, CERN, École Polytechnique Paris, Milan, LAL Orsay and University College London. In 1969 the collaboration had made a list of physics priorities. Following the results of CERN’s Heavy Liquid Bubble Chamber, which set new limits on neutrino-electron scattering and single-pion neutral-current (NC) processes, the search for actual NC events made it onto the list. However, it only placed eighth out of 10 science goals. That is quite understandable, comments Cundy: “People thought that the most sensitive way to look for NCs was the decay of a K0 meson into two muons or two electrons but that had a very low branching ratio, so if NCs existed it would be at a very small level. The first thing on the list for Gargamelle was in fact looking at the structure of the nucleon, to measure the total cross section and to investigate the quark model.” 

Setting priorities

After the discovery of the neutrino in 1956 by Reines and Cowan (CERN Courier July/August 2016 p17), the weak interaction became a focus of nuclear research. The unification of the electromagnetic and weak interactions by Salam, Glashow and Weinberg a decade later motivated experiments to look for the electroweak carriers: the W boson, which mediates charged-current interactions, and the Z boson, associated with neutral currents. While charged-current interactions were familiar from β decay, neutral currents had barely been considered. They started to become interesting in 1971, after Martinus Veltman and Gerard ’t Hooft proved the renormalisability of the electroweak theory. 

More than 60 years after first putting the neutrino to work, CERN’s neutrino programme continues to evolve

By that time, Gargamelle was running at full speed. Analysing the photographs taken every time the PS was pulsed, to look for interesting tracks, were CERN personnel (at the time often referred to as “scanning girls”) who essentially performed the role of a modern level-1 trigger. Interactions were divided into different classes depending on the number of particles involved (muons, hadrons, electron–positron pairs, even one or more isolated protons, as well as isolated electrons and positrons). The leptonic NC process (νµ + e− → νµ + e−) would give an event consisting of a single energetic electron. Since the background was very low, it would be the smoking gun for NCs. However, the cross-section was also very low, with only one to nine events expected from the electroweak calculations. A hadronic NC event (νµ + N → νµ + X, with the corresponding antiparticle process if the reaction was triggered by an antineutrino beam) would consist only of several hadrons, in fact just like events produced by incoming high-energy neutrons.

Gargamelle scanning table

“When the first leptonic event was found in December 1972 we were convinced that NCs existed,” says Gargamelle member Donatella Cavalli from the University of Milan. “It was just one event but with very low background, so a lot of effort was put into the search for hadronic NC events and in the full understanding of the background. I was the youngest in my group and I remember spending the evenings with my colleagues scanning the films on special projectors, which allowed us to observe the eight views of the chamber. I proudly remember my travels to Paris, London and Brussels, taking the photographs of the candidate events found in Milan to be checked with colleagues from other groups.”

At a CERN seminar on 19 July 1973, Paul Musset, who was one of the principal investigators, presented Gargamelle’s evidence for NCs based on both the leptonic and hadronic analyses. Results from the former had been published in a short paper received by Physics Letters two weeks earlier, while the paper on the hadronic events, which reported the actual observation and hence confirmation of neutral currents, was received on 23 July. In August 1973 Gerald Myatt of University College London, now at the University of Oxford, presented the results at the Electron–Photon conference. The papers were published in the same issue of the journal on 3 September. Yet many physicists doubted them. “It was generally believed that Gargamelle had made a mistake,” says Myatt. “There was only one event, a tiny track really, and very low background. Still, it was not seen as conclusive evidence.” Among the critical voices were T D Lee, who was utterly unimpressed, and Jack Steinberger, who went as far as to bet half his wine cellar that the Gargamelle result would be wrong. 

The difficulty was to demonstrate that the hadronic NC signal was not due to background from neutral hadrons. “A lot of work and many different checks were done, from calculations to a full Monte Carlo simulation to a comparison between spatial distributions of charged- and neutral-current events,” explains Cavalli. “We were really happy when we published the first results from hadronic and leptonic NCs after all background checks, because we were confident in our results.” Initially the Gargamelle results were confirmed by the independent HPWF (Harvard–Pennsylvania–Wisconsin–Fermilab) experiment at Fermilab. Unfortunately, a problem with the HPWF setup led to their paper being rewritten, and a new analysis presented in November 1973 showed no sign of NCs. It was not until the following year that the modified HPWF apparatus and other experiments confirmed Gargamelle’s findings. 

André Lagarrigue

Additionally, the collaboration managed to tick off number two on its list of physics priorities: deep-inelastic scattering and scaling. Confirming earlier results from SLAC, which showed that the proton is made of point-like constituents, Gargamelle data were crucial in proving that these constituents (quarks) have charges of +2/3 and –1/3. For neutral currents, the icing on the cake came 10 years after Gargamelle’s discovery with the direct observation of the Z (and W) bosons at the SppS collider in 1983. The next milestone for CERN in understanding weak interactions came in 1990 with the precise measurement of the decay width of the Z boson at LEP, which showed that there are exactly three species of light neutrino.

Legacy of a giantess

In 1977 Gargamelle was moved from the PS to the newly installed Super Proton Synchrotron (SPS). The following year, however, metal fatigue caused the chamber to crack and the experiment was decommissioned. Some of the collaboration members – including Cundy and Myatt – went to work on the nearby Big European Bubble Chamber. Also hooked up to the SPS for neutrino studies at that time were CDHS (CERN–Dortmund–Heidelberg–Saclay, officially denoted WA1) led by Steinberger, and Klaus Winter’s CHARM experiment. Operating for eight years, these large detectors collected millions of events that enabled precision studies on the structure of the charged and neutral currents as well as the structure of nucleons and the first evidence for QCD via scaling violations. 

The third type

The completion of the CHARM programme in 1991 marked the first halt of neutrino operations at CERN in almost 30 years. But not for long. Experimental activities restarted with the search for neutrino oscillations, driven by the idea that neutrinos were an important component of dark matter in the universe. Consequently, two similarly styled short-baseline neutrino-beam experiments – CHORUS and NOMAD – were built. These next-generation detectors, which took data from 1994 to 1998 and from 1995 to 1998, respectively, joined others around the world in looking for interactions of the third neutrino type, the ντ, and in searching for neutrino oscillations, i.e. the change of neutrino flavour as neutrinos propagate, which was proposed in the 1950s and established in 1998 by the Super-Kamiokande experiment in Japan, with confirmation following from the SNO experiment in Canada. In 2000 the DONUT experiment at Fermilab reported the first direct evidence for ντ interactions. 

Gargamelle bubble chamber

CERN’s neutrino programme entered a hiatus until July 2006, when the SPS began firing an intense beam of muon neutrinos 732 km through Earth to two huge detectors – ICARUS and OPERA – located underground at Gran Sasso National Laboratory in Italy. Designed to make precision measurements of neutrino oscillations, the CERN Neutrinos to Gran Sasso (CNGS) programme observed the oscillation of muon neutrinos into tau neutrinos and was completed in 2012. 

As the CERN neutrino-beam programme was wound down, a brand-new initiative to support fundamental neutrino research began. “The initial idea for a ‘neutrino platform’ at CERN was to do a short-baseline neutrino experiment involving ICARUS to check the LSND anomaly, and another to test prototypes for ‘LBNO’, which would have been a European long-baseline neutrino-oscillation experiment sending beams from CERN to Pyhäsalmi in Finland to investigate neutrino oscillations,” says Dario Autiero, who has been involved in CERN’s neutrino programme since the beginning of the 1980s. “The former eventually took place at Fermilab, while for the latter the European and US visions for long-baseline experiments found a consensus in what is now DUNE (the Deep Underground Neutrino Experiment) in the US.”

A unique facility

Officially launched in 2013 within the scope of the update to the European strategy for particle physics, the CERN Neutrino Platform serves as a unique R&D facility for next-generation long-baseline neutrino experiments. Its most prominent project is the design, construction and testing of prototype detectors for DUNE, which will see a neutrino beam from Fermilab sent 1300 km to the SURF laboratory in South Dakota. One of the Neutrino Platform’s early successes was the refurbishment of the ICARUS detector, which is now taking data in Fermilab’s short-baseline neutrino programme. The platform is also developing key technologies for the near detector of the Tokai-to-Kamioka (T2K) neutrino facility in Japan (see p10), and has a dedicated theory working group aimed at strengthening the connections between CERN and the worldwide neutrino community. Independently, the NA61 experiment at the SPS is contributing to a better understanding of neutrino–nucleon cross sections for DUNE and T2K data. 

Neutrino Platform at CERN’s North Area

More than 60 years after first putting the neutrino to work, CERN’s neutrino programme continues to evolve. In April 2023 a new experiment at the LHC called FASER made the first observation of neutrinos produced at a collider. Together with another new experiment, SND@LHC, FASER will enable the study of neutrinos in a new energy range and compare the production rate of all three types of neutrinos to further test the Standard Model. 

As for Gargamelle, today it lies next to BEBC and other retired colleagues in the garden of Square van Hove behind CERN’s main entrance. Not many can still retell the story of the discovery of neutral currents, but those who can share it with delight. “It was very tiny, that first track from the electron, one in hundreds of thousands of pictures,” says Myatt. “Yet it justified André Lagarrigue’s vision of the large heavy-liquid bubble chamber as an ideal detector of neutrinos, combining large mass with a very finely detailed picture of the interaction. There can be no doubt that it was these features that enabled Gargamelle to make one of the most significant discoveries in the history of CERN.”

An insight into the European Spallation Source

The European Spallation Source (ESS) is a European project with 13 member states and two host states. In this talk, Mats Lindroos will give examples of the science that will be done at ESS, in both applied and fundamental physics. He will speak about the in-kind model, which made it possible to build this facility on a greenfield site in a country without previous experience of much of the required technology.

Also reviewed will be the status of the project, with beam on target planned for 2025 and the start of the full user programme in 2027.

Mats Lindroos has a PhD in subatomic physics from Chalmers University of Technology in Gothenburg, Sweden, and has been an adjunct professor at Lund University since 2014. He worked at CERN from 1993 to 2009, starting as a research fellow at the ISOLDE facility and becoming a staff member in the CERN accelerator sector in 1995. Among other tasks, he has been responsible for PS Booster operation and technical coordination of the CERN ISOLDE facility. He has also been project leader of several CERN projects and had leading roles in several EC-supported design studies for future nuclear-physics and neutrino facilities. In 2009 he co-authored a book on a future neutrino-beam concept, beta-beams. Since 2009 he has been head of the accelerator division and a sub-project leader at the European Spallation Source ERIC (ESS) in Lund.

Proton structure consists of three distinct regions

Researchers at Jefferson Lab in the US have gained a deeper understanding of the role of gluons in providing mass to visible matter. Based on measurements of the photoproduction of J/ψ particles, the findings suggest that the proton’s structure has three distinct regions, with an inner core driven by gluonic interactions making up most of its mass.

Although the charge and spin of the proton have been studied extensively for decades, relatively little is known about its mass distribution. This is because gluons, which despite being massless provide a sizeable contribution to the proton’s mass, are electrically neutral and thus cannot be studied directly using electromagnetic probes. The Jefferson Lab team instead turned to the gluonic gravitational form factors (GFFs). Similar to electromagnetic form factors, which provide information about a hadron’s charge and magnetisation distributions, the GFFs (technically the matrix elements of the proton’s energy–momentum tensor) encode mechanical properties of the proton such as its mass, density, pressure and shear distributions.

To access the GFFs, the team measured the threshold cross section of exclusive J/ψ photoproduction at different energies by directing photons with energies between 9.1 and 10.6 GeV onto a liquid-hydrogen target. Gluons dominate the production of J/ψ at small momentum transfer, since J/ψ mesons share no valence quarks with the proton. Owing to the J/ψ’s vector quantum numbers, this process can proceed at certain energies via gluons in scalar (dilaton-like) and tensor (graviton-like) states. The researchers fed their cross-section results into QCD models describing the gluonic GFFs and extracted the parameters defining them, enabling a mass radius and a scalar radius to be deduced.

We need a new generation of high-precision J/ψ experiments to get a better picture

Zein-Eddine Meziani

The analysis revealed a scalar proton radius of 1 fm, which is substantially larger than both the charge radius (around 0.85 fm) and the proton mass radius (0.75 fm). This led the team to propose that the proton structure consists of three distinct regions: an inner core that makes up most of the mass radius and is dominated by the tensor gluonic field structure, followed by the charge radius resulting from the relativistic motion of quarks, all enveloped in a larger confining scalar gluon density.

“Given that the proton’s scalar gluon radius is the largest, we need to understand how this converts to our understanding of the gluonic structure of nuclei. For example, what would be the scalar radius of 4He compared to its charge radius?” says study leader Zein-Eddine Meziani of Argonne. The team plans to extend its studies to include the J/ψ decay to a muon final state, doubling the statistics of the current measurement, and to extract the gluon pressure distribution. “It is hard to say much right now, but this is a field in its infancy and the direct role of gluons in nuclei is not well understood,” adds Meziani. “We need a new generation of high-precision J/ψ experiments to get a better picture.”

Cold atoms for new physics

On 13 and 14 March CERN hosted an international workshop on atom interferometry and the prospects for future large-scale experiments employing this quantum-sensing technique. The workshop had some 300 registered participants, of whom about half participated in person. As outlined in a keynote introductory colloquium by Mark Kasevich (Stanford), one of the pioneers of the field, this quantum sensing technology holds great promise for making ultra-sensitive measurements in fundamental physics. Like light interferometry, atom interferometry involves measuring interference patterns, but between atomic wave packets rather than light waves. Interactions between coherent waves of ultralight bosonic dark matter and Standard Model particles could induce an observable shift in the interference phase, as could the passage of gravitational waves.

Atom interferometry is a well-established concept that can provide exceptionally high sensitivity, e.g., to inertial/gravitational effects. Experimental designs take advantage of features used by state-of-the-art atomic clocks in combination with established techniques for building inertial sensors. This makes atom interferometry an ideal candidate to hunt for physics beyond the Standard Model such as waves of ultralight bosonic dark matter, or to measure gravitational waves in a frequency range around 1 Hz that is inaccessible to laser interference experiments on Earth, such as LIGO, Virgo and KAGRA, or the upcoming space-borne experiment LISA. As discussed during the workshop, measurements of gravitational waves in this frequency range could reveal mergers of black holes with masses intermediate between those accessible to laser interferometers, casting light on the formation of the supermassive black holes known to inhabit the centres of galaxies. Atom interferometer experiments can also explore the limits of quantum mechanics and its interface with gravity, for example by measuring a gravitational analogue of the Aharonov-Bohm effect.
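The sensitivity argument can be made concrete with the standard three-pulse (Mach–Zehnder) light-pulse interferometer, whose leading gravitational phase shift is Δφ = k_eff · g · T², which is why long baselines (and hence long drift times T) pay off. The sketch below uses illustrative numbers of my own choosing (a rubidium two-photon transition at 780 nm and a 1 s pulse separation), not parameters of any experiment discussed at the workshop:

```python
import math

# Illustrative Mach–Zehnder atom-interferometer phase: Δφ = k_eff * g * T².
# All numbers below are assumptions for illustration only.
wavelength = 780e-9                      # m, Rb D2 line
k_eff = 2 * (2 * math.pi / wavelength)   # two-photon effective wavevector, 1/m
g = 9.81                                 # m/s², local gravitational acceleration
T = 1.0                                  # s, time between interferometer pulses

delta_phi = k_eff * g * T**2
print(f"Δφ ≈ {delta_phi:.2e} rad")       # ~1.6e8 rad for these numbers

# A phase resolution of 1 mrad would then correspond to a fractional
# sensitivity to accelerations of roughly:
sensitivity = 1e-3 / delta_phi
print(f"δg/g ≈ {sensitivity:.1e}")
```

Since T² enters directly, a 100 m vertical baseline with seconds of free fall gains orders of magnitude over tabletop devices, which is the motivation behind the large-scale projects discussed below.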

A deep shaft at Point 4 of the LHC is a promising location for an atom interferometer with a vertical baseline of over 100 m

Although the potential of atom interferometers for fundamental scientific measurements was the principal focus of the meeting, it was also emphasised that technologies based on the same principles also have wide-ranging practical applications. These include gravimetry, geodesy, navigation, time-keeping and Earth observation from space, providing, for example, a novel and sensitive technique for monitoring the effects of climate change through measurements of the Earth’s gravitational field.

Several large atom interferometers with lengths of around 10 m already exist, for example at Stanford University, or are planned, for example in Hannover (VLBAI), Wuhan and at Oxford University (AION). However, many of the proposed physics measurements require next-generation setups with lengths of around 100 m, and such experiments are under construction at Fermilab (MAGIS), in France (MIGA) and in China (ZAIGA). The Atom Interferometer Observatory and Network (AION) collaboration is evaluating possible sites in the UK and at CERN. In this context, a recent conceptual feasibility study supported by the CERN Physics Beyond Colliders study group concluded that a deep shaft at Point 4 of the LHC is a promising location for an atom interferometer with a vertical baseline of over 100 m. The March workshop provided a forum for discussing such projects, their current status, future plans and prospective sensitivities.

Looking further ahead, participants discussed the prospects for one or more km-scale atom interferometers, which would provide the maximal sensitivity possible with a terrestrial experiment to search for ultralight dark matter and gravitational waves. It was agreed that the global community interested in such experiments would work together towards establishing an informal proto-collaboration that could develop the science case for such facilities, provide a forum for exchanging ideas on the necessary technological advances, and develop a roadmap for their realisation.

A highlight of the workshop was a poster session that provided an opportunity for 30 early-career researchers to present their ideas and current work on projects exploiting the quantum properties of cold atoms and related topics. The liveliness of this session showed how this interdisciplinary field at the boundaries between atomic physics, particle physics, astrophysics and cosmology is inspiring the next generation of researchers. These researchers may form the core of the team that will lead atom interferometers to their full potential.

ATLAS increases precision on W mass

Latest ATLAS measurement

Since the discovery of the W boson at the SppS 40 years ago, collider experiments at CERN and elsewhere have measured its mass ever more precisely. Such measurements provide a vital test of the Standard Model’s consistency, since the W mass is closely related to the strength of the electroweak interaction and to the masses of the Z boson, top quark and Higgs boson; higher experimental precision is needed to keep up with the most recent electroweak calculations. 
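The relation alluded to here can be made explicit. In the on-shell scheme it is commonly written as the standard electroweak formula (with Δr collecting the loop corrections, which depend on the top-quark and Higgs masses):

```latex
M_W^{2}\left(1 - \frac{M_W^{2}}{M_Z^{2}}\right) = \frac{\pi\alpha}{\sqrt{2}\,G_F}\,(1 + \Delta r),
```

so a precise measurement of the W mass, together with α, G_F and M_Z, over-constrains the theory and tests its internal consistency.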

The latest experiment to weigh in on the W mass is ATLAS. Reanalysing a sample of 14 million W-boson candidates produced in proton–proton collisions at 7 TeV, the collaboration finds mW = 80.360 ± 0.005 (stat.) ± 0.015 (syst.) GeV = 80.360 ± 0.016 GeV. The value, presented on 23 March at the Rencontres de Moriond, is in agreement with all previous measurements except one – the latest measurement from the CDF experiment at the former Tevatron collider at Fermilab.

In 2017 ATLAS released its first measurement of the W-boson mass, determined using data recorded in 2011 when the LHC was running at a collision energy of 7 TeV (CERN Courier January/February 2017 p10). The precise result (80.370 ± 0.019 GeV) agreed with the Standard Model prediction (80.354 ± 0.007 GeV) and all previous experimental results, including those from the LEP experiments. But last year the CDF collaboration announced an even more precise measurement, based on an analysis of its full dataset (CERN Courier May/June 2022 p9). The result (80.434 ± 0.009 GeV) differed significantly from the Standard Model prediction and from the other experimental results (see figure), calling for more measurements to identify the source of the discrepancy. 
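The size of the tension can be illustrated with the quoted totals alone. The sketch below is a naive comparison of my own that ignores any correlated uncertainties between the experiments (a real combination, like the one described later, must account for them):

```python
import math

# Naive compatibility check between the CDF value and the new ATLAS value,
# using only the total uncertainties quoted in the text and ignoring
# correlations between the two measurements (a deliberate simplification).
m_atlas, err_atlas = 80.360, 0.016   # GeV
m_cdf,   err_cdf   = 80.434, 0.009   # GeV

diff = m_cdf - m_atlas                      # 0.074 GeV = 74 MeV
err = math.hypot(err_atlas, err_cdf)        # uncertainties added in quadrature
n_sigma = diff / err
print(f"difference ≈ {diff*1000:.0f} MeV, about {n_sigma:.1f}σ apart")
```

Even this crude estimate shows the two values sit roughly 4σ apart, which is why further measurements and a careful combination are needed.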

In its new study, ATLAS reanalysed its 2011 data sample using a more advanced fitting technique as well as improved knowledge of the parton distribution functions that describe how the proton’s momentum is shared amongst its constituent quarks and gluons. In addition, the collaboration verified the theoretical description of the W-production process using dedicated LHC proton–proton runs. The new result is 10 MeV lower than the previous ATLAS result and 15% more precise. 

“Due to an undetected neutrino in the particle’s decay, the W-mass measurement is among the most challenging precision measurements performed at hadron colliders. It requires extremely accurate calibration of the measured particle energies and momenta, and a careful assessment and excellent control of modelling uncertainties,” says ATLAS spokesperson Andreas Hoecker. “This updated result from ATLAS provides a stringent test and confirms the consistency of our theoretical understanding of electroweak interactions.” 

The LHCb collaboration reported a measurement of the W mass in 2021, while the results from CMS are keenly anticipated. In the meantime, physicists from the Tevatron+LHC W-mass combination working group are calculating a combined mass value using the latest measurements from the LHC, Tevatron and LEP. This involves a detailed investigation of higher-order theoretical effects affecting hadron-collider measurements, explains CDF representative Chris Hays from the University of Oxford: "The aim is to give a comprehensive and quantitative overview of W-boson mass measurements and their compatibilities. While no issues have been identified that significantly change the measurement results, the studies will shed light on their details and differences."

Searching for dark photons in beam-dump mode

NA62 detector

Faced with the absence of phenomena beyond the Standard Model at the high mass and energy scales explored so far by the LHC, physicists are increasingly considering the possibility that new physics hides "in plain sight": at mass scales that are easily accessible, but with very small coupling strengths. If this is the case, high-intensity experiments have an advantage: thanks to the large number of events they can generate, even the feeblest couplings, corresponding to the rarest processes, become accessible.

One such high-intensity experiment is NA62 at CERN's North Area. Designed to measure the ultra-rare kaon decay K⁺ → π⁺νν̄, it has also released several results probing the existence of weakly coupled processes that could become visible in its apparatus, a prominent example being the decay of a kaon into a pion and an axion. But NA62 can also probe this kind of physics in an unusual way, using a configuration that was not foreseen when the experiment was planned; the first result from this mode was recently reported.

During normal NA62 operations, bunches of 400 GeV protons from the SPS are fired onto a beryllium target to generate secondary mesons, from which only particles with a fixed momentum and charge are selected using an achromat. These particles (among them kaons) are then transported through a series of magnets and finally arrive at the detector 100 m downstream. In a series of studies starting in 2015, however, NA62 collaborators, with the help of phenomenologists, began to explore physics models that could be tested if the target were removed and protons were fired directly into a "dump", which can be arranged by moving the achromat collimators. They concluded that various processes exist in which new MeV-scale particles such as dark photons could be produced and detected via their decays into di-lepton final states. The challenge is to keep the muon-induced background under control, which cannot be easily understood from simulations alone.

A breakthrough came in 2018 when beam physicists in the North Area understood how the beamline magnets could be operated in such a way as to vastly reduce the background of both muons and hadrons. Instead of using the two pairs of dipoles as a beam achromat for momentum selection, the currents in the second pair are set to induce additional muon sweeping. The scheme was verified during a 2021 run lasting 10 days, during which 1.4 × 10¹⁷ protons were collected on the beam dump. The first analysis of this rapidly collected dataset – a search for dark photons decaying to a di-muon final state – has now been performed.

Hypothesised to mediate a new gauge force, dark photons, A′, could couple to the Standard Model via mixing with ordinary photons. In the modified NA62 set-up, dark photons could be produced either via bremsstrahlung or via decays of secondary mesons, the mechanisms differing in their cross-sections and in the distributions of the momenta and angles of the A′. No sign of A′ → μ⁺μ⁻ was found, excluding a region of parameter space for dark-photon masses between 215 and 550 MeV at 90% confidence level. A preliminary result for a search for A′ → e⁺e⁻ was also presented at the Rencontres de Moriond in March.

“This result is a milestone,” explains analysis leader Tommaso Spadaro of LNF Frascati. “It proves the capability of NA62 for studying physics in the beam-dump configuration and paves the way for upcoming analyses checking other final states.” 

X-ray source could reveal new class of supernovae

Large Magellanic Cloud

Type Ia supernovae play an important role in the universe, both as the main source of iron and as one of the principal tools astronomers use to measure cosmic distances. They are also important for astroparticle physics, for example allowing the properties of the neutrino to be probed in an extreme environment.

Type Ia supernovae make ideal cosmic rulers because they all look very similar, with roughly equal luminosity and emission characteristics. When a cosmic explosion matching the properties of a type Ia supernova is detected, its luminosity can therefore be used directly to measure the distance to its host galaxy. Despite this importance, the details surrounding the progenitors of these events are still not fully understood. Furthermore, a group of outliers, now known as type Iax events, has recently been identified, indicating that there might be more than one path towards a type Ia explosion.

The reason typical type Ia events all have roughly equal luminosity lies in their progenitors. The general explanation for these events involves a binary system with at least one white dwarf: a very dense old star, consisting mostly of oxygen and carbon, that is no longer undergoing fusion and is prevented from collapsing into a neutron star or black hole only by electron-degeneracy pressure. As the white dwarf accumulates matter from a nearby companion, its mass increases towards a precise critical limit (the Chandrasekhar mass, about 1.4 solar masses) at which an uncontrolled thermonuclear explosion starts, unbinding the star and producing the supernova.

This peculiar binary system provides strong hints of a new type of progenitor that can explain up to 30% of all type Ia supernova events

Because several X-ray sources identified in the 1990s by the ROSAT mission turned out to be white dwarfs with hydrogen burning on their surface, the matter accumulated by the white dwarf was long thought to be hydrogen from a companion star. The flaw in this model, however, is that type Ia supernovae show no signs of any hydrogen. On the other hand, helium has been seen, particularly in the outlier type Iax events. These Iax events, which are predicted to make up as much as 30% of all type Ia events, can be explained by a white dwarf accumulating helium from a companion star that has already shed all of its hydrogen. If the helium accumulates on the surface in a stable way, without intermediate explosions due to violent ignition of the helium, it eventually reaches a mass at which it ignites violently on the surface. This in turn triggers the ignition of the core and could explain the type Iax events. Evidence of helium-accreting white dwarfs had, however, not been found.

Now, a group led by researchers from the Max Planck Institute for Extraterrestrial Physics (MPE) has used both optical data and X-ray data from the eROSITA and XMM-Newton missions to find the first clear evidence of such a progenitor system. The group found an object, known as [HP99] 159, located in the Large Magellanic Cloud, which shows all the characteristics of a white dwarf surrounded by an accretion disk of helium. Using historical X-ray data reaching back 50 years, the team also showed that the brightness of the source is relatively stable, indicating that it is accumulating the helium at a steady rate, even though the accumulation rate is lower than theoretically predicted for stable burning. This suggests that the system is working its way towards ignition in the future.

The discovery of this new X-ray source therefore proves the existence of white dwarfs that accumulate helium from a companion star at a steady rate, thereby allowing them to reach the conditions to produce a supernova. This peculiar binary system already provides strong hints of a new type of progenitor that can explain up to 30% of all type Ia supernova events. Follow-up measurements will provide further insight into the complex physics at play in the thermonuclear explosions that produce these events, while [HP99] 159's characteristics can be used to find similar sources.

First collider neutrinos detected

Electron neutrino charged-current interaction

Since their discovery 67 years ago, neutrinos from a range of sources – solar, atmospheric, reactor, geological, accelerator and astrophysical – have provided ever more powerful probes of nature. Although neutrinos are also produced abundantly in colliders, until now no neutrinos produced in such a way had been detected, their presence inferred instead via missing energy and momentum. 

A new LHC experiment called FASER, which entered operation at the start of Run 3 last year, has changed this picture with the first observation of collider neutrinos. Announcing the result on 19 March at the Rencontres de Moriond, and in a paper submitted to Physical Review Letters on 24 March, the FASER collaboration reported 153 candidate muon-neutrino and antineutrino interactions reconstructed in its spectrometer, with a significance of 16 standard deviations above the background-only hypothesis. The events are consistent with the characteristics expected from neutrino interactions in terms of secondary-particle production and spatial distribution, implying the observation of both neutrinos and antineutrinos with incident neutrino energies significantly above 200 GeV. In addition, an ongoing analysis of data from an emulsion/tungsten subdetector called FASERν revealed a first electron-neutrino interaction candidate (see image).

“FASER has directly observed the interactions of neutrinos produced at a collider for the first time,” explains co-spokesperson Jamie Boyd of CERN. “This result shows the detector worked perfectly in 2022 and opens the door for many important future studies with high-energy neutrinos at the LHC.” 

The extreme luminosity of proton–proton collisions at the LHC produces a large neutrino flux in the forward direction, with energies leading to cross-sections high enough for neutrinos to be detected using a compact apparatus. FASER is one of two new forward experiments situated on either side of LHC Point 1 to detect neutrinos produced in proton–proton collisions in ATLAS. The other, SND@LHC, also reported its first results at Moriond. The team found eight muon-neutrino candidate events against an expected background of 0.2, with an evaluation of systematic uncertainties ongoing.
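To see why eight events over an expected background of 0.2 is striking, one can compute the Poisson probability of the background alone fluctuating up to eight or more events. This is a simple counting-experiment estimate, not the collaboration's full statistical treatment, which must also fold in the systematic uncertainties still being evaluated:

```python
import math

def poisson_tail(n_obs, mu_bkg):
    """P(N >= n_obs) for a Poisson-distributed count with mean mu_bkg."""
    return 1.0 - sum(math.exp(-mu_bkg) * mu_bkg**k / math.factorial(k)
                     for k in range(n_obs))

# SND@LHC: 8 observed muon-neutrino candidates vs 0.2 expected background
p = poisson_tail(8, 0.2)
print(f"background-only p-value: {p:.1e}")  # vanishingly small
```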

Covering energies between a few hundred GeV and several TeV, FASER and SND@LHC narrow the gap between fixed-target and astrophysical neutrinos. One of the unexplored physics topics to which they will contribute is the study of high-energy neutrinos from astrophysical sources. Since the production mechanism and energy of neutrinos at the LHC is similar to that of very-high-energy neutrinos from cosmic-ray collisions with the atmosphere, FASER and SND@LHC can be used to precisely estimate this background. Another application is to measure and compare the production rate of all three types of neutrinos, providing an important test of the Standard Model.

Beyond neutrinos, the two experiments open new searches for feebly interacting particles and other new physics. In a separate analysis, FASER presented first results from a search for dark photons decaying to an electron–positron pair. No events were seen in an almost background-free analysis, yielding new constraints on dark photons with couplings of 10⁻⁵ to 10⁻⁴ and masses between 10 and 100 MeV, in a region of parameter space motivated by dark matter.
