
Italy ramps up superconductor R&D

Developing high-temperature and high-magnetic-field superconducting technologies both for societal applications and next-generation particle accelerators is the goal of a new project in Italy called IRIS, launched in November and led by the INFN. IRIS (Innovative Research Infrastructure on applied Superconductivity) has received a €60 million grant from the Piano Nazionale di Ripresa e Resilienza to create a distributed R&D infrastructure throughout the country. It will focus on cables for low-loss electricity transport, and on the construction of superconducting magnets with high-temperature superconductors (HTS) in synergy with R&D for the proposed Future Circular Collider (FCC) at CERN. The project is estimated to last for 30 months, with more than 50% of the funds going to laboratories in the South of Italy.

One of the main objectives will be the construction in Salerno of a large infrastructure that will host not only a superconducting connection line but also a centre of excellence for testing future industrial products for high-power connections, with the aim of making high-temperature superconductors easier and cheaper to work with.

“With the IRIS project, Italy assumes a leading position in applied superconductivity, creating a real synergy between research institutions and universities, which will offer an important collaboration opportunity for particle physicists and those involved in the fields of superconductivity and magnetism,” explains IRIS technical coordinator Lucio Rossi of the University of Milan. “An aspect not to be overlooked is also the high educational value of the project, which will guarantee numerous doctoral and high-level training opportunities for about 100 students, young researchers and technicians.”

The activities of IRIS will be coordinated by the Laboratory of Accelerators and Applied Superconductivity (LASA) in Milan, with many partners including the universities of Genova, Milano, Naples, Salento and Salerno, and the CNR Institute for Superconductors, Innovative Materials and Devices (SPIN). 

“IRIS is a virtuous example of how basic research, and in this case particle and accelerator physics, can provide an important application in other science areas, such as the development of new materials for energy saving that is essential for the creation of high-power cables without dissipation and suitable for the needs of future electricity networks serving new energy sources,” says Pierluigi Campana of INFN Frascati, IRIS scientific coordinator.

Preparing for post-LS3 scenarios

Proposed experimental programmes

The Physics Beyond Colliders (PBC) study was launched in 2016 to explore the opportunities offered by CERN’s unique accelerator and experimental-area complex to address some of the outstanding questions in particle physics through experiments that are complementary to the high-energy frontier. Following the recommendations of the 2020 update of the European strategy for particle physics, the CERN directorate renewed the mandate of the PBC study, continuing it as a long-term activity.

The fourth PBC annual workshop took place at CERN from 7 to 9 November 2022. The aim was to review the status of the studies, with a focus on the programmes under consideration for the start of operations after Long Shutdown 3 (LS3), scheduled for 2026–2029.

The North Area (NA) at CERN, where experiments are driven by beams from the Super Proton Synchrotron (SPS), is at the heart of many present and proposed explorations for physics beyond the Standard Model. The NA includes an underground cavern (ECN3), which can host unique high-energy/high-intensity proton beams. Several proposals for experiments have been made, all of which require higher-intensity proton beams than are currently available. It is therefore timely to identify the synergies between a future ECN3 high-intensity programme and the ongoing NA technical consolidation programme, as well as the implications of the former for the latter.

The following proposals are being considered within the PBC study group:

• HIKE (High Intensity Kaon Experiment) is a proposed expansion of the current NA62 programme to study extremely rare decays of charged kaons and, in a second phase, those of neutral kaons. This would be complemented by searches for visible decays of feebly interacting particles (FIPs) that could emerge on-axis from the dump of an intense proton beam within a thick absorber that would contain all other known particles, except muons and neutrinos;

• SHADOWS (Search for Hidden And Dark Objects With the SPS) would search for visible FIP decays off-axis and could run in parallel to HIKE when operated in beam-dump mode. The proposed detector is compact and employs existing technologies to meet the challenges of reducing the muon background;

• SHiP (Search for Hidden Particles) would allow a full investigation of hidden sectors in the GeV mass range. Comprehensive design studies for SHiP and the Beam Dump Facility (BDF) in a dedicated experimental area were published in preparation for the European strategy update. During 2021, an analysis of alternative locations using existing infrastructure at CERN revealed ECN3 to be the most promising option;

• Finally, TauFV (Tau Flavour Violation) would conduct searches for lepton-flavour violating tau-lepton decays.

The HIKE, SHADOWS and BDF/SHiP collaborations have recently submitted letters of intent describing their proposals for experiments in ECN3. The technical feasibility of the experiments, their physics potential and implications for the NA consolidation are being evaluated in view of a possible decision by the beginning of 2023. A review of the experimental programme in the proposed high-intensity facility will take place during 2023, in parallel with a detailed comparison of the sensitivity to FIPs in a worldwide context.

A vibrant programme

The NA could also host a vibrant ion-physics programme after LS3, with NA60++ aiming to measure the caloric curve of the strong-force phase transition with lead–ion beams, and NA61++ proposing to explore the onset of the deconfined nuclear medium, extending the scan in beam momentum and ion species with collisions of lighter ion beams. The conceptual implementation of such schemes in the accelerators and experimental area is being studied, and the results, together with the analysis of the physics potential, are expected during 2023.

The search for long-lived particles with dedicated experiments and the exploration of fixed-target physics is also open at the LHC. The proposed forward-physics facility, located in a cavern that could be built at a distance of 600 m along the beam direction from LHC Interaction Point 1, would take advantage of the large flux of high-energy particles produced in the very forward direction in LHC collisions. It is proposed to host a comprehensive set of detectors (FASER2, FASERν2, AdvSND, FORMOSA, FLArE) to explore a broad range of new physics and to study the highest energy neutrinos produced by accelerators. A conceptual design report of the facility, including detector design, background analysis and mitigation measures, civil engineering and integration studies is in preparation. Small prototypes of the MATHUSLA, ANUBIS and CODEX-b detectors aiming at the search for long-lived particles at large angles from LHC collisions are also being built for installation during the current LHC run.

The North Area at CERN is at the heart of many present and proposed explorations for physics beyond the Standard Model

A gas-storage cell (SMOG2) was installed in front of the LHCb experiment during the last LHC long shutdown, opening the way to high-precision fixed-target measurements at the LHC. The storage cell enhances the density of the gas and therefore the rate of the collisions by up to two orders of magnitude as compared to the previous internal gas target. SMOG2 has been successfully commissioned with neon gas, demonstrating that it can be operated in parallel to LHCb. Future developments include the injection of different types of gases and a polarised gas target to explore nucleon spin-physics at the LHC.

Crystal clear

Fixed-target experiments are also being developed that would extract protons from LHC beams by channelling the beam halo with a bent crystal.  The extracted protons would impinge on a target and be used for measurements of proton structure functions (“single crystal setup”) or estimation of the magnetic and electric dipole moments of short-lived heavy baryons (“double crystal setup”). In the latter case, the measurement would be based on the baryon spin precession in the strong electric field of a second bent crystal installed immediately downstream from the baryon-production proton target. A proof-of-principle experiment of the double-crystal setup is being designed for installation in the LHC to determine the channelling efficiency for long crystals at TeV energies, as well as to demonstrate the control and management of the secondary halo and validate the estimate of the achievable luminosity.

The technology know-how at CERN can also benefit non-accelerator experiments

The technology know-how and experience available at CERN can also benefit non-accelerator experiments such as the Atom Interferometer Observatory and Network (AION), proposed to be installed in one of the shafts at Point 4 of the LHC for mid-frequency gravitational-wave detection and ultra-light dark-matter searches. The same expertise supports the development of superconducting cavities for the Relic Axion Detector Experimental Setup (RADES) and for the heterodyne detection of axion-like particles.

During the workshop, progress on the possible applications of a gamma factory at CERN, as well as the status of the design of a Charged-Particle EDM Prototype Ring and of the R&D for novel monitored or tagged neutrino beamlines, were also presented.

Neutrinos reveal active galaxy’s inner depths

Highly energetic cosmic rays reach Earth from all directions and at all times, yet it has been challenging to conclusively identify their sources. Being charged, cosmic rays are easily deflected by interstellar magnetic fields during their propagation and thereby lose any information about where they originated from. On the other hand, highly energetic photons and neutrinos remain undeflected. Observations of high-energy photons and neutrinos are therefore crucial clues towards unravelling the mystery of cosmic-ray sources and accelerators.

Four years ago, the IceCube collaboration announced the identification of the blazar TXS 0506+056 as a source of high-energy cosmic neutrinos, the first of its kind (CERN Courier September 2018 p7). This was one of the early examples of multi-messenger astronomy wherein a high-energy neutrino event detected by IceCube, which was coincident in direction and time with a gamma-ray flare from the blazar, prompted an investigation into this object as a potential astrophysical neutrino source.

Point source

In the following years, IceCube made a full-sky scan for point-like neutrino sources and, in 2020, the collaboration found an excess coincident with the Seyfert II galaxy NGC1068 that was inconsistent with a background-only hypothesis. However, with a statistical significance of only 2.9σ, the excess was insufficient to claim a detection. In November 2022, after a more detailed analysis with a longer live-time and improved methodologies, the collaboration confirmed NGC1068 to be a point source of high-energy neutrinos at a significance of 4.2σ.

IceCube’s measurements usher in a new era of neutrino astronomy

Messier 77, also known as the Squid Galaxy or NGC1068, is located 47 million light years away in the constellation Cetus and was discovered in 1780. Today, we know it to be an active galaxy: at its centre lies an active galactic nucleus (AGN), a luminous and compact region powered by a supermassive black hole (SMBH) surrounded by an accretion disk. Specifically, it is a Seyfert II galaxy, an active galaxy that is viewed edge-on, with the line of sight passing through the accretion region that obscures the centre.

The latest search used data from the fully completed IceCube detector. Several calibration and alignment improvements were also made to the data acquisition, and an advanced event-reconstruction algorithm was deployed. The search was conducted in the Northern Hemisphere of the sky, i.e. by detecting neutrinos from “below”, so that Earth could screen out background atmospheric muons.

Three different searches were carried out to locate possible point-like neutrino sources. The first involved scanning the sky for a statistically significant excess over background, while the other two used a catalogue of 110 sources that was developed in the 2020 study, the difference between the two being the statistical methods used. The results showed an excess of 79 (+22/−20) muon–neutrino events, with the main contribution coming from neutrinos in the energy range of 1.5 to 15 TeV, while the all-flavour flux is expected to be a factor of three higher. All the events contributing to the excess were well-reconstructed within the detector, with no signs of anomalies, and the results were found not to be dominated by just one or a few individual events. The results were also in line with phenomenological models that predict the production of neutrinos and gamma rays in sources such as NGC1068.

IceCube’s measurements usher in a new era of neutrino astronomy and take researchers a step closer to understanding not only the origin of high-energy cosmic rays but also the immense power of massive black holes, such as the one residing inside NGC1068.

Combining quantum with high-energy physics

From 1 to 4 November, the first International Conference on Quantum Technologies for High-Energy Physics (QT4HEP) was held at CERN. With 224 people attending in person and many more following online, the event brought together researchers from academia and industry to discuss recent developments and, in particular, to identify activities within particle physics that can benefit most from the application of quantum technologies.

Opening the event, Joachim Mnich, CERN director for research and computing, noted that CERN is widely recognised, including by its member states, as an important platform for promoting applications of quantum technologies for both particle physics and beyond. “The journey has just begun, and the road is still long,” he said, “but it is certain that deep collaboration between physicists and computing experts will be key in capitalising on the full potential of quantum technologies.”

The conference was organised by the CERN Quantum Technology Initiative (CERN QTI), which was established in 2020, and followed a successful workshop on quantum computing in 2018 that marked the beginning of a range of new investigations into quantum technologies at CERN. CERN QTI covers four main research areas: quantum theory and simulation; quantum sensing, metrology and materials; quantum computing and algorithms; and quantum communication and networks. The first day’s sessions focused on the first two: quantum theory and simulation, as well as quantum sensing, metrology and materials. Topics covered included the quantum simulation of neutrino oscillations, scaling up atomic interferometers for the detection of dark matter, and the application of quantum traps and clocks to new-physics searches.

Building partnerships

Participants showed an interest in broadening collaborations related to particle physics. Members of the quantum theory and quantum sensing communities discussed ways to identify and promote areas of promise relevant to CERN’s scientific programme. It is clear that many detectors in particle physics can be enhanced – or even made possible – through targeted R&D in quantum technologies. This fits well with ongoing efforts to implement a chapter on quantum technologies in the European Committee for Future Accelerators’ R&D roadmap for detectors, noted Michael Doser, who coordinates the branch of CERN QTI focused on sensing, metrology and materials.

For the theory and simulation branch of CERN QTI, the speakers provided a useful overview of quantum machine learning, quantum simulations of high-energy collider events and neutrino processes, and quantum-information studies of wormholes that can be tested on a quantum processor. Elina Fuchs, who coordinates this branch of CERN QTI, explained how quantum advantages have been found for toy models of increased physical relevance. Furthermore, she said, developing a dictionary that relates interactions at high energies to lower energies will enhance knowledge about new-physics models learned from quantum-sensing experiments.

The conference demonstrated the clear potential of different quantum technologies to impact upon particle-physics research

The second day’s sessions focused on the remaining two areas, with talks on quantum machine learning, noise gates for quantum computing, the journey towards a quantum internet, and much more. These talks clearly demonstrated the importance of working in interdisciplinary, heterogeneous teams when approaching particle-physics research with quantum-computing techniques. The technical talks also showed how studies of the algorithms are becoming more robust, with a focus on trying to address problems that are as realistic as possible.

A keynote talk from Yasser Omar, president of the Portuguese Quantum Institute, presented the “fleet” of programmes on quantum technologies that has been launched since the EU Quantum Flagship was announced in 2018. In particular, he highlighted QuantERA, a network of 39 funding organisations from 31 countries; QuIC, the European Quantum Industry Consortium; EuroQCI, the European Quantum Communication Infrastructure; EuroQCS, the European Quantum Computing and Simulation Infrastructure; and the many large national quantum initiatives being launched across Europe. The goal, he said, is to make Europe autonomous in quantum technologies, while remaining open to international collaboration. He also highlighted the role of World Quantum Day – founded in 2021 and celebrated each year on 14 April – in raising awareness around the world of quantum science.

Jay Gambetta, vice president of IBM Quantum, gave a fascinating talk on the path to quantum computers that exceed the capabilities of classical computers. “Particle physics is a promising area for looking for near-term quantum advantage,” he said. “Achieving this is going to take both partnership with experts in quantum information science and particle physics, as well as access to tools that will make this possible.”

Industry and impact

The third day’s sessions – organised in collaboration with CERN’s knowledge transfer group – were primarily dedicated to industrial co-development. Many of the extreme requirements faced by quantum technologies are shared with particle physics, such as superconducting materials, ultra-high vacuum, precise timing and much more. For this reason, CERN has built up a wealth of expertise and specific technologies that can directly address challenges in the quantum industry, and it works in many ways to ease the transfer of this knowledge to industry and society. One focus is to see which technologies might help to build robust quantum-computing devices. Already, CERN’s White Rabbit technology, which provides sub-nanosecond accuracy and picosecond precision of synchronisation for the LHC accelerator chain, has found its way to the quantum community, noted Han Dols, business development and entrepreneurship section leader.

Several of the day’s talks focused on challenges around trapped ions and control systems. Other topics covered included the potential of quantum computing for drug development, measuring brain function using quantum sensors, and developing specialised instrumentation for quantum computers. Representatives of several start-up companies, as well as from established technology leaders, including Intel, Atos and Roche, spoke during the day. The end of the third day was dedicated to crucial education, training and outreach initiatives. Google provided financial support for 11 students to attend the conference, and many students and researchers presented posters.

Marieke Hood, executive director for corporate affairs at the Geneva Science and Diplomacy Anticipator (GESDA) foundation, also gave a timely presentation about the recently announced Open Quantum Institute (OQI). CERN is part of a coalition of science and industry partners proposing the creation of this institute, which will work to ensure that emerging quantum technologies tackle key societal challenges. It was launched at the 2022 GESDA Summit in October, during which CERN Director-General Fabiola Gianotti highlighted the potential of quantum technologies to help achieve key UN Sustainable Development Goals. “The OQI acts at the interface of science and diplomacy,” said Hood. “We’re proud to count CERN as a key partner for OQI; its experience of multinational collaboration will be most useful to help us achieve these ambitions.”

The final day of the conference was dedicated to hands-on workshops with three different quantum-computing providers. In parallel, a two-day meeting of the “Quantum Computing 4HEP” working group, organised by CERN, DESY and the IBM Quantum Network, took place.

Qubit by qubit

Overall, the QT4HEP conference demonstrated the clear potential of different quantum technologies to impact upon particle-physics research. Some of these technologies are here today, while others are still a long way off. Targeted collaboration across disciplines and the academia–industry interface will help ensure that CERN’s research community is ready to make the most of the potential of these technologies.

“Widespread quantum computing may not be here yet, but events like this one provide a vital platform for assessing the opportunities this breakthrough technology could deliver for science,” said Enrica Porcari, head of the CERN IT department. “Through this event and the CERN QTI, we are building on CERN’s tradition of bringing communities together for open discussion, exploration, co-design and co-development of new technologies.”

Playing in the sandbox of geometry

Maryna Viazovska

When did you first know you had a passion for pure mathematics? 

I have had a passion for mathematics since my first year in school. At that time I did not realise what “pure mathematics” was, but maths was my favourite subject from a very early age.

What is number theory, in terms that a humble particle physicist can understand?

In fact, “number theory” is not well defined and any interesting question about numbers, geometric shapes and functions can be seen as a question for a number theorist.

What motivated you to work on sphere-packing? 

I think it is a beautiful problem, something that can be easily explained. Physicists know what a Euclidean space and a sphere are, and everybody knows the problem from stacking oranges or apples. What is a bit harder to explain is that mathematicians are not trying to model a particular physical situation. Mathematicians are not bound to phenomena in nature to justify their work, they just do it. We do not need to model any physical situation, which is a luxury. The work could have an accidental application, but this is not the primary goal. Physicists, especially theorists, are used to working in multi-dimensional spaces. At the same time, these dimensions have a special interpretation in physics. 

What fascinates you most about working on theoretical rather than applied mathematics?

My motivation often comes out of curiosity and my belief that the solutions to the problems will become useful at some point in the future. But it is not my job to judge or to define the usefulness. My belief is that the fundamental questions must be answered, so that other people can use this knowledge later. It is important to understand the phenomena in mathematics and in science in general, and there is always the possibility of discovering something that other people have not yet found. Maybe it is even possible to come up with new ideas for detectors, which would be interesting. When I look at physics detectors, for example, it fascinates me how complex these machines are and how many tiny technical solutions must be invented to make it all work.

How did you go about cracking the sphere-packing problem?

I think there was an element of luck that I could find the correct idea to solve this problem because many people worked on it before. I was fortunate to find the right solution. The initial problem came from geometry, but the final solution came from Fourier analysis, via a method called linear programming. 

I think a mathematical reality exists on its own and sometimes it does describe actual physical phenomena

In 2003, mathematicians Henry Cohn and Noam Elkies applied the linear programming method to the sphere-packing problem and numerically obtained a nearly optimal upper bound in dimensions 8 and 24. Their method relied on constructing an auxiliary, “magic”, function. They computed this function numerically but could not find an explicit formula for it. My contribution was to find the explicit formula for the magic function.
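
Roughly stated, and paraphrasing the Cohn–Elkies criterion only for orientation (the precise admissibility conditions and normalisations are in their paper, not in this interview): if one can exhibit a well-behaved function $f$ on $\mathbb{R}^n$ satisfying

$$ f(0) = \hat{f}(0) > 0, \qquad f(x) \le 0 \ \text{ for } |x| \ge r, \qquad \hat{f}(t) \ge 0 \ \text{ for all } t, $$

then every sphere packing in $\mathbb{R}^n$ has density at most $\mathrm{vol}\big(B^n_{r/2}\big)$, the volume of a ball of radius $r/2$. A “magic” function is one for which this bound is attained exactly by an actual packing, which is what happens in dimension 8.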

What applications does your work have, for example in quantum gravity? 

After I solved the sphere-packing problem in dimension 8 in 2016, CERN physicists worked on the relation between two-dimensional conformal field theory and quantum gravity. From what I understand, conformal field theories are mathematically totally different from sphere-packing problems. However, if one wants to optimise certain parameters in the conformal field theory, physicists use a method called “bootstrap”, which is similar to the linear programming that I used. The magic functions I used to solve the sphere-packing problem were independently rediscovered by Thomas Hartman, Dalimil Mazáč and Leonardo Rastelli.

Are there applications beyond physics?

One of the founders of modern computer science, Claude Shannon, realised that sphere-packing problems are not only interesting geometric problems that pure mathematicians like me can play with, but also a good model for error-correcting codes, which is why higher-dimensional sphere-packing problems became interesting for mathematicians. A very simplified version of the original model could be the following. An error is introduced during the transmission of a message. Assuming the error is under control, the corrupted message is still close to the original message. The remedy is to select different versions of the messages called codewords, which we think are close to the original message but at the same time far away from each other, so that they do not mix with each other. In geometric language, this situation is an exact analogy of sphere packing, where each codeword represents the centre of a sphere and the sphere around the centre represents the cloud of possible errors. The spheres will not intersect if their centres are far enough away from each other, which allows us to decode the corrupted message.
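
To make the analogy concrete, here is a minimal sketch of nearest-codeword decoding in Python; the four-dimensional codewords and the error below are made-up illustrative values, not anything taken from the interview:

import itertools

# Toy codewords in 4-dimensional Euclidean space: the centres of the "spheres"
# (hypothetical values chosen only for illustration).
codewords = [
    (0.0, 0.0, 0.0, 0.0),
    (3.0, 3.0, 0.0, 0.0),
    (0.0, 0.0, 3.0, 3.0),
    (3.0, 0.0, 3.0, 0.0),
]

def distance(u, v):
    """Euclidean distance between two points."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def min_distance(words):
    """Smallest pairwise distance: twice the radius of non-overlapping spheres."""
    return min(distance(u, v) for u, v in itertools.combinations(words, 2))

def decode(received, words):
    """Map a corrupted message to the nearest codeword (sphere centre)."""
    return min(words, key=lambda w: distance(received, w))

d_min = min_distance(codewords)
sent = codewords[1]
corrupted = (2.6, 3.3, 0.4, -0.2)   # a small transmission error
print("decoding radius:", round(d_min / 2, 2))
print("error size:     ", round(distance(sent, corrupted), 2))
print("decoded word:   ", decode(corrupted, codewords))

Decoding is guaranteed to recover the sent codeword as long as the error stays strictly inside a sphere of radius half the minimum distance between codewords, which is exactly the condition that the spheres around the codewords do not overlap.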

Do you view mathematics as a tool, or a deeper property of reality?

Maybe it is a bit idealistic, but I think a mathematical reality exists on its own and sometimes it does describe actual physical phenomena, but it deserves our attention even when it does not. Working in this abstract mathematical world, we have chances to realise that something in it is connected to other fields, such as physics, biology or computer science. Here I think it’s good to know that the laws of this abstract world often provide us with useful gadgets, which can be used later to describe the other realities. This whole process is a kind of “spiral of knowledge” and we are in one of its turns.

Hunting dark matter with invisible Higgs decays

CMS figure 1

In the Standard Model (SM) of particle physics, the only way the Higgs boson can decay without leaving any traces in the LHC detectors is through the four-neutrino decay, H → ZZ → 4ν, which has an expected branching fraction of only 0.1%. This very small value can be seen as a difficulty but is also an exciting opportunity. Indeed, several theories of physics beyond the SM predict considerably enhanced values for the branching fraction of invisible Higgs-boson decays. In one of the most interesting scenarios, the Higgs boson acts as a portal to the dark sector by decaying to a pair of dark matter (DM) particles. Measurements of the “Higgs to invisible” branching fraction are clearly among the most important tools available to the LHC experiments in their searches for direct evidence of DM particles.
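
As a rough cross-check of that number (the input branching fractions below are approximate standard values, not quoted in the article):

$$ \mathcal{B}(H \to ZZ^{*} \to 4\nu) \;\approx\; \mathcal{B}(H \to ZZ^{*}) \times \mathcal{B}(Z \to \nu\bar{\nu})^{2} \;\approx\; 0.026 \times (0.20)^{2} \;\approx\; 1 \times 10^{-3}. $$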

The CMS collaboration recently reported the combined results of different searches for invisible Higgs-boson decays, using data collected at 7, 8 and 13 TeV centre-of-mass energies. To find such a rare signal among the overwhelming background produced by SM processes, the study considers events in most Higgs-boson production modes: via vector boson (W or Z) fusion, via gluon fusion and in association with a top quark–antiquark pair or a vector boson. In particular, the analysis looked at hadronically decaying vector bosons or top quark–antiquark pairs. A typical signature for invisible Higgs-boson decays is a large missing energy in the detector, so that the missing transverse energy plays a crucial role in the analysis. No significant signal has been seen, so a new and stricter upper limit is set on the probability that the Higgs boson decays to invisible particles: 15% at 95% confidence level.

This result has been interpreted in the context of Higgs-portal models, which introduce a dark Higgs sector and consider several dark Higgs-boson masses. The extracted upper limits on the spin-independent DM–nucleon scattering cross section, shown in figure 1 for a range of DM mass points, have better sensitivities than those of direct searches over the 1–100 GeV range of DM masses. Once the Run 3 data are added to the analysis, much stricter limits will be reached or, if we are lucky, evidence for DM production at the LHC will be seen.

Testing flavour symmetry with the Higgs boson

ATLAS figure 1

Lepton number is a quantum number that represents the difference in the number of leptons and antileptons participating in a process, while lepton flavour is a corresponding quantity that accounts for each generation of lepton (e, μ or τ) separately. Lepton number is always conserved but lepton flavour violation (LFV) is known to exist in nature, as this phenomenon has been observed in neutrino oscillations – the transition of a neutral lepton of a given flavour to one with a different flavour. This observation motivates searches for additional manifestations of LFV that may be the result of beyond-the-Standard Model (SM) physics, key among which is the search for LFV decays of the Higgs boson. 

The ATLAS collaboration has recently announced the results of searches for H → eτ and H → μτ decays based on the full Run 2 data set, which was collected at a centre-of-mass energy of 13 TeV. The unstable τ lepton decays to an electron or a muon and two neutrinos, or to one or more hadrons and one neutrino. Most of the background events in these searches arise from SM processes such as Z → ττ, the production of top–antitop and weak-boson pairs, as well as from events containing misidentified or non-prompt leptons (fake leptons). These fake leptons originate from secondary decays, for example of charged pions. Several multivariate analysis techniques were used for each final state to provide the maximum separation between signal and background events.

To ensure the robustness of the measurement, two background estimation methods were employed: a Monte Carlo (MC) template method in which the background shapes were extracted from MC and normalised to data, and a “symmetry method”, which used only the data and relied on an approximate symmetry between prompt electrons and prompt muons. Any difference between the branching fractions B(H → eτμ) and B(H → μτe), where the subscripts μ and e represent the decay modes of the τ lepton, would break this symmetry. In both cases, contributions from events containing fake leptons were estimated directly from the data.

The MC-template method enables the measurement of the branching ratios of the LFV decay modes. Searches based on the MC-template method for background estimation involve both leptonic and hadronic decays of τ leptons. A simultaneous measurement of the H → eτ and H → μτ decay modes was performed. For the H → μτ (H → eτ) search, a 2.5 (1.6) standard deviation upward fluctuation above the SM background prediction is observed. The observed (expected) upper limits on the branching fractions B(H → eτ) and B(H → μτ) at 95% confidence level are slightly below 0.2% (0.1%), which are the most stringent limits obtained by the ATLAS experiment on these quantities. The result of the simultaneous measurement of the H → eτ and H → μτ branching fractions is compatible with the SM prediction within 2.2 standard deviations (see figure 1).

The observed upper limits on the branching fractions are the most stringent limits obtained by the ATLAS experiment

The symmetry method is particularly sensitive to the difference in the two LFV decay branching ratios. For this measurement, only the fully leptonic final states were used. Special attention was paid to correctly account for asymmetries induced by the different detector response to electrons and muons, especially regarding the trigger and offline efficiency values for lepton reconstruction, identification and isolation, as well as regarding contributions from fake leptons. The measurement of the branching-ratio difference indicates a small but not significant upward deviation for H → μτ compared to H → eτ. The best-fit value for the difference between B(H → μτe) and B(H → eτμ) is (0.25 ± 0.10)%.

The LHC Run 3 dataset, expected to be twice as large and collected at the higher centre-of-mass energy of 13.6 TeV, will shed further light on these results.

B to D decays reduce uncertainty on γ

LHCb figure 1

The Cabibbo–Kobayashi–Maskawa (CKM) matrix describes the couplings between the quarks and the weak charged current, and contains within it a phase γ that changes sign under consideration of antiquarks rather than quarks. In the Standard Model (SM), this phase is the only known difference in the interactions of matter and antimatter, a consequence of the breaking of charge-parity (CP) symmetry. While the differences within the SM are known to be far too small to explain the matter-dominated universe, it is still of paramount importance to precisely determine this phase to provide a benchmark against which any contribution from new physics can be compared.

A new measurement recently presented by the LHCb collaboration uses a novel method to determine γ using decays of the type B± → D[→ K∓π±π±π∓]h± (h = π, K). CP violation in such decays is a consequence of the interference between two tree-level processes with a weak phase that differs by γ, and such decays thus provide a theoretically clean probe of the SM. The new aspect of this measurement compared to those performed previously lies in the partitioning of the five-dimensional phase space of the D decay into a series of independent regions, or bins. In these bins, the asymmetries between B+ and B− meson decay rates can receive large enhancements from the hadronic interactions in the D-meson decay. The enhancement for one such bin can be seen in figure 1, which shows the invariant mass spectrum of the B+ and B− meson candidates, where the correctly reconstructed decays peak at around 5.3 GeV. The observed asymmetry in this region is around 85%, which is the largest difference in the behaviour of matter and antimatter ever measured. Observables from the different bins are combined with information on the hadronic interactions in the D-meson decay from charm-threshold experiments to obtain γ = 55 ± 9°, which is compatible with previous determinations and is the second most precise single measurement.
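
To put the quoted 85% in perspective, the charge asymmetry in a given bin is conventionally built from the two charge-conjugate yields; the definition below is the standard one, sketched here rather than taken from the analysis:

$$ A = \frac{N(B^{-}) - N(B^{+})}{N(B^{-}) + N(B^{+})}, \qquad |A| \approx 0.85 \;\Longrightarrow\; \frac{N_{\mathrm{larger}}}{N_{\mathrm{smaller}}} \approx \frac{1 + 0.85}{1 - 0.85} \approx 12, $$

i.e. in this bin one B charge decays to the selected final state roughly twelve times more often than the other.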

The matter–antimatter asymmetry reaches 85% in a certain region, the largest ever observed

The LHCb average value of γ is then determined by combining this analysis with measurements in many other B and D decays, where in all cases the SM contribution is expected to be dominant. Measurements of charm decays are also included to better constrain the parameters of charm mixing, which play an important role in the measurements of B-meson decays at the current level of precision, and to help constrain the hadronic interactions in some of the D decays. In particular, included for the first time in this combination is a measurement of yCP, which is proportional to the difference in lifetimes of the two neutral charm mesons and was determined using two-body decays of the D meson with the entire LHCb data set collected so far.

The overall impact of these additional analyses reduces the uncertainty on γ by more than 10%, corresponding to adding around a year of data taking across all decay modes. 

The improvement in the knowledge of yCP is also dramatic, reducing the uncertainty by around 40%. While the value of γ is found to be compatible with determinations that would be more susceptible to new physics, the precision of the comparison is starting to approach the level of a few degrees, at which discrepancies may start to be observable.

Given that the current uncertainties on many of the key input analyses to the combination are predominately statistical in nature, measurements of these fundamental flavour-physics parameters with the upgraded LHCb detector, and beyond, are an intriguing prospect for new-physics searches.

Hidden charm in the quark–gluon plasma

ALICE figure 1

For almost 40 years, charmonium, a bound state of a heavy charm–anticharm pair (hence also called hidden charm), has provided a unique probe of the properties of the quark–gluon plasma (QGP), the state of matter composed of deconfined quarks and gluons that was present in the first instants of the universe and is produced experimentally in ultrarelativistic heavy-ion collisions. Charmonia come in a rich variety of states. In a new analysis investigating how these different bound charmonium states are affected by the QGP, the ALICE collaboration has opened a novel way to study the strong interaction at extreme temperatures and densities.

In the QGP, the production of charmonium is suppressed due to “colour screening” by the large number of quarks and gluons present. The screening, and thus the suppression, increases with the temperature of the QGP and is expected to affect different charmonium states to different degrees. The production of the ψ(2S) state, for example, which is 10 times more weakly bound and two times larger in size than the most tightly bound state, the J/ψ, is expected to be more suppressed. 

This hierarchical suppression is not the only fate of charmonia in the quark–gluon plasma. The large number of charm quarks and antiquarks in the plasma – up to about 100 in head-on lead–lead collisions – also gives rise to a mechanism, called recombination, that forms new charmonia and counters the suppression to a certain extent. This process is expected to depend on the type and momentum of the charmonia, with the more weakly bound charmonia being produced through recombination later in the evolution of the plasma and charmonia with the lowest (transverse) momentum having the highest recombination rate. 

Previous studies, using data first from the Super Proton Synchrotron and then from the LHC, have shown that the production of the ψ(2S) state is indeed more suppressed than that of the J/ψ, and ALICE has also previously provided evidence of the recombination mechanism in J/ψ production. But so far, no studies of ψ(2S) production at low transverse particle momentum had been precise enough to provide conclusive results in this momentum regime, preventing a complete picture of ψ(2S) production from being obtained.

The ALICE collaboration has now reported the first measurements of ψ(2S) production down to zero transverse momentum, based on lead–lead collision data from the LHC collected in 2015 and 2018. The results indicate that the ψ(2S) yield is largely suppressed with respect to a proton–proton baseline, almost a factor of two more suppressed than the J/ψ. The suppression, shown as a function of the collision centrality (Npart) in the figure, is quantified through the nuclear modification factor (RAA), which compares the particle production in lead–lead collisions with respect to the expectations based on proton–proton collisions.  
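
For reference, the nuclear modification factor is conventionally defined as (standard notation, not spelled out in the article):

$$ R_{AA}(p_{T}) \;=\; \frac{\mathrm{d}N_{AA}/\mathrm{d}p_{T}}{\langle N_{\mathrm{coll}} \rangle \, \mathrm{d}N_{pp}/\mathrm{d}p_{T}}, $$

where ⟨Ncoll⟩ is the average number of binary nucleon–nucleon collisions in the given centrality class: RAA = 1 corresponds to lead–lead collisions behaving like an incoherent superposition of proton–proton collisions, while RAA < 1 signals suppression.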

Theoretical predictions based on a transport approach that includes suppression and recombination of charmonia in the QGP (TAMU) or on the Statistical Hadronisation Model (SHMc), which assumes charmonia to be formed only at hadronisation, describe the J/ψ data, while the ψ(2S) production in central events is underestimated by the SHMc. This observation represents one of the first indications that dynamical effects in the QGP, as taken into account in the transport models, are needed to reproduce the yields of the various charmonium states. It also shows that precision studies of these and other charmonia, foreseen for Run 3 of the LHC, may lead to a final understanding of the modification of the force binding these states in the extreme environment of the QGP.

50 Years of Theoretical Physics

Frank Wilczek: 50 Years of Theoretical Physics

This carefully crafted edition highlights the scientific life of 2004 Nobel laureate Frank Anthony Wilczek, and the developments of theoretical physics related to his research. Frank Wilczek: 50 Years of Theoretical Physics is a collection of essays, original research papers and the reminiscences of Wilczek’s friends, students and followers. Wilczek is an exceptional physicist with an extraordinary mathematical talent. The 23 articles represent his vivid research journey from pure particle physics to cosmology, quantum black holes, gravitation, dark matter, applications of field theory to condensed matter physics, quantum mechanics, quantum computing and beyond.

In 1973 Wilczek discovered, together with his doctoral advisor David Gross, asymptotic freedom, through which the field theory of the strong interaction, quantum chromodynamics (QCD), was firmly established. Independently that year, the same work was done by David Politzer, and all three shared the Nobel prize in 2004. Wilczek’s major work includes the solution of the strong-CP problem by predicting the hypothetical axion, a result of the spontaneously broken Peccei–Quinn symmetry. In 1982 he predicted the quasiparticle “anyon”, for which evidence was found in a 2D electronic system in 2020. Anyons fill the need for a new type of particle statistics in two-dimensional systems, to which the familiar classification into fermions and bosons does not carry over.

Original research papers included in this book were written by pioneering scientists, such as Roman Jackiw and Edward Witten, who are either co-inventors or followers of Wilczek’s work. The articles cover recent developments of QCD, quantum-Hall liquids, gravitational waves, dark energy, superfluidity, the Standard Model, symmetry breaking, quantum time-crystals, quantum gravity and more. Many colour photographs, musical tributes to anyons, memories of quantum-connection workshops and his contribution to the Tsung-Dao Lee Institute in Shanghai complement the volume. The book ends with Wilczek’s publication list, which documents the most significant developments in theoretical particle physics during the past 50 years.

Wilczek is an exceptional physicist with an extraordinary mathematical talent

Though this book is not an easy read in places, and the connections between articles are not always clear, a patient and careful reader will be rewarded. The collection combines rigorous scientific discussions with an admixture of Wilczek’s life, wit, scientific thoughts and teaching – a precious and timely tribute to an exceptional physicist.
