Topics

An extraordinary harvest of new results

The 57th Rencontres de Moriond conference on electroweak interactions and unified theories, which took place from 18 to 25 March on the Alpine slopes of La Thuile, Italy, saw over 150 physicists meet in person for a week packed with physics. More than 100 talks on the latest experimental results and theoretical ideas were actively debated, not only during the sessions but also during breaks and meal times, in a stimulating and congenial atmosphere. The talks covered all the important areas of electroweak physics, with experiment and theory providing complementary approaches to some of the most pressing problems in particle physics and cosmology.

Neutrinos first
Neutrino masses and mixing provide a unique window on the only new physics so far seen beyond the Standard Model. The measured mass differences and mixing parameters provide a consistent picture suggesting the presence of a new scale, potentially at approximately 10¹⁵ GeV. However, two fundamental elements are missing to complete this picture: the absolute mass scale of neutrinos and the determination, via neutrinoless double-beta decay, of whether neutrinos have a Majorana nature. Also of fundamental importance are the mass-squared ordering of the neutrinos, the maximality (or not) of atmospheric mixing, and the measurement of leptonic CP violation. All these questions were addressed by a range of new experimental results, many of which were presented for the first time.
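The quoted scale can be understood via the standard seesaw estimate, a textbook back-of-the-envelope argument added here for illustration: light-neutrino masses generated by a heavy scale Λ obey, parametrically,

```latex
m_\nu \;\sim\; \frac{v^2}{\Lambda}
\quad\Longrightarrow\quad
\Lambda \;\sim\; \frac{v^2}{m_\nu}
\;\approx\; \frac{(174\ \mathrm{GeV})^2}{0.05\ \mathrm{eV}}
\;\approx\; 6\times 10^{14}\ \mathrm{GeV},
```

where v is the Higgs vacuum expectation value and 0.05 eV is taken near the atmospheric mass-splitting scale, landing within an order of magnitude of 10¹⁵ GeV.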

NOvA and T2K presented a very consistent picture of the PMNS framework with a slight preference for the normal over the inverted ordering

The KATRIN collaboration reported an absolute upper limit on the electron-neutrino mass of 800 meV and is expected eventually to reach a limit of 200 meV. With a detailed analysis of their tritium decay spectrum, the team was also able to exclude rapid oscillations of electron neutrinos with potential sterile neutrinos and to set a limit on local over-densities of cosmic neutrinos. The KamLAND-Zen, CUPID-Mo and Majorana Demonstrator experiments showed first results of neutrinoless double-beta decay searches in different systems. KamLAND-Zen, with the largest number of radionuclei, provided upper limits on the effective electron-neutrino mass between 36 and 156 meV (depending on model assumptions) and is expected to reach 20 meV with more data. The CUPID-Mo and Majorana Demonstrator experiments are expected eventually to reach stronger limits, down to approximately 10 meV. The latter experiment, based on germanium detectors, also reported interesting bounds on models of wave-function collapse.


The long-baseline νμ oscillation experiments NOvA and T2K presented analyses of their latest intermediate datasets, showing a very consistent picture of the PMNS framework with a slight preference (at the one or two standard-deviation level) for the normal over the inverted ordering and for the upper over the lower octant of θ₂₃. Both experiments are sensitive to electron-neutrino appearance. NOvA, moreover, provided the first evidence for electron anti-neutrino appearance and a first long-baseline measurement of sin²θ₁₃, in very good agreement with the reactor-neutrino data. Both experiments exclude CP-conserving values of δCP of 0 or π at 90% confidence. IceCube, with its DeepCore extension, also presented stunning atmospheric neutrino-oscillation results comparable with Super-Kamiokande and long-baseline experiment sensitivities. All these experiments provide strong supporting evidence for the validity of the three-neutrino-flavour paradigm.

Longstanding neutrino anomalies were discussed in detail. The interpretation of the reactor-neutrino deficit in terms of the existence of a sterile neutrino species is incompatible with several short-baseline datasets. The significance of the LSND and MiniBooNE short-baseline low-energy excesses was revisited in the light of new backgrounds. The long-standing gallium anomaly was verified and confirmed by the independent experiment BEST; the BEST observations are, however, not compatible with a simple sterile-neutrino oscillation pattern either. The PROSPECT reactor-neutrino experiment also showed first results excluding an interpretation of the gallium anomaly in terms of oscillation with a sterile neutrino. Finally, a peaking anomaly in the 5–7 MeV range has been observed by several experiments (including RENO, Daya Bay, NEOS, Chooz and PROSPECT). This anomaly cannot easily be interpreted in terms of fundamental neutrino physics; instead, nuclear models were discussed in detail and should be looked at carefully.

Finally, results from CONUS, a coherent elastic neutrino–nucleus scattering experiment based on high-precision germanium detectors, set limits on light vector mediators and on the neutrino magnetic moment.

The three-neutrino paradigm is standing tall, with some anomalies that need to be further clarified, in particular the BEST gallium anomaly.

On the theoretical side, it was shown that leptogenesis is possible for right-handed neutrino masses anywhere above about 0.1 GeV; if light enough, such states can be probed by the proposed SHiP experiment at CERN, as well as by FCC-ee and the HL-LHC. Neutrino experiments such as COHERENT were analysed in the framework of Standard Model Effective Field Theory.

The IceCube experiment also showed splendid multi-messenger results from high- and ultra-high-energy neutrino observations and pointed out the ability to probe the Standard Model with ultra-high-energy neutrinos that have travelled cosmic distances. These neutrinos are expected to arrive as even mixtures of the three neutrino species; any deviation would be a clear sign of new physics. The cosmic-neutrino data also highlighted the missing measurements of neutrino–nucleon interactions in the range from a few hundred GeV to 10 TeV. The birth of collider neutrino physics was also presented at this year's Moriond, with the first results from the FASERν and SND experiments: FASERν showed the first unambiguous observation of neutrinos from proton–proton collisions at LHC Point 1.

Overall, the three-neutrino paradigm is standing tall, with some anomalies that still need to be further clarified, in particular the BEST gallium anomaly.

From neutrinos to quarks

From a theoretical point of view, neutrino and heavy-quark physics are two sides of the same coin: both provide information on the flavour problem, namely the unexplained origin of the quark and lepton families, masses and mixings. The fact that in the Standard Model the fermion mass hierarchies arise from Yukawa couplings does not make the picture any more satisfactory. The recently observed anomalies in semi-leptonic B decays, exhibiting unexpected lepton-flavour patterns, prompted much speculation and in particular suggested that the flavour scale might be right around the corner at the TeV scale, motivating models discussed at the conference involving a new Z′ gauge boson, a scalar leptoquark, or a vector leptoquark from a twin Pati–Salam theory of flavour.

However, recent results from LHCb on the main anomalies have shed new light on the question. LHCb discussed its reanalysis of R(K) and R(K*), the ratios of the B→K(*)μμ and B→K(*)ee decay rates: with the inclusion of an additional background from misidentified electrons, the results are now in excellent agreement with the Standard Model. LHCb also presented a new measurement of the R(D*) ratio of decay rates including fully hadronic τ decays, and a new combined measurement of the R(D) and R(D*) ratios. With these new measurements the R(D*) ratio agrees with the Standard Model predictions; a tension at the three-standard-deviation level is still observed, driven mostly by the R(D) ratio.
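For reference, the lepton-flavour-universality ratios in question are defined (schematically, integrated over the relevant q² ranges) as

```latex
R_{K^{(*)}} \;=\; \frac{\mathcal{B}\big(B \to K^{(*)}\mu^+\mu^-\big)}{\mathcal{B}\big(B \to K^{(*)}e^+e^-\big)},
\qquad
R_{D^{(*)}} \;=\; \frac{\mathcal{B}\big(B \to D^{(*)}\tau\nu\big)}{\mathcal{B}\big(B \to D^{(*)}\ell\nu\big)}, \quad \ell = e,\ \mu .
```

In the Standard Model R(K) and R(K*) are predicted to be very close to unity, while R(D) and R(D*) differ from unity only through calculable phase-space effects of the τ mass, making both sets of ratios theoretically clean null tests of lepton-flavour universality.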

D-meson decays were also discussed extensively as a promising new playground for discovering new physics, thanks to the wealth of new data available and to the efficiency of the GIM mechanism for the charm quark, with SU(3) flavour symmetry leading to easily verifiable null tests of the Standard Model.

Results of various rare-decay and new-resonance searches were presented by the LHC experiments, for example the ambitious search for the extremely rare decay of the D meson into two muons, the observation by the CMS experiment of the decay of the η meson to four muons, and the search for states decaying to di-charmonium, J/ψJ/ψ or J/ψψ(2S), in four muons, which could correspond to fully charmed tetraquark states.

Leaving no stone unturned, the LHC experiments have presented a whole host of new results of searches for new phenomena beyond the Standard Model

A highlight of the conference was the strong contribution from the Belle II experiment in all areas of heavy-flavour physics. This included several measurements of b→s transitions, among them a fully inclusive measurement; several time-dependent CP-violation observables, which yield precisions on the CKM parameter sin(2β) on a par with the current world's best measurements in those channels; and new input to the |Vub| and |Vcb| puzzle (the tension between exclusive and inclusive measurements, which suffer from different theoretical uncertainties), with an exclusive measurement in the golden B→πlν mode and a measurement of the B→D*lν decay.


LHCb presented nice new results on the b→ss̄s transition in the φφ channel, showing no CP-violating effect, with results separated into the different polarisation modes. LHCb also presented a new measurement of the CKM angle γ in the B±→D[K∓π±π∓π±]h± (h = π, K) channel and an overall combination yielding a precision of approximately 3.7°.

Finally, a status report was given by the KOTO experiment, which is searching for the extremely rare KL→π⁰νν̄ process. The first two runs (from 2015 to 2018) allowed the collaboration to identify two new backgrounds and to put methods in place to mitigate them since 2019. With these improvements the KOTO experiment should reach sensitivities at the 10⁻¹⁰ level, close to the expected Standard Model branching fraction of 3×10⁻¹¹. All measurements shown so far are compatible with the CKM paradigm.

Also in the quark sector, the latest measurements of the neutron electric dipole moment, and the prospects for improving them, were presented, providing strong constraints on new-physics scenarios at high energy scales.

Lattice-QCD studies have made remarkable progress in recent years, with the hadronic contributions to the muon g-2 coming under control. This is more so for the light-by-light contribution, which agrees well with other determinations, and less so for the hadronic vacuum polarisation, where the errors are being driven down by the BMW collaboration, whose result taken by itself would bring the theory prediction into better consistency with the FNAL and BNL measurements. However, the BMW result is not yet fully confirmed, either by other lattice groups or by the R-ratio from experiment, with the recent VEPP data being out of line with previous experiments.

Higher precision from lattice calculations has also given rise to the so-called Cabibbo anomaly reported at Moriond, whereby the unitarity of the first row of the CKM matrix appears to be violated at the 2.7σ level. If confirmed by future experiments and lattice calculations, this could be a signal of new physics.
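The quantity being tested is the first-row unitarity sum (a standard definition, spelled out here for clarity):

```latex
\Delta_{\mathrm{CKM}} \;\equiv\; |V_{ud}|^2 + |V_{us}|^2 + |V_{ub}|^2 \;-\; 1 \;=\; 0 \quad \text{(Standard Model)},
```

where |Vud| is dominated by superallowed nuclear β decays and |Vus| by kaon decays (with lattice-QCD input for the form factors), while |Vub|² is numerically negligible. Current determinations give a small negative ΔCKM, of order 10⁻³, corresponding to the quoted 2.7σ tension.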

In the lepton-flavour sector, Belle II presented its first tau-mass measurement, already the world's most precise, in agreement with previous measurements. With only approximately half the luminosity accumulated by the Belle experiment, Belle II presented measurements surpassing the Belle precision, demonstrating the excellent performance of the experiment.

Dark searches

A variety of dark-matter candidates were discussed, including: primordial black holes, with improved limits using 21 cm hydrogen astronomy; weakly interacting massive particles (WIMPs) from new electroweak fermion multiplets with heavier masses; heavy singlet dilaton-like scalars; keV neutrinos from an inverse seesaw model; axions or axion-like particles, with an extended window of masses arising from non-standard cosmology; and ultralight dark matter such as dark photons, whose interactions with the detector could be simulated with the software package DarkELF. An interesting proposal for axion detectors that can double up as high-frequency gravitational-wave detectors was also discussed.

A flurry of results of searches for dark-sector particles at the LHC, Belle II, BaBar, NA62, BES and PADME were shown.

The XENONnT collaboration presented new results, unblinded for the occasion, with an exposure of 95.1 days corresponding to 1.1 tonne-year. LZ also presented their latest results with a similar exposure. The two experiments, along with the PandaX xenon-based experiment, are now exploring new territory at low WIMP-nucleon cross sections.

These very low cross sections motivate further searches for the existence of a dark sector with dark photons or axion-like particles. A flurry of results of searches for dark-sector particles at the LHC, Belle II, BaBar, NA62, BES and PADME were shown. PADME, a fixed-target e⁺ experiment, also presented its ability to directly probe an anomaly that has also been seen in ¹²C and ⁴He.

Theories of new heavy particles were also discussed, ranging from an analysis of the minimal supersymmetric Standard Model, which showed that gluinos of 1 TeV and stop squarks of 500 GeV could still have escaped detection, to two-Higgs-doublet models plus a Higgs singlet, which might be responsible for the 95 GeV diphoton excess, to the observation that vector-like fermions (which come in opposite-chirality pairs) have the right properties to avoid a metastable universe.

Electroweak searches at the LHC

The LHC experiments presented results from a host of searches for new phenomena beyond the Standard Model, leaving no stone unturned. These searches looked for signatures of models motivated by theories addressing the shortcomings of the Standard Model, by astrophysical and cosmological observations, such as dark matter that could be interpreted as the existence of a fundamental field, and by observed experimental anomalies, such as those in lepton flavour or the muon g-2. These searches place very important limits on the presence of new phenomena up to the few-TeV scale. With 20 times more data, the High-Luminosity LHC (HL-LHC) will significantly enlarge the search domain and bring potential for discoveries.

The LHC experiments also presented a series of new results based on W and Z production, coinciding nicely with the 40th anniversary of the W and Z boson discoveries at the CERN SppS. The CMS collaboration showed a measurement of the τ polarisation with a precision of approximately 10%, which can be translated directly into a measurement of the weak mixing angle close to the precision reached by e⁺e⁻ experiments. The CMS collaboration also presented a measurement of the invisible width of the Z boson that is more precise than the direct invisible-width measurements performed at LEP. ATLAS showed a precise measurement of the Z-boson transverse-momentum differential cross section, integrated over the full phase space of the leptons produced in the Z decay, and with it provided the current most precise measurement of αs, with a precision comparable to the current world average or to estimates using lattice QCD. ATLAS also presented a new measurement of the W-boson mass using a reanalysis of 7 TeV data collected in 2011, yielding a value slightly lower (by 10 MeV) than before, with the precision improved to 16 MeV, thus increasing the experimental tension with the recently published CDF measurement.

The LHC results have already obtained precision and sensitivity to processes that were thought to be unreachable prior to the start of operations.

ATLAS and CMS also showed results for more complex and rare processes, equally highlighting the remarkable progress made at the precision frontier. Both experiments showed an observation of the four-top-quark production process, and ATLAS presented the observation of two new tri-boson production processes, WZγ and Wγγ. ATLAS also presented a new measurement of the production of a W boson in association with a pair of top quarks, a key background to numerous very important processes, for instance the associated production of a Higgs boson with a pair of top quarks.

The results presented at this year's Moriond electroweak session show how the LHC experiments have already reached precision and sensitivity to processes that were thought to be unreachable before the start of operations. An outstanding example discussed in detail was the progress made in the search for di-Higgs production by ATLAS and CMS, a cornerstone of the HL-LHC physics programme to constrain the Higgs-boson trilinear self-coupling. These results showed that, combined, the experiments should reach the sensitivity required for observation of this process at the LHC. Another example discussed was the race to reach sensitivity to Higgs-boson decays to charm quarks, where new methods based on deep-learning techniques are making significant progress.

To further improve on the expected precision reach at the HL-LHC, intermediate goals at Run 3 are extremely important. Both ATLAS and CMS presented new results on measurements of Z boson, top, and Higgs boson production with LHC Run 3 data taken in 2022.

This year's Moriond conference showed an extraordinary harvest of new results, giving an opportunity to take stock of the open questions and see the remarkable progress made since last year.

LHCb sees evidence for a new tetraquark state

LHCb figure 1

Half a century since its inception, quantum chromodynamics (QCD) continues to prove itself as the correct description of the strong interaction between quarks and gluons. At low energies, however, perturbative calculations in QCD are not possible. Therefore, understanding the properties of hadrons usually requires the development of phenomenological models.

The study of exotic hadrons, made up of more than three quarks, offers a powerful way to gain a deeper understanding of the non-perturbative behaviour of QCD. The LHC has so far discovered no fewer than 23 new exotic hadrons, most of which were first observed by the LHCb experiment. In March 2021 the LHCb collaboration reported the observation of two tetraquarks with cc̄us̄ quark content – named Tθψs1(4000)+ and Tθψs1(4220)+ – in the decay B+→J/ψφK+. Now, based on a study of the isospin-symmetry-related decay B0→J/ψφK0S using a sample of about 2000 candidate events, the collaboration has found evidence for a new tetraquark state, Tθψs1(4000)0, with a minimal quark content of cc̄ds̄. The name of the new state follows a convention introduced by LHCb in 2022 to help simplify the exotic-hadron vista.

The Tθψs1(4000)0 state was found as a resonance in the J/ψK0S mass spectrum through an amplitude analysis, and is characterised as a horizontal band in the Dalitz plot (figure 1). Imposing isospin symmetry for all intermediate states except for the Tθψs1(4000)+/0 in the two B-meson decays, the signal significance is measured to be 4.0σ. The mass and width are found to be equal to those of the Tθψs1(4000)+ within uncertainties, consistent with the new state being the isospin partner of the Tθψs1(4000)+. If isospin symmetry between the two states is further imposed, the significance of the Tθψs1(4000)0 increases to 5.4σ.

The Tθψs1(4000)+ and Tθψs1(4000)0 states are not the only pair of isospin partners among hidden-charm tetraquark candidates with strangeness. Recently the BESIII collaboration reported signals of the Tψs(3985)+ and Tψs(3985)0 states, with minimal quark contents of cc̄us̄ and cc̄ds̄, respectively. Although the tetraquark candidates seen by BESIII and LHCb have similar masses, the natural widths measured by the two experiments are significantly different, indicating that they are distinct states.

Further studies and theoretical inputs are needed to determine the inner structure of such hidden-charm tetraquark candidates, for example whether they are compact tetraquarks, hadron molecules or produced due to kinematic effects. Despite continuous efforts, the detailed mechanisms responsible for binding multi-quark states have remained mysterious. With the start of LHC Run 3 and a new upgraded detector, the LHCb collaboration can look forward to finding further exotic states that shed light on the low-energy behaviour of QCD relevant to hadronic matter.

Strong coupling probed beyond the TeV scale

ATLAS figure 1

Due to asymptotic freedom, the theory of quantum chromodynamics (QCD), which describes the strong interaction acting on quarks and gluons, becomes weaker at short-distance (high-energy) scales. The value of the strong coupling constant αs at different scales can be determined using experimental observables that characterise the geometrical distribution of the outgoing hadrons from particle collisions. Among such observables are energy–energy correlations, which are obtained by computing the angular difference between all the possible pairs of final-state particles, weighted by the product of the normalised energies of the two particles involved. 

Precision tests

Energy–energy correlations had a significant impact on early precision tests of QCD, such as those performed at the PETRA e⁺e⁻ collider at DESY, where the gluon was discovered, and on the determination of αs. At hadron colliders such as the LHC, where the longitudinal momentum of the colliding partons is unknown, longitudinally invariant quantities are defined instead. The transverse energy–energy correlation (TEEC) function is defined as the transverse-energy-weighted azimuthal angular distribution of jet pairs, covering the full cos(φ) range, where φ is the azimuthal angle between the two jets. The TEEC distribution peaks at the edges of the cos(φ) range, due to self-correlations (the angle between a jet and itself) and to back-to-back di-jet configurations arising from momentum conservation between the two jets in the transverse plane. Moreover, additional gluon radiation produces a central plateau whose height is sensitive to the value of αs.
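In formulae, the description above corresponds to a distribution of the following form (a standard definition; normalisation conventions vary slightly between analyses):

```latex
\frac{1}{\sigma}\,\frac{d\Sigma}{d\cos\varphi}
\;\equiv\;
\frac{1}{N}\sum_{\text{events}}\;\sum_{i,j}
\frac{E_{T,i}\,E_{T,j}}{\big(\sum_{k} E_{T,k}\big)^{2}}\;
\delta\!\left(\cos\varphi-\cos\varphi_{ij}\right),
```

where the sums over i, j and k run over the jets in each event, E_T denotes jet transverse energy and φ_ij is the azimuthal angle between jets i and j. The self-correlation terms with i = j produce the peak at cos φ = +1, while back-to-back jet pairs populate cos φ = −1.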

ATLAS has recently measured the TEEC distributions in multi-jet events and used them to determine αs. The analysis is performed using the full data sample (139 fb–1) of proton–proton collisions at a centre-of-mass energy of 13 TeV recorded during LHC Run 2. The measurements are presented in bins of the scalar sum of the transverse momenta of the two leading jets in the collision event (HT2), and are corrected for detector effects. By fitting theoretical calculations to the experimental data, the value of αs is determined in different kinematic regions, thereby testing the running of the strong coupling strength at high energy scales.

The results are 30% more precise than the previous ATLAS measurements at 7 and 8 TeV, with a total systematic uncertainty of the order of 2.5% for the TEEC functions. The level of precision of the theoretical predictions is as important for the extraction of αs as that of the data. Predictions as a function of the value of αs(mZ), where mZ is the Z-boson mass, are calculated at next-to-next-to-leading order (NNLO) in perturbative QCD for three-jet configurations and corrected for non-perturbative effects, reducing the theoretical uncertainties in the central plateau from 6% at next-to-leading order (NLO) down to 2% at NNLO.

ATLAS figure 2

The ratio of data to theoretical prediction for the TEEC functions in the various HT2 bins is shown in figure 1. The overall description of the shape at NNLO is found to be in agreement with the data, thus confirming our understanding of strong interactions over a large range of momentum transfers. Values of αs are determined from individual fits of the theoretical predictions to data in each HT2 bin. In addition, a global fit to the measured TEEC distributions results in αs(mZ) = 0.1175 ± 0.0001 (stat.) ± 0.0006 (syst.) +0.0034 –0.0017 (theo.), where the theoretical uncertainty is dominated by the scale uncertainties that estimate the contribution of higher-order corrections. All extracted values of αs agree with the recent world average. 

This result represents the first αs determination with NNLO accuracy in three-jet production. Figure 2 compares the fitted values of αs with previous determinations from ATLAS and other experiments as a function of the momentum scale. The measurements clearly exhibit asymptotic freedom, and a significant improvement in precision compared to previous measurements.
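The asymptotic freedom visible in figure 2 can be illustrated with the leading-order renormalisation-group running of the coupling. The sketch below is a textbook one-loop approximation, not the NNLO evolution used in the analysis; the starting value 0.118 is the approximate world average of αs(mZ).

```python
import math

def alpha_s_one_loop(q_gev, alpha_s_mz=0.118, mz_gev=91.19, n_flavours=5):
    """One-loop running of the strong coupling from the Z mass to scale Q.

    Uses alpha_s(Q^2) = alpha_s(mZ^2) / (1 + b0 * alpha_s(mZ^2) * ln(Q^2/mZ^2))
    with b0 = (33 - 2*nf) / (12*pi).
    """
    b0 = (33 - 2 * n_flavours) / (12 * math.pi)
    return alpha_s_mz / (1 + b0 * alpha_s_mz * math.log(q_gev**2 / mz_gev**2))

# The coupling decreases towards higher scales (asymptotic freedom):
for q in (91.19, 250, 1000, 3000):
    print(f"alpha_s({q:7.2f} GeV) = {alpha_s_one_loop(q):.4f}")
```

Even at this crude order, the coupling falls to roughly 0.09 around 1 TeV, in line with the downward trend of the fitted values shown in figure 2.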

Multi-strange production constrains hadronisation

ALICE figure 1

One of the fundamental questions in quantum chromodynamics is how hadrons are produced in high-energy collisions. More specifically: what determines the relative production rates of baryons, which contain three valence quarks, and mesons, which consist of a quark and an antiquark?

In electron–positron, electron–proton and proton–proton collisions, where a small number of particles is produced, the hadronisation process is expected to be universal: it does not depend on the type of colliding particles, but only on the parent quark or gluon. It has been found, however, that in proton–proton and proton–lead collisions at LHC energies the baryon-to-meson ratios p/π, Λ/KS0 and Λc/D0 are significantly larger than in e⁺e⁻ and ep collisions. This enhancement at intermediate transverse momentum pT (1–5 GeV) in small systems is qualitatively similar to that observed in heavy-ion collisions, e.g. between lead ions. In heavy-ion collisions, however, this behaviour has been related to the interplay between hadronisation and the creation and expansion of a hot and dense quark–gluon plasma (QGP), via so-called quark recombination involving partons from within the QGP.

To shed light on hadron-production mechanisms in collisions at the LHC, ALICE has performed a novel study of the production of strange and multi-strange hadrons inside and outside energetic jets in proton–proton and proton–lead collisions. The method separates particles produced in association with a hard scattering process (within jets) from those produced in soft processes with low momentum transfer that dominate the underlying event.

The results show that the enhancement of the strange baryon-to-meson ratio Λ/KS0 seen in the inclusive measurement is absent within the jet cone and restricted to soft particle-production processes outside the jet cone (figure 1, left). On the other hand, the multi-strange to single-strange hyperon yield ratio Ω±/Λ (figure 1, right) shows a similar pT dependence in the interval 2 < pT < 5 GeV inside and outside the jet cone, while different behaviour is observed for the Ξ±/Λ ratio (not shown). This suggests that the baryon-production mechanism also depends on the strangeness content of the baryons: the production of hyperons in jets becomes similar to that in the underlying event for baryons with large strangeness content.

These measurements will help to improve the modelling of particle production at the LHC

These measurements provide new input to understand the relative contribution of soft and hard processes to multi-strange hadron production, and thus will help to improve the modelling of particle production at the LHC. To illustrate how, two calculations with different hadronisation models are shown in the figure: one model includes string formation beyond leading colour (CR-BLC, blue band), while the other includes colour ropes (red dashed line). While one of the models describes the Λ/KS0 ratio inside jets, the other shows a better agreement with the Ω±/Λ ratio.

ALICE has also investigated the multiplicity dependence of strange baryon-to-meson and baryon-to-baryon ratios in jets in proton–lead collisions and compared it with those in proton–proton collisions. Within the current experimental precision, no difference between the two collision systems, nor a dependence on the event multiplicity, is observed. These studies will further benefit from the increased precision that will be achieved with the substantially larger data samples from ongoing LHC runs.

A celebration of physics in the Balkans

The 11th General Conference of the Balkan Physical Union (BPU11 Congress) took place from 28 August to 1 September 2022 in Belgrade, with the Serbian Academy of Science and Arts as the main host. Initiated in 1991 in Thessaloniki, Greece, and open to participants globally, the series provides a platform for reviewing, disseminating and discussing novel research results in physics and related fields. 

The scientific scope of BPU11 covered the full landscape of physics via 139 lectures (12 plenary and 23 invited) and 150 poster presentations. A novel addition was five roundtables dedicated to high-energy physics (HEP), widening participation, careers in physics, quantum and new technologies, and models of studying physics in European universities with a focus on Balkan countries. The hybrid event attracted 476 participants (325 on site) from 31 countries, 159 of whom were students, and demonstrated the high level of research conducted in the Balkan states.

Roadmaps to the future

The first roundtable, "HEP – roadmaps to the future", showed the strong collaboration between CERN and the Balkan states. Four of the 23 CERN Member States are from the region (Bulgaria, Greece, Romania and Serbia); two of the three Associate Member States in the pre-stage to membership are Cyprus and Slovenia; and two of the seven Associate Member States are Croatia and Turkey. A further four countries have cooperation agreements with CERN, and more than 400 CERN users come from the Balkans.

Kicking off the HEP roundtable discussions, CERN director for research and computing Joachim Mnich presented the recently launched accelerator and detector R&D roadmaps in Europe. Paris Sphicas (CERN and the University of Athens) reported on the future of particle-physics research, during which he underlined the current challenges and opportunities. These included: dark matter (for example the search for WIMPs in the thermal parameter region, the need to check simplified models such as axial-vector and di-lepton resonances, and indirect searches); supersymmetry (the search for “holes” in the low-mass region that will exist even after the LHC); neutrinos (whether neutrinos are Majorana or Dirac particles, their mass measurement and exploration of a possible “sterile” sector); as well as a comprehensive review of the Higgs sector. 

CERN’s Emmanuel Tsesmelis, who was awarded the Balkan Physical Union charter and honorary membership in recognition of his contributions to cooperation between the Balkan states and CERN, reflected on the proposed Future Circular Collider (FCC). Describing the status of the FCC feasibility study, due to be completed by the end of 2025, he stressed that the success of the project relies on strong global participation. His presentation initiated a substantial discussion about the role of the Balkan countries, which will be continued in May 2023 at the 11th LHCP conference in Belgrade.

The roundtable devoted to quantum technologies (QTs), chaired by Enrique Sanchez of the European Physical Society (EPS), was another highlight with strong relevance to HEP. Various perspectives on the different QT sectors – computing and simulation, communication, metrology and sensing – were discussed, touching upon the impact they could have on society at large. Europe plays a leading role in quantum research, concluded the panel. However, despite increased interest in QTs, including at CERN, issues such as how to obtain appropriate funding to enhance European technological leadership, remain. Discussions highlighted the opportunities for new generations of physicists from the Balkans to help build this “second quantum revolution”. 

In addition to the roundtables, four high-level scientific satellite events took place, attracting a further 150 on-site participants: the COST Workshop on Theoretical Aspects of Quantum Gravity; the SEENET–MTP Assessment Meeting and Workshop; the COST School on Quantum Gravity Phenomenology in the Multi-Messenger Approach; and the CERN–SEENET–MTP–ICTP PhD School on Gravitation, Cosmology and Astroparticle Physics. The latter is part of a unique regional programme in HEP initiated by SEENET–MTP (Southeastern European Network in Mathematical and Theoretical Physics) and CERN in 2015, and joined by the ICTP in 2018, which has contributed to the training of more than 200 students in 12 SEENET countries. 

The BPU11 Congress, the largest event of its type in the region since the beginning of the COVID-19 pandemic, contributed to closer cooperation between the Balkan countries and CERN, ICTP, SISSA, the Central European Initiative and others. It was made possible by the support of the EPS, ICTP, CEI-Trieste, CERN and EPJ, as well as the Serbian ministry of science and institutions active in physics and mathematics in Serbia. In addition to the BPU11 PoS Proceedings, several articles based on invited lectures will be published in a focus issue of EPJ Plus, “On Physics in the Balkans: Perspectives and Challenges”, as well as in a special issue of IJMPA.

Unconventional music @ CERN

Honouring the 100th anniversary of Einstein’s Nobel prize, the Swedish embassy in Bern collaborated with CERN for an event connecting science and music, held at the CERN Globe of Science and Innovation on 19 October. The event was originally planned for 2021 but was postponed due to the pandemic.

Brian Foster (University of Oxford) talked about Einstein’s love of music and of playing the violin, illustrated with many photos showing Einstein with some of the well-known violinists of the time. Around the period Einstein was awarded the Nobel prize, Russian engineer Lev Termen invented the theremin, an instrument consisting of two antennae that is played without physical contact. This caught Einstein’s attention, and it is said that he even played the theremin himself once.

Delving further into the unconventional, LHC physicists performed Domenico Vicinanza’s (GEANT and Anglia Ruskin University) “Sonification of the LHC”, for which the physicist-turned-composer mapped data recorded by the LHC experiments between 2010 and 2013 into music. First performed in 2014 on the occasion of CERN’s 60th anniversary, Vicinanza’s piece is intended as a metaphor for scientific cooperation, in which different voices and perspectives can reach the same goal only by playing together.

There followed the debut of an even more unconventional piece of music by The Stone Martens – a Swiss and Swedish “noise collaboration” improvised by Henrik Rylander and Roland Bucher. By sending the output of his theremin through guitar-effects pedals, Rylander created a unique sound. Together with Bucher’s self-made “noise table”, with which he sampled acoustic instruments and everyday objects, the duo created a captivating, otherworldly sound collage that was well received by the 160-strong audience. The event closed with an unconventional Bach concerto for two violins in which these unique sounds were fused with traditional instruments. Anyone interested in experiencing the music for themselves can find a recorded version at https://indico.cern.ch/event/1199556/.

A powerful eye opener into the world of AI

The appearance of the word “for” rather than “in” in the title of this collection raises the bar from an academic description to a primer. It is neither the book’s length (more than 800 pages), nor the fact that the author list resembles a who’s who in artificial intelligence (AI) research carried out in high-energy physics that makes this book live up to its premise; it is the careful crafting of its content and structure.

Artificial intelligence is not new to our field. On the contrary, some of its concepts and algorithms were pioneered in high-energy physics. Artificial Intelligence for High Energy Physics credits this heritage while also reaching into very recent AI research. It covers topics ranging from unsupervised machine-learning techniques in clustering to workhorse tools such as boosted decision trees in analyses, and from recent applications of AI in event reconstruction to simulations at the boundary where AI can help us to understand physics.

Each chapter follows a similar structure: after setting the broader context, a short theoretical introduction to the tools (and, where possible, the available software) is given, which is then applied and adapted to a high-energy physics problem. The ratio of in-depth theoretical background to AI concepts and the focus on applications is well balanced, and underlines the work of the editors, who avoided duplication and cross-referenced individual chapters and topics. The editors and authors have not only created a selection of high-quality review articles, but a coherent and remarkably good read. The takeaway messages in the chapter on distributed training and optimisation stand out, and one might wish that this concept had found more resonance throughout the book.

Artificial Intelligence for High Energy Physics

Sometimes, the book can be used as a glossary, which helps to bridge the gaps that seem to exist simply because high-energy physicists and data scientists use different names for similar or even identical things. While the book can certainly be used as a guide for a physicist in AI, an AI researcher with the necessary physics knowledge may not be served quite so well.

In an ideal world, each chapter would have a reference dataset to allow the reader to follow the stated problems and learn through building and exercising the described pipelines. This, however, would turn the book from a primer into a textbook for AI in high-energy physics. To be fair, wherever possible the authors of the chapters have used and referred to publicly available datasets, and one chapter is devoted to the issue of arranging a community data competition, such as the TrackML challenge in 2018.

As for the most important question – have I learned something new? – the answer is a resounding “yes”. While none of the broad topics and their application to high-energy physics will come as a surprise to those who have been following the field in recent years, there are neat projects and detailed applications showcased in this book. Furthermore, reading about a familiar topic in someone else’s words can be a powerful eye opener.

Innovation on show for future ep/eA colliders

Following the publication of an updated conceptual design report in 2021, CERN continues to support studies for the proposed electron–hadron colliders LHeC and FCC-eh as potential options for the future, and to provide input to the next update of the European strategy for particle physics, with emphasis on FCC. LHeC would require the LHC to be modified, while FCC-eh is a possible operational mode of the proposed Future Circular Collider at CERN. A key factor in studies for a possible future “ep/eA” collider is power consumption, for which researchers around the world are exploring the use of energy recovery linacs at the high-energy frontier.

The ep/eA programme finds itself at a crossroads between nuclear and particle physics, with synergies with astroparticle physics. It has the potential to empower the High-Luminosity LHC (HL-LHC) physics programme in a unique way, and allows for a deeper exploration of the electroweak and strong sectors of the Standard Model beyond what can be achieved with proton–proton collisions alone. In many cases, adding LHeC to HL-LHC data can significantly improve the precision of Higgs-boson measurements – similar to the improvements expected when moving from the LHC to HL-LHC.

The innovative spirit of the ep/eA community was demonstrated during the workshop “Electrons for the LHC – LHeC/FCCeh and PERLE” held at IJCLab from 26 to 28 October. As the ep/eA community moves from the former HERA facility and from the Electron-Ion Collider, currently under construction at Brookhaven, to higher energies at LHeC and FCC-eh, the threshold will be reached to study electroweak, top and Higgs physics in deep-inelastic scattering (DIS) processes for the first time. In addition, these programmes enable the exploration of the low Bjorken-x frontier orders of magnitude beyond current DIS results. At this stage, it is unclear what physics will be unlocked if hadronic matter is broken into even smaller pieces. In recent years, particle physicists have learned that the ultimate precision for Higgs-boson physics lies in the complementarity of e+e–, pp and ep collisions, as embedded in the FCC programme, for example. Exploiting this complementarity is key to exploring new territories via the Higgs and dark sectors, as well as zooming in on potential anomalies in our data.

The October workshop underlined the advantage of a joint ep/pp/eA/AA/pA interaction experiment, and the need to further document its added scientific value. For example, a precision of 1 MeV on the W-boson mass could be within reach, because ep data allow constraints to be placed on the most important systematic uncertainty affecting the W-boson mass measurement with pp data.

Reduced power 

Participants also addressed how to reduce the power consumption of LHeC and FCC-eh. PERLE, an expanding international collaboration building a multi-turn demonstrator facility for high-beam-current energy-recovery linacs (ERLs) at IJCLab in Orsay, is ready to become Europe’s leading centre for developing and testing sustainable accelerating systems.

At this stage, it is unclear what physics will be unlocked if hadronic matter is broken into even smaller pieces

As demonstrated at the workshop, with additional R&D on ERLs and ep colliders we might be able to further reduce the power consumption of the LHeC (and FCC-eh) to as low as 50 MW. These values are to be compared with the GW-scale power consumption if there were no energy recovery, and therefore provide a power-economic avenue to extend the Higgs precision frontier beyond the HL-LHC. ERLs are not uniquely applicable to ep/eA colliders, but have been discussed for future linear and circular e+e– colliders too. With PERLE and other sustainable accelerating systems, the ep/eA programme has the ambition to deliver a demonstration of ERL technology at high beam current, potentially towards options for an ERL-based Higgs factory.
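The power arithmetic behind the GW-versus-50 MW comparison can be sketched in a few lines. The beam current, energy and recovery fraction below are illustrative assumptions chosen only to reproduce that scale, not LHeC design parameters:

```python
# Back-of-the-envelope sketch of why energy recovery matters for an ep collider.
# All numbers here are illustrative assumptions, not LHeC design values.

def beam_power_mw(current_ma: float, energy_gev: float) -> float:
    """Beam power in MW: since mA x GV = MW, P [MW] = I [mA] * E [GeV]."""
    return current_ma * energy_gev

def grid_power_mw(current_ma: float, energy_gev: float,
                  recovery_fraction: float) -> float:
    """RF power drawn from the grid if a fraction of the beam energy is recovered."""
    return beam_power_mw(current_ma, energy_gev) * (1.0 - recovery_fraction)

current_ma, energy_gev = 20.0, 50.0  # hypothetical electron beam
print(beam_power_mw(current_ma, energy_gev))        # 1000 MW: GW scale without recovery
print(grid_power_mw(current_ma, energy_gev, 0.95))  # ~50 MW with 95% recovery
```

The point of the exercise is that the grid power scales with the unrecovered fraction of the beam power, which is why a high recovery fraction turns a GW-scale demand into tens of MW.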

Workshop participants are engaged to further develop an ep/eA programme with the ability to significantly enrich this overall strategy with a view to finding cracks in the Standard Model and/or finding new phenomena that further our understanding of nature at the smallest and largest scales.

Lost in the landscape

What is string theory?

I take a view that a lot of my colleagues will not be too happy with. String theory is a very precise mathematical structure, so precise that many mathematicians have won Fields medals by making contributions that were string-theory motivated. It’s supersymmetric. It exists in flat or anti-de Sitter space (that is, a space–time with a negative curvature in the absence of matter or energy). And although we may not understand it fully at present, there does appear to be an exact mathematical structure there. I call that string theory with a capital “S”, and I can tell you with 100% confidence that we don’t live in that world. And then there’s string theory with a small “s” – you might call it string-inspired theory, or think of it as expanding the boundaries of this very precise theory in ways that we don’t know how to at present. We don’t know with any precision how to expand the boundaries into non-supersymmetric string theory or de Sitter space, for example, so we make guesses. The string landscape is one such guess. It’s not based on absolutely precise capital-S string theory, but on some conjectures about what this expanded small-s string theory might be. I guess my prejudice is that some expanded version of string theory is probably the right theory to describe particle physics. But it’s an expanded version, it’s not supersymmetric. Everything we do in anti-de-Sitter-space string theory is based on the assumption of absolute perfect supersymmetry. Without that, the models we investigate are rather speculative. 

How has the lack of supersymmetric discoveries at the LHC impacted your thinking?

All of the string theories we know about with any precision are exactly supersymmetric. So if supersymmetry is broken at the weak scale or beyond, it doesn’t help, because we’re still facing a world that is not exactly supersymmetric. This only gets worse as we find out that supersymmetry doesn’t even seem to govern the world at the weak scale. It doesn’t even seem to govern it at the TeV scale. But that, I think, is secondary. The primary fact is that the world is not exactly supersymmetric and string theory with a capital S is. So where are we? Who knows! But it’s exciting to be in a situation where there is confusion. Anything that can be said about how string theory can be precisely expanded beyond the supersymmetric bounds would be very interesting.

What led you to coin the string theory “landscape” in 2003? 

A variety of things, among them the work of other people, in particular Polchinski and Bousso, who conjectured that string theories have a huge number of solutions and possible behaviours. This was a consequence, later articulated in a 2003 paper abbreviated “KKLT” after its authors, of the innumerable (initial estimates put it at more than 10^500) different ways the additional dimensions of string theory can be hidden or “compactified”. Each solution has different properties, coupling constants, particle spectra and so forth. And they describe different kinds of universes. This was something of a shock and a surprise; not that string theory has many solutions, but that the numbers of these possibilities could be so enormous, and that among those possibilities were worlds with parameters, in particular the cosmological constant, which formed a discretuum as opposed to a continuum. From one point of view that’s troubling because some of us, me less than others, had hoped there was some kind of uniqueness to the solutions of string theory. Maybe there was a small number of solutions and among them we would find the world that we live in, but instead we found this huge number of possibilities in which almost anything could be found. On the other hand, we knew that the parameters of our world are unusual, exceptional, fine-tuned – not generic, but very special. And if the string landscape could say that there would be solutions containing the peculiar numbers that we face in physics, that was interesting. Another motivation came from cosmology: we knew on the basis of cosmic-microwave-background experiments and other things that the portion of the universe we see is very flat, implying that it is only a small part of the total.
Together with the peculiar fine-tunings of the numbers in physics, it all fitted a pattern: the spectrum of possibilities would not only be large, but the spectrum of things we could find in the much bigger universe that would be implied by inflation and the flatness of the universe might just include all of these various possibilities. 

So that’s how anthropic reasoning entered the picture?

All this fits together well with the anthropic principle – the idea that the patterns of coupling constants and particle spectra were conditioned on our own existence. Weinberg was very influential in putting forward the idea that the anthropic principle might explain a lot of things. But at that time, and probably still now, many people hated the idea. It’s a speculation or conjecture that the world works this way. The one thing I learned over the course of my career is not to underestimate the potential for surprises. Surprises will happen, patterns that look like they fit together so nicely turn out to be just an illusion. This could happen here, but at the moment I would say the best explanation for the patterns we see in cosmology and particle physics is a very diverse landscape of possibilities and an extremely large universe – a multiverse, if you like – that somehow manifests all of these possibilities in different places. Is it possible that it’s wrong? Oh yes! We might just discover that this very logical, compelling set of arguments is not technically right and we have to go in some other direction. Witten, who had negative thoughts about the anthropic idea, eventually gave up and accepted that it seems to be the best possibility. And I think that’s probably true for a lot of other people. But it can’t have the ultimate influence that a real theory with quantitative predictions can have. At present it’s a set of ideas that fit together and are somewhat compelling, but unfortunately nobody really knows how to use this in a technical way to be able to precisely confirm it. That hasn’t changed in 20 years. In the meantime, theoretical physicists have gone off in the important direction of quantum gravity and holography. 

Possible string-theory solutions

What do you mean by holography in the string-theory context?

Holography predates the idea of the landscape. It was based on Bekenstein’s observation that the entropy of a black hole is proportional to the area of the horizon and not the volume of the black hole. It conjectures that the 3D world of ordinary experience is an image of reality coded on a distant 2D surface. A few years after the holographic principle was first conjectured, two precise versions of it were discovered: so-called M(atrix) theory in 1996 and Maldacena’s “AdS/CFT” correspondence in 1997. The latter has been especially informative. It holds that there is a holographic duality between anti-de Sitter space formulated in terms of string theory, and quantum field theories that are similar to those that describe elementary particles. I don’t think string theory and holography are inconsistent with each other. String theory is a quantum theory that contains gravity, and all quantum mechanical gravity theories have to be holographic. String theory and holographic theory could well be the same thing.

Almost anything we learn will be a large fraction of what we know

One of the things that troubles me about the standard model of cosmology, with inflation and a positive cosmological constant, is that the world, or at least the portion of it that we see, is de Sitter space. We do not have a good quantum understanding of de Sitter space. If we ultimately learn that de Sitter space is impossible, that would be very interesting. We are in a situation now that is similar to 20 years ago, where very little progress has been made in the quantum foundations of cosmology and in particular in the so-called measurement problem, where we don’t know how to use these ideas quantitatively to make predictions. 

What does the measurement problem have to do with it? 

The usual methodology of physics, in particular quantum mechanics, is to imagine systems that are outside the systems we are studying. We call these systems observers, apparatuses or measuring devices, and we sort of divide the world into those measuring devices and the things we’re interested in. But it’s quite clear that in the world of cosmology/de Sitter space/eternal inflation, we’re all part of the same thing. And I think that’s partly why we are having trouble understanding the quantum mechanics of these things. In AdS/CFT, it’s perfectly logical to think about observers outside the system or observers on the boundary. But in de Sitter space there is no boundary; there’s only everything that’s inside the de Sitter space. And we don’t really understand the foundations or the methodology of how to think about a quantum world from the inside. What we’re really lacking is the kind of precise examples we have in the context of anti-de Sitter space, which we can analyse. This is something I’ve been looking for, as have many others including Witten, without much success. So that’s the downside: we don’t know very much.

What about the upsides? 

The upside is that almost anything we learn will be a large fraction of what we know. So there’s potential for great developments by simply understanding a few things about the quantum mechanics of de Sitter space. When I talk about this to some of my young friends, they say that de Sitter space is too hard. They are afraid of it. People have been burned over the years by trying to understand inflation, eternal inflation, de Sitter space, etc, so it’s much safer to work on anti-de Sitter space. My answer to that is: yes, you’re right, but it’s also true that a huge amount is known about anti-de Sitter space and it’s hard to find new things that haven’t been said before, whereas in de Sitter space the opposite is true. We will see, or at least the young people will see. I am getting to the point where it is hard to absorb new ideas.

To what extent can the “swampland” programme constrain the landscape?

The swampland is a good idea. It’s the idea that you can write down all sorts of naive semi-classical theories with practically infinite options, but that the consistency with quantum mechanics constrains the things that are possible, and those that violate the constraints are called the swampland. For example, the idea that there can’t be exact global symmetries in a quantum theory of gravity, so any theory you write down that has gravity and has a global symmetry in it, without having a corresponding gauge symmetry, will be in the swampland. The weak-gravity conjecture, which enables you to say something about the relative strengths of gauge forces and gravity acting on certain particles, is another good idea. It’s good to try to separate those things you can write down from a semi-classical point of view and those that are constrained by whatever the principles of quantum gravity are. The detailed example of the cosmological constant I am much less impressed by. The argument seems to be: let’s put a constraint on parameters in cosmology so that we can put de Sitter space in the swampland. But the world looks very much like de Sitter space, so I don’t understand the argument and I suspect people are wrong here.

What have been the most important and/or surprising physics results in your career?

I had one big negative surprise, as did much of the community. This was a while ago when the idea of “technicolour” – a dynamical way to break electroweak symmetry via new gauge interactions – turned out to be wrong. Everybody I knew was absolutely convinced that technicolour was right, and it wasn’t. I was surprised and shocked. As for positive surprises, I think it’s the whole collection of ideas called “it from qubit”. This has shown us that quantum mechanics and gravity are much more closely entangled with each other than we ever thought, and that the apparent difficulty in unifying them was because they were already unified; so to separate and then try to put them back together using the quantisation technique was wrong. Quantum mechanics and gravity are so closely related that in some sense they’re almost the same thing. I think that’s the message from the past 20 – and in particular the past 10 – years of it-from-qubit physics, which has largely been dominated by people like Maldacena and a whole group of younger physicists. This intimate connection between entanglement and spatial structure – the whole holographic and “ER equals EPR” ideas – is very bold. It has given people the ability to understand Hawking radiation, among other things, which I find extremely exciting. But as I said, and this is not always stated, in order to have real confidence in the results, it all ultimately rests on the assumption of theories that have exact supersymmetry.

What are the near-term prospects to empirically test these ideas?

One extremely interesting idea is “quantum gravity in the lab” – the idea that it is possible to construct systems, for example a large sphere of material engineered to support surface excitations that look like conformal field theory, and then to see if that system describes a bulk world with gravity. There are already signs that this is true. For example, the recent claim, involving Google, that two entangled quantum computers have been used to send information through the analogue of a wormhole shows how the methods of gravity can influence the way quantum communication is viewed. It’s a sign that quantum mechanics and gravity are not so different.

Do you have a view about which collider should follow the LHC? 

You know, I haven’t done real particle physics for a long time. Colliders fall into two categories: high-precision e+e– colliders and high-energy proton–proton ones. So the question is: do we need a precision Higgs factory at the TeV scale or do we want to search for new phenomena at higher energies? My prejudice is the latter. I’ve always been a “slam ‘em together and see what comes out” sort of physicist. Analysing high-precision data is always more clouded. But I sure wouldn’t like anyone to take my advice on this too seriously.

τ-lepton polarisation measured in Z-boson decays

CMS figure 1

Precision electroweak measurements are a powerful way to probe new physics through the indirect effects predicted by quantum field theory. The effective electroweak mixing angle θW^eff is particularly sensitive to new phenomena related to electroweak symmetry breaking and the Brout–Englert–Higgs mechanism. It was measured at LEP in different processes, and at the LHC, thanks to the large number of collected events with Z-boson decays, the experiments can probe these effects with comparable sensitivity.

The CMS collaboration has reported a new measurement of the tau-lepton polarisation in the decay of Z bosons to a pair of tau leptons in proton–proton collisions at 13 TeV. The polarisation is defined as the asymmetry between the cross sections for the production of τ leptons with positive and negative helicities, and is directly related to the electroweak mixing angle via the relation Pτ ≈ –2(1 – 4 sin²θW^eff). The polarisation of the tau lepton is determined from the angular distributions of the visible tau decay products, leptonic or hadronic, with respect to the τ flight direction or relative to each other. A so-called optimal polarisation observable is constructed using all of these angular properties of the tau decay products. Since the spin states in Z⁰ → τ⁻τ⁺ are almost 100% anti-correlated, the sensitivity is improved by combining the spin observables of both τ leptons of the pair.

CMS figure 2

The average polarisation Pτ is obtained by a template fit to the observed optimal τ-polarisation observables, using tau-lepton pairs with an invariant mass in the range 75–120 GeV. As summarised in figure 1, the best sensitivity to Pτ is found in the channel where one tau decays to a muon and the other decays hadronically, thanks to the good selection efficiency and reconstruction of the spin observable in this channel. The fully hadronic final state suffers from higher trigger thresholds, which lead to fewer events and distortions of the templates.

The average τ polarisation is corrected to its value at the Z pole, Pτ(Z⁰) = –0.144 ± 0.006 (stat.) ± 0.014 (syst.), where the systematic uncertainty is dominated by the incorrect identification of the products of hadronically decaying tau leptons. The effective weak mixing angle is then determined as sin²θW^eff = 0.2319 ± 0.0019, in agreement with the Standard Model (SM) prediction. Figure 2 compares the tau-lepton asymmetry parameter (the negative of the polarisation) with results from previous experiments, demonstrating that the CMS measurement is nearly as precise as those of single LEP experiments.
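The quoted numbers can be cross-checked against the leading-order relation between the polarisation and the mixing angle given above. A minimal sketch, assuming the linearised relation Pτ ≈ –2(1 – 4 sin²θW^eff) and ignoring the higher-order electroweak corrections included in the actual fit:

```python
# Invert the leading-order relation P_tau ~ -2 * (1 - 4 * sin2theta)
# to recover the effective mixing angle from the measured polarisation.
# Higher-order corrections, included in the actual CMS analysis, are ignored.

def sin2theta_from_polarisation(p_tau: float) -> float:
    """Leading-order sin^2(theta_W^eff) from the average tau polarisation."""
    return (p_tau + 2.0) / 8.0

p_tau_measured = -0.144  # CMS average tau polarisation at the Z pole
print(f"sin^2(theta_W^eff) ~ {sin2theta_from_polarisation(p_tau_measured):.4f}")
# ~0.2320, consistent with the quoted 0.2319 within the 0.0019 uncertainty
```

This simple inversion shows how directly the polarisation measurement translates into a mixing-angle determination, and why the relative precision on sin²θW^eff is much better than that on Pτ itself.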

This measurement shows that LHC collision events, although much more complex than those collected at LEP, can provide precise determinations of the polarisation of the τ lepton, as well as of spin correlations between τ-lepton pairs. Such measurements are crucial to probe the CP properties of the Higgs boson’s Yukawa coupling to τ leptons, which is an important step on the path to understanding the Higgs sector of the SM.
