Topics

Roger Bailey 1954–2023

It was with deep sadness that we learned that Roger Bailey, who played a key role in the operation of CERN’s accelerators, passed away on 1 June while mountain biking in Valais, Switzerland. He was 69.

Roger began his career with a doctorate in experimental particle physics from the University of Sheffield in 1979, going on to a postdoctoral position at the Rutherford Appleton Laboratory until 1983. Throughout this time, he worked on experiments at CERN’s Super Proton Synchrotron (SPS) and was based at CERN from 1977. In 1983 he joined the SPS operations group, where he was responsible for accelerator operations until 1989. Roger then moved to the Large Electron Positron collider (LEP), coordinating the team’s efforts through the commissioning phase and subsequent operation, and became operations group leader in the late 1990s.

After LEP shut down in 2000, Roger became progressively more involved in the Large Hadron Collider (LHC), planning and building the team for commissioning with beam. He then took a leading role in the LHC’s early operation, helping to push the LHC’s performance to Higgs-discovery levels before becoming director of the CERN Accelerator School, sharing his wealth of experience and inspiring new generations of accelerator physicists.

Those of us who worked with Rog invariably counted him as a friend: it made perfect sense, given his calm confidence, his kindness and his generosity of spirit. He was straightforward but never outspoken and his well-developed common sense and pragmatism were combined with a subtle and wicked deadpan sense of humour. We had a lot of fun over the years in what were amazing times for CERN. Looking back, things he said, and did, can still make us chuckle, even in the sadness of his untimely passing. Rog had a passionate, playful eye for life’s potential and he wasn’t shy. There was an adventurous spirit at work, be it in the mountains or the streets of New York, Berlin or Chicago. His specialities were tracking down music and talking amiably to anyone.

During a service to celebrate Roger’s life on 16 June, a poem of his called It’s a Wrap was read by his daughter Ellie, revealing a physicist’s philosophical view on life and the universe. Two of his favourite quotes were on the order of service: Mae West’s “You only live once, but if you do it right, once is enough” and Einstein’s “Our death is not an end if we can live on in our children and the younger generation. For they are us, our bodies are only wilted leaves on the tree of life.” Another, by Hunter S Thompson, was mentioned in a homage given by his son, Rob: “Life should not be a journey to the grave with the intention of arriving safely in a pretty and well-preserved body, but rather to skid in broadside in a cloud of smoke, thoroughly used up, totally worn out, and loudly proclaiming ‘Wow! What a Ride!’”

Way to go, Rog, way to go.

Fault-finding across sectors

Jack Heron always liked the idea of being an inventor. After completing a master’s in electronics engineering at Durham University, he spent a year in Bangalore, India, as part of the “Engineers Without Borders” programme, where he designed solar-powered poverty-alleviation solutions in unelectrified slums. This sparked an interest in renewable energy, and he completed a PhD on smart-grid techniques in 2020. With a passion for advanced technology and engineering at the peak of performance, he then joined the “digital twin” R&D programme of the international defence company Babcock, dedicated to fault prediction for defence assets on land, at sea and in the air.

“The military is extremely interested in autonomous vehicles,” explains Jack. “But removing the driver from, say, a fleet of tanks, increases the number of breakdowns: many maintenance checks are triggered by the driver noticing, for example, a ‘funny noise on start-up’, or ‘a smell of oil in the cabin’.” Jack worked on trying to replicate this intuition by using very early signs in sensor signals. Such a capability permits high confidence in mission success, he adds. “It also ensures that during a mission, if circumstances change, dynamic asset information is available for reconfiguration.”

Working in defence was “exciting and fast-paced” and enabled Jack to see his research put to practical use – he got to drive a tank and attend firing tests on a naval frigate. “It’s especially interesting because the world of defence is something most people don’t have visibility on. Modern warfare is constantly evolving based on technology, but also politics and current affairs, and being on the cusp of that is really fascinating.” It also left him with a wealth of transferable skills: “Defence is a high-performance world where product failure is not an option. This is hardcoded into the organisation from the bottom up.”

Back to his roots

For Jack, who grew up in Geneva, CERN always had a mythical status as the epitome of science and exploration. In 2022 he applied for a senior fellowship. “Just getting interviewed for this fellowship was a huge moment for me,” he says. “I was lucky enough to get interviewed in person, and when I arrived I got a visitor pass with the CERN-logo lanyards attached. Even if I didn’t get the job I was going to frame it, just to remember being interviewed at CERN!”

I love the idea of working on the frontiers of science and human understanding

Jack now works on the “availability challenge” for the proposed Future Circular Collider (FCC-ee). Availability is the percentage of scheduled physics days on which the machine is able to deliver beam (i.e. is not down for repair). To meet physics goals, this must be at least 80%. The LHC – the world’s largest and most complex accelerator, but still a factor of three smaller and simpler than the FCC – had an availability of 77% during Run 2. “Modern-day energy-frontier particle colliders aren’t built to the availabilities we would need to succeed with the FCC, and that’s without considering additional technical challenges,” notes Jack. His research aims to break down this problem system by system and find solutions, beginning with the radio frequency (RF).

On the back of an envelope, he says, the statistics are a concern: “The LHC has 16 superconducting RF cavities, which trip about once every five days. If we scale this up to FCC-ee numbers (136 cavities for the Z-pole energy mode and 1352 for the tt̄ threshold), this becomes problematic. Orders of magnitude greater reliability is required, and that is itself a defining technical challenge.”
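
This back-of-the-envelope scaling can be written out in a few lines. The sketch below is ours, for illustration only; it assumes trips are independent and that the per-cavity trip rate carries over unchanged from the LHC to FCC-ee.

```python
# Illustrative scaling of RF-cavity trip rates from LHC to FCC-ee numbers.
# Assumption (for illustration only): trips are independent and the
# per-cavity trip rate is the same as at the LHC.
LHC_CAVITIES = 16
LHC_TRIP_INTERVAL_DAYS = 5.0  # one trip somewhere in the machine every ~5 days

# Trips per cavity per day at the LHC
per_cavity_rate = 1.0 / (LHC_CAVITIES * LHC_TRIP_INTERVAL_DAYS)

for label, n_cavities in [("Z-pole mode (136 cavities)", 136),
                          ("ttbar threshold (1352 cavities)", 1352)]:
    interval_hours = 24.0 / (n_cavities * per_cavity_rate)
    print(f"{label}: one trip roughly every {interval_hours:.1f} hours")
```

With the quoted numbers, the naive estimate is a trip roughly every 14 hours at the Z pole and one every 1.4 hours at the tt̄ threshold, which makes the scale of the reliability challenge concrete.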

Jack’s background in defence prepared him well for this task: “Both are systems that cannot afford to fail, and therefore have extremely tight reliability requirements. One hour of downtime in the LHC is extremely costly, and the FCC will be no different.”

Mirroring what he did at Babcock, one solution could be fault prediction. Others are robotic maintenance and various hardware solutions to make the RF circuit more reliable. “Generally speaking, I love the idea of working on the frontiers of science and human understanding. I find this exploration extremely exciting, and I’m delighted to be a part of it.”

Toichiro Kinoshita 1925–2023

Toichiro Kinoshita, Goldwin Smith Professor of Physics at Cornell University, whose precise calculations of the electron and muon g-factors remain milestones in the field, passed away on 23 March in Amherst, Massachusetts, at the age of 98.

Toichiro “Tom” Kinoshita was born in Tokyo and raised in Tottori, in rural western Japan. When Tokyo Imperial University (now the University of Tokyo) reopened in 1946 after World War II, he returned to the campus. He was interested in theoretical particle physics, and asked Kunihiko Kodaira, an associate professor of mathematics and physics, to supervise him. He also attended the weekly seminars held by Sin-Itiro Tomonaga, who went on to share the Nobel Prize (with Schwinger and Feynman) for his fundamental work in quantum electrodynamics (QED). Equipped with a strong understanding of the renormalisation of QED from Tomonaga’s group, he began to investigate the renormalisability of charged vector bosons and collaborated with Yoichiro Nambu. Both were accepted as postdoctoral fellows at the Institute for Advanced Study (IAS) in Princeton in 1952. After a two-year stay at the IAS, Kinoshita became a postdoc at Columbia University, and within a year he was offered a research associate position at Cornell University, where he spent the rest of his career.

In 1966 Kinoshita visited CERN. During a tour on the first day, a “wiggle plot” posted on the wall caught his eye. An experiment to measure the anomalous magnetic moment of the muon (muon g-2) was in progress at CERN, and Kinoshita immediately dropped out of the tour and rushed to the library to look up the known QED results on charge-renormalisation constants. His earlier study of the mass singularity gave him the insight that the leading contribution of the sixth-order term for the muon g-2 could be obtained by simply multiplying already-known quantities. It was the moment he started his lifelong quest: the QED calculation of the lepton g-2 values.

After returning to Cornell, he started investigating all the sixth-order QED vertex diagrams. Due to their complexity he chose numerical means to evaluate them, his earlier studies in the 1950s on the three-body variational calculation of the helium atom helping him on his way. New rules were invented to construct pointwise subtraction terms for the ultraviolet and infrared divergences of a Feynman integral, with which renormalisation could be realised without breaking the gauge symmetry of QED. After the first result came out in 1972, he kept improving the precision of the sixth-order term. His results were used for a quarter of a century to compare theory to measurement and to determine the fine-structure constant, among other applications.

After retiring from Cornell University in 1995, Kinoshita was delighted to have no duties other than physics research, and started calculating the 10th-order QED contribution to g-2. We joined the 10th-order project in 2004, and his passion was the driving force of our collaboration. He visited RIKEN in Japan every year, and while he was in the US we received an e-mail every morning notifying us of his progress, queries and suggestions. The first complete 10th-order results were published in 2012; his final paper came out in 2019. He wished to remain active until he was 100 years old, and very nearly accomplished it.

Tom was always polite and calm, even when physics discussions heated up, and had an inner passion for the pursuit of truth that persisted throughout his life. He is survived by his three daughters and six grandchildren. His wife Masako, a well-known expert in Japanese loop braiding, passed away in 2022. Two diagrams are engraved on their tombstone: a Feynman diagram of QED at 10th order, and a loop-manipulation method. The family also established two undergraduate scholarships at Cornell in their memory.

Probing gluonic saturated matter

ALICE figure 1

To advance our understanding of gluonic saturated matter at the LHC, the ALICE collaboration has presented a new study using photon-induced interactions in ultra-peripheral collisions (UPCs). In this type of collision, one beam emits a very high-energy photon that strikes the other beam, giving rise to photon–proton, photon–nucleus and even photon–photon collisions.

While we know that the proton – and most of the visible matter of the universe – is made of quarks bound together by gluons, quantum chromodynamics (QCD) has not yet provided a complete understanding of the rich physics phenomena that occur in high-energy interactions involving hadrons. For example, it is not known how the distribution of gluons evolves at low values of Bjorken-x. The rapid increase in gluon density observed with decreasing x cannot continue forever, as it would eventually violate unitarity. At some point “gluon saturation” must set in to curb this growth.

So far, it has been challenging to experimentally establish when saturation sets in. One can expect, however, that it should occur at lower energies for heavy nuclei than for protons. Thus, the ALICE Collaboration has studied the energy dependence of UPC processes for both protons and heavy nuclei. At the same time, other physics phenomena, such as gluon shadowing originating from multi-scattering processes, can exist with similar experimental signatures. The interplay between these phenomena is still an open problem in QCD.

ALICE has presented new results on coherent J/ψ meson production in UPCs, where the photon probes the whole nucleus. The new ALICE results, based on LHC Run 1 and Run 2 data, probe a wide range of photon–nucleus collision energies, from around 10 GeV to 1000 GeV. These results confirm previous ALICE measurements, obtained at lower energies, that indicated a strong nuclear suppression when such photon–nucleus data are compared to expectations from photon–proton interactions. The present analysis employs novel methods for extracting the energy dependence, providing new information with which to test theoretical models. The present data at high energies can be described by both saturation-based and gluon-shadowing models. Coherent J/ψ meson production at low energy, in the anti-shadowing region, is not described by these models, nor can available models fully describe the energy dependence of this process over the explored energy range.

ALICE will continue to investigate these phenomena in LHC Runs 3 and 4, where high-precision measurements with larger data samples and upgraded detectors will provide more powerful tools to better understand gluonic saturated matter.

CP studies open windows on new physics

LHCb figure 1

Charge-parity (CP) violation parameters in tree-dominated b → cc̄s quark transitions are a powerful probe of physics beyond the Standard Model (SM). When B0(s) and B̄0(s) mesons decay through these transitions to the same final-state particles, interference between the mixing and decay amplitudes occurs, making these processes particularly sensitive to CP violation.

In the SM, B0(s)–B̄0(s) mixing is possible because the flavour eigenstates are not the (physical) mass eigenstates: a neutral B meson, once produced, evolves as a quantum superposition of B0(s) and B̄0(s) states. Due to this time-dependent mixing, the interference between mixing and decay amplitudes can lead to an observable time-dependent CP asymmetry in the decay rates. It was through the observation of this phenomenon in the “golden mode” B0 → J/ψ K0S that, in 2001, the BaBar and Belle collaborations reported the first unequivocal evidence for CP violation in B decays, for which Kobayashi and Maskawa were awarded the 2008 Nobel Prize in Physics.

As the 3 × 3 Cabibbo–Kobayashi–Maskawa (CKM) matrix that describes quark mixing in the SM is expected to be unitary, its complex elements obey a set of relations. These can be represented as triangles in the complex plane, all with the same area (which is a measure of the amount of CP violation in the SM). The most famous of them, the so-called unitarity triangle, has sides of roughly the same size and internal angles denoted α, β and γ. Since none of the CKM parameters is individually predicted by theory, the search for new physics relies on over-constraining them and looking for any hint of internal inconsistency. For that, precision is key.

LHCb has become a major actor in precision studies of CP violation

Having analysed the full proton–proton collision dataset at 13 TeV, and adding it to previous measurements at 7 and 8 TeV, LHCb recently brought the CP-violating parameters in B0 → J/ψ K0S and in another golden channel, B0s → J/ψ K+K−, to a new level of precision. These parameters (sin2β and φs, respectively) are predicted with high accuracy through global CKM fits and, given their clean experimental signatures, are paramount for new-physics searches. The measured time-dependent CP asymmetry of the B0 and B̄0 decay rates is shown in figure 1, with the resulting amplitude proportional to sin2β. Similarly, the update of the B0s → J/ψ K+K− analysis with the 13 TeV data resulted in the world’s most precise φs measurement. Both angles agree with SM expectations and with previous measurements.

These legacy results for sin2β and φs from the first LHC runs represent a new milestone in LHCb’s hunt for physics beyond the SM. Along with the world-leading determination of γ (with a current precision of less than four degrees) and the discovery of CP violation in charm in 2019, LHCb has fulfilled and exceeded its own goals of more than a decade ago, becoming a major actor in precision studies of CP violation. LHCb is now taking data with a brand-new detector at larger interaction rates than before, boosting the experimental sensitivity and tightening the grip around the Standard Model.

PHYSTAT systematics at BIRS

Ann Lee

The PHYSTAT series of seminars and workshops provides a unique meeting ground for physicists and statisticians. The latest in-person meeting, after being postponed due to COVID, covered the field of systematic errors (often treated as nuisance parameters), which are becoming increasingly important in particle physics as larger datasets reduce statistical errors in many analysis channels. Taking place from 23 to 28 April at the Banff International Research Station (BIRS) in the Canadian Rockies, the workshop attracted 42 delegates working not only on the LHC experiments but also on neutrino physics, cosmic-ray detectors and astrophysics.

The organisers had assigned half of the time to discussions, and that time was used. Information flowed in both directions: physicists learned about the Wasserstein distance and statisticians learned about jet energy scales. The dialogue was constructive and positive – we have moved on from the “Frequentist versus Bayesian” days and now everyone is happy to use both – and the discussions continued during coffee, dinner and hikes up the nearby snow-covered mountains. 

Our understanding of traditional problems continues to grow. The “signal plus background” problem always has new features to surprise us, unfolding continues to present challenges, and it seems we always have more to learn about simple concepts like errors and significance. There were also ideas that were new to many of us. Optimal transport and the Monge problem provide a range of tools whose use is only beginning to be appreciated, while neural networks and other machine-learning techniques can be used to help find anomalies and understand uncertainties. The similarities and differences between marginalisation and profiling require exploration, and we probably need to go beyond the asymptotic formulae more often than we do in practice.

Another “Banff challenge”, the third in a sequence, was set by Tom Junk of Fermilab. The first two had a big impact on the community and statistical practice. This time Tom provided simulated data for which contestants had to find the signal and background sizes, using samples with several systematic uncertainties – these uncertainties were unspecified, but dark hints were dropped. It’s an open competition and anyone can try for the glory of winning the challenge.

Collaborations were visibly forming during the latest PHYSTAT event, and results will be appearing in the next few months, not only in papers but in practical procedures and software that will be adopted and used in the front line of experimental research.

This and other PHYSTAT activities continue, with frequent seminars and several workshops (Zoom, in-person and hybrid) in the planning stage.

A treasure trove of LHC results

About 350 physicists attended the 11th edition of the Large Hadron Collider Physics (LHCP) conference in Belgrade, Serbia, from 22 to 26 May. The first in-person edition since 2019, the conference triggered productive discussions between experimentalists and theorists across the full LHC physics programme. It also addressed the latest progress of the High-Luminosity LHC upgrades and future-collider developments, in addition to outreach, diversity and education. The conference took place in parallel with the successful restart of LHC Run 3, and saw about 40 new results released for the first time.

The initial physics results from the Run 3 dataset collected in 2022 by ATLAS and CMS were shown, featuring the first measurement of the Higgs-boson production cross-section by ATLAS at 13.6 TeV. Clearly the Run 2 dataset is still a gold mine for the LHC experiments. The programme of precision measurements of Higgs-boson properties is continuing with improved accuracy from the full Run 2 dataset. In particular, ATLAS and CMS reported a new combined result targeting the rare decay H → Zγ, for which they found evidence at the level of 3.4σ, with a measured rate slightly higher than, but compatible with, that predicted by the Standard Model.

Innovative signatures

Searches for physics beyond the Standard Model (SM) remain a very active field of research at the LHC, with many innovative signatures explored, including those of long-lived particles. Some of these searches use new anomaly-detection techniques and explore potentially lower production cross-sections. A new search for leptoquarks by CMS, exploiting the tau-lepton content of the proton, was reported, while ATLAS reported a search for stau production in supersymmetry models with much-improved sensitivity. Many other searches were also presented, and while a few low-level excesses exist, more data will be required to check whether or not these are statistical fluctuations.

The SM is under intense scrutiny but is still very successful at the high-energy frontier. A recent re-analysis of the W-boson mass by ATLAS with the 7 TeV dataset shows good agreement with SM predictions, unlike the CDF result released in 2022. Validating the model used for the ATLAS W-mass measurement, new precise measurements of the W and Z bosons’ transverse momentum distributions were reported by ATLAS using Run 2 data collected under lower pileup conditions. Vector-boson scattering processes are an important probe of the electroweak symmetry breaking mechanism, and most such processes are now observed at the LHC.

Exploring the top-quark sector, many recent results focused on rare top-production processes. Four-top production was observed recently by ATLAS and CMS. First evidence for the rare tWZ production mode was shown by CMS at LHCP 2023. Some of these rare production modes are seen with rates somewhat higher than predicted, and more data will be required to conclude whether the differences are significant. Top production is also used to investigate more exotic scenarios: a new CMS result, measuring the tt̄ production cross-section as a function of sidereal time, was reported, with no indication of Lorentz-invariance violation observed.

Presentations covered the broad spectrum of physics at the LHC brilliantly

On the flavour-physics side, LHCb reported a new precise measurement of CP violation in the “golden” B0 → J/ψ K0S decay, with the most precise extraction of the angle β of the CKM quark-mixing matrix (see p16). Recent LHCb results on the flavour “anomalies” no longer show an indication of lepton-universality violation in B+ → K+e+e− compared to B+ → K+μ+μ− decay rates, but some puzzles remain and there is still some tension in the tau-to-muon ratio in the tree-level decays B → D(*)τν versus B → D(*)μν. Lepton-flavour violation is investigated in a new CMS search for the forbidden decay τ → 3μ, where an upper limit close to the Belle result was reported.

Characterisation of the quark–gluon plasma is actively studied using PbPb collision data. New results from ALICE regarding investigations of jet-quenching properties as well as charm fragmentation studies were shown at the conference.

The recent detections of collider-produced neutrinos by the new FASER and SND experiments were also presented, marking the start of a new physics programme at the LHC.

Broad spectrum

Several theory presentations highlighted recent progress in SM predictions for a wide range of processes, including the electroweak sector and top-quark and Higgs-boson production, as well as linking LHC physics to lattice-QCD computations – work that is vital to fully exploit the physics potential of the LHC. Open questions in the various sectors were summarised, and prospects for new-physics searches in Run 3, including those related to the Higgs-boson sector, were discussed. Links between LHC physics and dark matter were also highlighted, with examples of light dark-matter models and feebly interacting particles. Effective field theories, which are key tools to probe new physics in a generic way, were described with emphasis on their complementarity with searches targeting specific models.

Overall, the presentations covered the broad spectrum of physics at the LHC brilliantly. Future data, including from the High-Luminosity LHC phase, should allow physicists to continue to address many of the field’s open questions. Next year’s LHCP conference will be held at Northeastern University in Boston.

Precision progress on the Higgs boson

ATLAS figure 1

Since the discovery of the Higgs boson in 2012, its di-photon and four-lepton decays have played a crucial role in characterising its properties. Despite their small branching ratios, these decay channels are ideal for accurate measurements due to the excellent resolution and efficient identification of photons and leptons provided by the ATLAS detector.

The Higgs-boson mass (mH) is a free parameter of the Standard Model (SM) that must be determined experimentally. Its value governs the coupling strengths of the Higgs boson with the other SM particles. It also enters as logarithmic corrections to the SM predictions of the W-boson mass and effective weak mixing angle, whose precise measurements allow the electroweak model to be tested. Moreover, the Higgs mass determines the shape and energy evolution of the Brout–Englert–Higgs potential and thus the stability of the electroweak vacuum. A precise measurement of mH is therefore of paramount importance.

ATLAS has recently published a new measurement of the Higgs-boson mass in the H → γγ decay channel using proton–proton collision data from LHC Run 2 (2015–2018). The measurement requires careful control of systematic uncertainties, primarily arising from the photon energy scale. The new analysis has achieved a substantial reduction – by more than a factor of three – of these uncertainties compared to the previous ATLAS result, which was based on the 2015–2016 dataset. That improvement became possible after extensive efforts to refine the photon energy-scale calibration and its associated uncertainties.

ATLAS figure 2

The calibration benefited from an improved understanding of the energy response across the longitudinal layers of the ATLAS electromagnetic calorimeter, and of nonlinear electronics-readout effects. A new correction was implemented in the extrapolation of the precisely measured electron energy scale in Z → e+e− events to photons, to account for differences in the lateral shower development between electrons and photons. These improvements reduced the systematic uncertainty in the mass measurement by about 40%. Moreover, the extrapolation of the electron energy scale from Z → e+e− events to photons originating from the Higgs boson was further refined, and transverse-momentum-dependent effects were corrected. Taken together, the improvements allowed ATLAS to measure the Higgs-boson mass in the di-photon channel with a precision of 1.1 per mille.

The new di-photon result was combined with the mH measurement in the H → ZZ* → 4ℓ decay using the full Run 2 dataset, published by ATLAS in 2022, and with the corresponding Run 1 (2011–2012) measurements (see figure 1). The resulting combined Higgs-boson mass, mH = 125.11 ± 0.11 GeV, has a precision of 0.9 per mille and is dominated by statistical uncertainties, which will be further reduced with Run 3 data.
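
The quoted precision follows directly from the numbers above; as a trivial cross-check (not part of the ATLAS analysis):

```python
# Relative precision of the combined ATLAS Higgs-boson mass,
# expressed in per mille (parts per thousand). Values as quoted above.
m_H = 125.11       # GeV, combined mass
sigma_m_H = 0.11   # GeV, total uncertainty
rel_per_mille = sigma_m_H / m_H * 1000
print(f"relative precision: {rel_per_mille:.2f} per mille")  # ~0.9 per mille
```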

The high level of readiness and excellent performance of the ATLAS detector also allowed first measurements of the fiducial Higgs-boson production cross-sections in the H → γγ and H → ZZ* → 4ℓ decay channels using up to 31.4 fb–1 of data collected in 2022. Their extrapolation to the full phase space and combination gives σ(pp → H) = 58.2 ± 8.7 pb, which agrees with the SM prediction of 59.9 ± 2.6 pb (see figure 2).
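
The level of agreement can be illustrated with a simple pull calculation, treating the two uncertainties as Gaussian and uncorrelated (an assumption made here purely for illustration):

```python
# Naive compatibility check of the measured and predicted cross-sections
# quoted above. Assumes Gaussian, uncorrelated uncertainties (illustration only).
measured, sigma_meas = 58.2, 8.7    # pb, ATLAS extrapolated fiducial measurement
predicted, sigma_pred = 59.9, 2.6   # pb, SM prediction

# Pull in units of the combined uncertainty
combined = (sigma_meas**2 + sigma_pred**2) ** 0.5
pull = abs(predicted - measured) / combined
print(f"difference corresponds to about {pull:.2f} standard deviations")
```

The difference is well below one standard deviation, consistent with the agreement stated above.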

With the continuation of Run 3 data taking, the precision of the 13.6 TeV cross-section measurements will improve and the combination with the Run 2 data will allow the exploration of Higgs-boson properties with growing sensitivity.

Looking forward at the LHC

Proposed Forward Physics Facility

The Forward Physics Facility (FPF) is a proposed new facility to operate concurrently with the High-Luminosity LHC, housing several new experiments on the ATLAS collision axis. The FPF offers a broad, far-reaching physics programme ranging from neutrino, QCD and hadron-structure studies to beyond-the-Standard Model (BSM) searches. The project, which is being studied within the Physics Beyond Colliders initiative, would exploit the pre-existing HL-LHC beams and thus have minimal energy-consumption requirements.

On 8 and 9 June, the 6th workshop on the Forward Physics Facility was held at CERN and online. Attracting about 160 participants, the workshop was organised in sessions focusing on the facility design, the proposed experiments and physics studies, leaving plenty of time for discussion about the next steps.

Groundbreaking

Regarding the facility itself, CERN civil-engineering experts presented its overall design: a 65 m-long, 10 m-high/wide cavern connected to the surface via an 88 m-deep shaft. The facility would be located 600 m from the ATLAS collision point, in the SM18 area of CERN. A workshop highlight was the first results from a site-investigation study, in which a 20 cm-diameter core was taken at the proposed location of the FPF shaft to a depth of 100 m. Initial analysis of the core showed that the geological conditions are favourable for work in this area. Other encouraging studies towards confirming the feasibility of the FPF were FLUKA simulations of the expected muon flux in the cavern (the main background for the experiments), of the expected radiation level (shown to allow people to enter the cavern during LHC operations, with various restrictions), and of the possible effect of the excavation works on beam operations. One area where more work is required concerns the possible need to install a sweeper magnet in the LHC tunnel between ATLAS and the FPF to reduce the muon backgrounds.

Currently five experiments are proposed for installation in the FPF: FASER2 (to search for decaying long-lived particles); FASERν2 and AdvSND (dedicated neutrino detectors covering complementary rapidity regions); FLArE (a liquid-argon time projection chamber for neutrino physics and light dark-matter searches); and FORMOSA (a scintillator-based detector to search for millicharged particles). The three neutrino detectors offer complementary designs to exploit the huge number of TeV-energy neutrinos of all flavours that would be produced in such a forward-physics configuration. Four of these experiments have smaller pathfinder detectors – FASER(ν), SND@LHC and milliQan – that are already operating during LHC Run 3. First results from these pathfinder experiments were presented at the workshop, including the first ever direct observation of collider neutrinos by FASER and SND@LHC, which provides a key proof of principle for the FPF. The latest conceptual designs and expected performance of the FPF experiments were presented, and first ideas on models to fund these experiments are in place and were discussed at the workshop.

In the past year, much progress has been made in quantifying the physics case of the FPF. It effectively extends the LHC with a “neutrino–ion collider” with complementary reach to the Electron–Ion Collider under construction in the US. The large number of high-energy neutrino interactions that will be observed at the FPF allows detailed studies of deep inelastic scattering to constrain proton and nuclear parton distribution functions (PDFs). Dedicated projections for the FPF reveal that uncertainties in light-quark PDFs could be reduced by up to a factor of two or even more compared to current models, leading to improved HL-LHC predictions for key measurements such as the W-boson mass.

High-energy electron and tau neutrinos at the FPF predominantly arise from forward charm production. This is initiated by gluon–gluon scattering involving one very low and one very high momentum fraction, with the former reaching down to Bjorken-x values of 10⁻⁷ – beyond the range of any other experiment. The same FPF measurements of forward charm production are relevant for testing different models of QCD at small x, which would be instrumental for predicting Higgs production at the proposed Future Circular Collider (FCC-hh). Improved modelling of forward charm production is also essential for understanding the backgrounds to diffuse astrophysical neutrinos at telescopes such as IceCube and KM3NeT. In addition, measurements of the ratio of electron to muon neutrinos at the FPF probe the forward kaon-to-pion production ratio, which could explain the so-called muon puzzle (a deficit of muons in simulations compared to measurements) affecting cosmic-ray experiments.
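The quoted reach down to Bjorken-x of 10⁻⁷ can be illustrated with a standard back-of-envelope estimate for 2→2 parton kinematics, x ≈ (M/√s)·e^(±y); the pair mass and rapidity below are illustrative assumptions, not FPF design values:

```python
import math

# Back-of-envelope kinematics for forward charm production at the LHC.
# For a parton-parton collision producing a system of invariant mass M
# at rapidity y, the two momentum fractions are roughly
#   x_{1,2} ~ (M / sqrt(s)) * exp(+-y).
# Numbers below are illustrative assumptions only.

sqrt_s = 14000.0   # LHC collision energy, GeV
M = 4.0            # assumed c-cbar pair mass near threshold, GeV
y = 7.5            # assumed rapidity typical of the far-forward region

x_large = (M / sqrt_s) * math.exp(+y)   # "fast" gluon: high momentum fraction
x_small = (M / sqrt_s) * math.exp(-y)   # "slow" gluon: very low momentum fraction

print(f"x_large ~ {x_large:.2e}, x_small ~ {x_small:.2e}")
```

With these assumptions the low-x gluon sits near 10⁻⁷, while the other gluon carries a large fraction of its proton's momentum, matching the asymmetric configuration described above.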

The FPF experiments would also be able to probe a host of beyond-the-Standard Model scenarios in uncharted regions of parameter space, such as dark-matter portals, dark Higgs bosons and heavy neutral leptons. Furthermore, experiments at the FPF will be sensitive to the scattering of light dark-matter particles produced in LHC collisions, and the large centre-of-mass energy enables probes of models – such as quirks (long-lived particles charged under a hidden-sector gauge interaction) and some inelastic dark-matter candidates – that are inaccessible to fixed-target experiments. On top of that, the FPF experiments will significantly improve the sensitivity of the LHC to millicharged particles.

The June workshop confirmed both the unique physics motivation for the FPF and the excellent progress in technical and feasibility studies towards realising it. Motivated by these exciting prospects, the FPF community is now working on a Letter of Intent to submit to the LHC Experiments Committee as the next step.

Aligning future colliders at SLAC

The 2023 International Workshop on Future Linear Colliders (LCWS2023) took place at SLAC from 15 to 20 May, continuing the series devoted to the study of high-energy linear electron–positron colliders that started in 1992. A linear collider is appealing because it could operate as a Higgs factory during its initial stage while maintaining a clear path to future energy upgrades. Proposed linear-collider Higgs factories are designed for greater compactness, energy efficiency and sustainability, with lower construction and operating costs compared to circular machines.

With a wide programme of plenary and parallel sessions, the workshop was a great opportunity for the community to discuss current and future R&D directions, with a focus on sustainability, and was testament to the eagerness of physicists from all over the world to join forces to build the next Higgs factory. More than 200 scientists participated, about 30% of them early-career researchers and industry partners.

Energy frontiers

As set out in the 2020 update of the European strategy for particle physics and the Energy Frontier report from Snowmass 2021, particle physicists agree that precision Higgs-boson measurements are the best path toward further progress and a source of insights into potential new-physics interactions. The Higgs boson is central to understanding fundamental particles and interactions beyond the Standard Model. Examples include the nature of dark matter and the matter–antimatter asymmetry that led to the prevalence of matter in our universe.

Ideally, data-taking at a future e+e– Higgs factory should follow the HL-LHC directly, requiring construction to start by 2030, in parallel with HL-LHC data-taking. Any significant delay would put at risk essential and unique expertise and human resources, and endanger the future of the field.

Among the e+e– colliders being evaluated by the community, the International Linear Collider (ILC), based on superconducting RF technology, has the most advanced design and is under consideration for construction in Japan. However, Japan has not yet initiated a process to host the collider. One alternative approach is to construct a large circular collider – a strategy now being pursued by CERN with the FCC-ee, and by China with the CEPC. Both colliders would require tunnels of about 100 km circumference to limit synchrotron-radiation losses. The FCC-ee is foreseen to begin operation in 2048, seven years after the end of the HL-LHC programme, with a substantial cost in time and resources for the large tunnel. A further alternative is to construct a compact linear e+e– collider based on high-gradient acceleration; CERN has a long-standing R&D effort along these lines, CLIC, which would operate at a collision energy of 380 GeV.
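The pressure towards ~100 km circular tunnels follows from the standard scaling of synchrotron radiation, where the energy an electron radiates per turn grows as E⁴ and falls only as 1/ρ with the bending radius. A minimal numerical sketch (the bending radii below are rough illustrative assumptions, not machine-design values):

```python
# Synchrotron energy loss per turn for an ultra-relativistic electron:
#   U0 [GeV] ~ 8.85e-5 * E[GeV]**4 / rho[m]
# The bending radii used here are rough assumptions for illustration.

def loss_per_turn_gev(energy_gev: float, bending_radius_m: float) -> float:
    """Energy radiated per turn by one electron, in GeV."""
    return 8.85e-5 * energy_gev**4 / bending_radius_m

E = 120.0  # beam energy in GeV, in the Higgs-factory regime

u_small = loss_per_turn_gev(E, 3100.0)    # ~LEP-scale bending radius (27 km ring)
u_large = loss_per_turn_gev(E, 10000.0)   # bending radius of order a 100 km ring

print(f"27 km-scale ring: {u_small:.1f} GeV/turn; "
      f"100 km-scale ring: {u_large:.1f} GeV/turn")
```

Under these assumptions the loss per turn drops by roughly the ratio of the radii, showing why a much larger circumference is needed to keep the RF power required to replenish the beams manageable at Higgs-factory energies.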

Given the global uncertainties around each proposal, it is prudent to investigate alternative plans based on technologies that could enable compact designs and possibly provide a roadmap to extend the energy reach of future colliders. As also highlighted in the Snowmass Energy Frontier report, consideration should be given to the timely realisation of a Higgs factory in the US as an international effort. For instance, the Cool Copper Collider (C³) is a new and even more compact proposal for a Higgs-producing linear collider. Developed during Snowmass 2021, it made its debut at LCWS with more than 15 talks and five posters. The proposal would use normal-conducting RF cavities to achieve a collision energy of 500 GeV with an 8 km-long collider, making it significantly smaller and likely more cost-effective than other proposed Higgs factories.

The linear approach has many advantages. Among them, linear colliders can access energies of 500 GeV and beyond, whereas for circular e+e– colliders the expected luminosity drops off above centre-of-mass energies of 350–400 GeV. This reach would allow precision measurements that are crucial for indirect searches for new physics, including measurements of the top-quark mass and electroweak couplings, the top–Higgs coupling and the cross section for double-Higgs production.

At LCWS2023, the community showed progress on R&D for both accelerator and detector technologies, and outlined how further advances in ILC technology, as well as alternative technologies such as C³ and CLIC, promise lower costs and/or extended energy reach for later stages of this programme. Discoveries at a Higgs factory may point to specific goals for higher-energy machines, with quark and lepton collisions at energies at least 10 times those of the LHC. New technologies proposed for such higher-energy stages – using pp, muon and e+e– colliders – will require decades of R&D. Construction and operation of a linear Higgs factory would be a key contribution to this programme, developing an accelerator workforce and providing challenges to train young scientists.

In this regard, a key outcome of the SLAC workshop was a statement to the P5 committee, which is currently evaluating priorities in US high-energy physics for the next two decades, supporting the timely realisation of a linear-collider Higgs factory that could access energies beyond 500 GeV and enable measurements vital for new physics.
