High-energy heavy-ion collisions at the LHC exhibit strong collective flow effects in the azimuthal angle distribution of final-state particles. Since these effects are governed by the initial collision geometry of the two colliding nuclei and the hydrodynamic evolution of the collision, the study of anisotropic flow is a powerful way to characterise the production of the quark–gluon plasma (QGP) – an extreme state of matter expected to have existed in the early universe.
To their surprise, researchers on the ALICE experiment have now revealed similar flow signatures in small systems encompassing proton–proton (pp) and proton–lead (pPb) collisions, where QGP formation was previously assumed not to occur. The origin of the flow signals in small systems – and in particular whether the mechanisms behind these correlations share commonalities with heavy-ion collisions – is not yet fully understood. To better interpret these results, and thus to understand the smallest system size that exhibits fluid-like behaviour, it is important to carefully single out possible scenarios that can mimic the effect of collective flow.
Anisotropic-flow measurements become more difficult in small systems because non-flow effects, such as the presence of jets, become more dominant. It is therefore important to use methods in which non-flow effects are properly subtracted. One such method, the so-called low-multiplicity template fit, has been widely used by several experiments to determine and subtract the non-flow contributions.
The origin of the flow signals in small systems is not yet fully understood
The ALICE collaboration studied long-range angular correlations for pairs of charged particles produced in pp and pPb collisions at centre-of-mass energies of 13 TeV and 5.02 TeV, respectively. Flow coefficients were extracted from these correlations using the template-fit method in samples of events with different charged-particle multiplicities. The method accounts for the fact that the yield of jet fragments increases with particle multiplicity, and it allowed physicists to examine the assumptions made in the low-multiplicity template fit for the first time – demonstrating their validity, including the treatment of a possible jet-shape modification.
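To illustrate the idea, the minimal sketch below fits a high-multiplicity correlation with the template-fit ansatz commonly used for this kind of analysis: a scaled low-multiplicity yield plus a flow-modulated pedestal. The functional form is assumed from that general practice, and all yields are synthetic placeholders rather than ALICE data.

```python
# Minimal template-fit sketch (assumed functional form; toy inputs, not ALICE data).
import numpy as np
from scipy.optimize import curve_fit

dphi = np.linspace(-0.5 * np.pi, 1.5 * np.pi, 36)   # long-range pair Delta-phi bins

# Toy low-multiplicity yield: near-side jet peak plus away-side recoil
Y_LM = (1.0 + 0.8 * np.exp(-0.5 * (dphi / 0.4) ** 2)
        + 0.5 * np.exp(-0.5 * ((dphi - np.pi) / 0.8) ** 2))
# Toy high-multiplicity yield: scaled non-flow plus a genuine flow modulation
Y_HM = 1.3 * Y_LM + 2.0 * (1 + 2 * 0.004 * np.cos(2 * dphi) + 2 * 0.001 * np.cos(3 * dphi))

def template(x, F, G, v22, v33):
    """F scales the low-multiplicity (non-flow) template; G carries the flow-modulated pedestal."""
    return F * Y_LM + G * (1 + 2 * v22 * np.cos(2 * x) + 2 * v33 * np.cos(3 * x))

pars, _ = curve_fit(template, dphi, Y_HM, p0=[1.0, 1.0, 0.0, 0.0])
F, G, v22, v33 = pars
# Single-particle coefficients, assuming factorisation of the two-particle harmonics
v2, v3 = np.sqrt(max(v22, 0.0)), np.sqrt(max(v33, 0.0))
print(f"v2 = {v2:.4f}, v3 = {v3:.4f}")
```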
Figure 1 shows the measurement of two components of anisotropic flow – elliptic (v2) and triangular (v3) – as a function of charged-particle multiplicity at midrapidity (Nch). The data show decreasing trends towards lower multiplicities. In pp collisions, the results suggest that the v2 signal disappears below Nch = 10. The results are then compared with hydrodynamic models. To accurately describe the data, especially for events with low multiplicities, a better understanding of initial conditions is needed.
These results can help to constrain the modelling of initial-state simulations, as the significance of initial-state effects increases for collisions resulting in low multiplicities. The measurements with larger statistics from Run 3 data will push down this multiplicity limit and reduce the associated uncertainties.
Quarks and gluons are the only known elementary particles that cannot be seen in isolation. Once produced, they immediately start a cascade of radiation (the parton shower), followed by confinement, when the partons bind into (colour-neutral) hadrons. These hadrons form the jets that we observe in detectors. The different phases of jet formation can help physicists understand various aspects of quantum chromodynamics (QCD), from parton interactions to hadron interactions – including the confinement transition leading to hadron formation, which is particularly difficult to model. However, jet formation cannot be directly observed. Recently, theorists proposed that the footprints of jet formation are encoded in the energy and angular correlations of the final particles, which can be probed through a set of observables called energy correlators. These observables record the largest angular distance between N particles within a jet (xL), weighted by the product of their energy fractions.
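As a concrete, if simplified, illustration of how such an observable is built, the sketch below computes a projected two-point correlator (E2C) for a single toy jet: every constituent pair contributes at its angular distance xL with a weight given by the product of the energy fractions, as described above. The constituent list and binning are invented placeholders, not the CMS analysis.

```python
# Hedged E2C sketch for one toy jet: constituents are (energy, eta, phi) placeholders.
import numpy as np
from itertools import combinations

constituents = [(120.0, 0.05, 0.10), (80.0, -0.02, 0.30),
                (40.0, 0.12, -0.15), (15.0, 0.20, 0.25)]
E_jet = sum(E for E, _, _ in constituents)

def delta_R(a, b):
    """Angular distance in the (eta, phi) plane, wrapping phi into [-pi, pi]."""
    deta = a[1] - b[1]
    dphi = np.arctan2(np.sin(a[2] - b[2]), np.cos(a[2] - b[2]))
    return np.hypot(deta, dphi)

# Each pair enters at xL = angular distance, weighted by the product of energy fractions.
pairs = [(delta_R(a, b), (a[0] * b[0]) / E_jet**2)
         for a, b in combinations(constituents, 2)]

xL = np.array([p[0] for p in pairs])
w = np.array([p[1] for p in pairs])
# Logarithmic xL bins make the different scaling regions visible once many jets are summed.
hist, edges = np.histogram(xL, bins=np.logspace(-3, 0, 20), weights=w)
```

For E3C the same logic applies to constituent triplets, with xL taken as the largest of the three pairwise distances and the weight as the product of the three energy fractions.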
The CMS collaboration recently reported a measurement of the energy correlators between two (E2C) and three (E3C) particles inside a jet, using jets with pT in the 0.1–1.8 TeV range. Figure 1 (top) shows the measured E2C distribution. In each jet pT range, three scaling regions can be seen, corresponding to three stages in jet-formation evolution: parton shower, colour confinement and free hadrons (from right to left). The opposite E2C trends in the low and high xL regions indicate that the interactions between partons and those between hadrons are rather different; the intermediate region reflects the confinement transition from partons to hadrons.
Theorists have recently calculated the dynamics of the parton shower with unprecedented precision. Given the high precision of the calculations and of the measurements, the CMS team used the E3C over E2C ratio, shown in figure 1 (bottom), to evaluate the strong coupling constant αS. The ratio reduces the theoretical and experimental uncertainties, and therefore minimises the challenge of distinguishing the effects of αS variations from those of changes in quark–gluon composition. Since αS depends on the energy scale of the process under consideration, the measured value is given at the scale of the Z-boson mass: αS(mZ) = 0.1229 with an uncertainty of 4%, dominated by theory uncertainties and by the jet-constituent energy-scale uncertainty. This value, which is consistent with the world average, represents the most precise measurement of αS using a method based on jet evolution.
Following a decision taken during the June session of the CERN Council to launch a technical design study for a new high-intensity physics programme at CERN’s North Area, a recommendation for experiment(s) that can best take advantage of the intense proton beam on offer is expected to be made by the end of 2023.
The design study concerns the extraction of a high-intensity beam from the Super Proton Synchrotron (SPS) to deliver up to a factor of approximately 20 more protons per year to ECN3 (Experimental Cavern North 3). It is an outcome of the Physics Beyond Colliders (PBC) initiative, which was launched in 2016 to explore ways to further diversify and expand the CERN scientific programme by covering kinematical domains that are complementary to those accessible to high-energy colliders, with a focus on programmes for the start of operations after Long Shutdown 3 towards the end of the decade.
CERN is confident in reaching the beam intensities required for all experiments
To employ a high-intensity proton beam at a fixed-target experiment in the North Area and to effectively exploit the protons accelerated by the SPS, the beam must be extracted slowly. In contrast to fast extraction within a single turn of the synchrotron, which utilises kicker magnets to change the path of a passing proton bunch, slow extraction gradually shaves the beam over several hundred thousand turns to produce a continuous flow of protons over a period of several seconds. One important limitation to overcome concerns particle losses during the extraction, foremost on the thin electrostatic extraction septum of the SPS but also along the transfer line leading to the North Area target stations. An R&D study backed by the PBC initiative has shown that it is possible to deflect the protons away from the blade of the electrostatic septum using thin, bent crystals. “Based on the technical feasibility study carried out in the PBC Beam Delivery ECN3 task force, CERN is confident in reaching the beam intensities required for all experiments,” says ECN3 project leader Matthew Fraser.
Currently, ECN3 hosts the NA62 experiment, which searches for ultra-rare kaon decays as well as for feebly-interacting particles (FIPs). Three experimental proposals that could exploit a high-intensity beam in ECN3 have been submitted to the SPS committee, and on 6 December the CERN research board is expected to decide which should be taken forward. The High-Intensity Kaon Experiment (HIKE), which requires an increase of the current beam intensity by a factor of between four and seven, aims to increase the precision on ultra-rare kaon decays to further constrain the Cabibbo–Kobayashi–Maskawa unitarity triangle and to search for decays of FIPs that may appear on the same axis as the dumped proton beam. Looking for off-axis FIP decays, the SHADOWS (Search for Hidden And Dark Objects With the SPS) programme could run alongside HIKE when operated in beam-dump mode. Alternatively, the SHiP (Search for Hidden Particles) experiment would investigate hidden sectors such as heavy neutral leptons in the GeV mass range and also enable access to muon- and tau-neutrino physics in a dedicated beam-dump facility installed in ECN3.
The ambitious programme to provide and prepare the high-intensity ECN3 facility for the 2030s onwards is driven in synergy with the North Area consolidation project, which has been ongoing since Long Shutdown 2. Works are planned to be carried out without impacting the other beamlines and experiments in the North Area, with first beam commissioning of the new facility expected from 2030.
“Once the experimental decision has been made, things will move quickly and the experimental groups will be able to form strong collaborations around a new ECN3 physics facility, upgraded with the help of CERN’s equipment and service groups,” says Markus Brugger, co-chair of the PBC ECN3 task force.
Innovation often becomes a form of competition. It can be thought of as a race among creative people, where standardized tools measure progress toward the finish line. For many who strive for technological innovation, one such tool is the vacuum gauge.
High-vacuum and ultra-high-vacuum (HV/UHV) environments are used for researching, refining and producing many manufactured goods. But how can scientists and engineers be sure that pressure levels in their vacuum systems are truly aligned with those in other facilities? Without shared vacuum standards and reliable tools for meeting these standards, key performance metrics – whether for scientific experiments or products being tested – may not be comparable. To realize a better ionization gauge for measuring pressure in HV/UHV environments, INFICON of Liechtenstein used multiphysics modelling and simulation to refine its product design.
A focus on gas density
The resulting Ion Reference Gauge 080 (IRG080) from INFICON is more accurate and reproducible when compared with existing ionization gauges. Development of the IRG080 was coordinated by the European Metrology Programme for Innovation and Research (EMPIR). This collaborative R&D effort by private companies and government research organizations aims to make Europe’s “research and innovation system more competitive on a global scale”. The project participants, working within EMPIR’s 16NRM05 Ion Gauge project, considered multiple options before agreeing that INFICON’s gauge design best fulfilled the performance goals.
Of course, different degrees of vacuum require their own specific approaches to pressure measurement. “Depending on conditions, certain means of measuring pressure work better than others,” explained Martin Wüest, head of sensor technology at INFICON. “At near-atmospheric pressures, you can use a capacitive diaphragm gauge. At middle vacuum, you can measure heat transfer occurring via convection.” Neither of these approaches is suitable for HV/UHV applications. “At HV/UHV pressures, there are not enough particles to force a diaphragm to move, nor are we able to reliably measure heat transfer,” added Wüest. “This is where we use ionization to determine gas density and corresponding pressure.”
The most common HV/UHV pressure-measuring tool is a Bayard–Alpert hot-filament ionization gauge, which is placed inside the vacuum chamber. The instrument includes three core building blocks: the filament (or hot cathode), the grid and the ion collector. Its operation requires the supply of a low-voltage electric current to the filament, causing it to heat up. As the filament becomes hotter, it emits electrons that are attracted to the grid, which is supplied with a higher voltage. Some of the electrons flowing toward and within the grid will collide with free-floating gas molecules circulating in the vacuum chamber. These collisions ionize the gas molecules, and the resulting ions flow toward the collector, with the measurable ion current in the collector proportional to the density of gas molecules in the chamber.
“We can then convert density to pressure, according to the ideal gas law,” explained Wüest. “Pressure will be proportional to the ion current divided by the electron current, [in turn] divided by a sensitivity factor that is adjusted depending on what gas is in the chamber.”
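In code, that relation is a one-liner. The sketch below uses illustrative numbers only: the currents and the nitrogen sensitivity factor are assumed values chosen for the example, not INFICON specifications.

```python
# Back-of-the-envelope ionization-gauge relation: pressure is the ion current divided by
# the electron (emission) current, scaled by a gas-dependent sensitivity factor.
I_ion = 2.0e-9          # collector (ion) current in A  (illustrative)
I_electron = 1.0e-3     # emission (electron) current in A  (illustrative)
S_nitrogen = 20.0       # sensitivity factor in 1/mbar for N2 (assumed typical order of magnitude)

pressure = (I_ion / I_electron) / S_nitrogen   # mbar
print(f"P ≈ {pressure:.1e} mbar")              # ≈ 1e-7 mbar for these numbers
```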
Better by design
Unfortunately, while the operational principles of the Bayard–Alpert ionization gauge are sound and well understood, its performance is sensitive to heat and rough handling. “A typical ionization gauge contains fine metal structures that are held in spring-loaded tension,” said Wüest. “Each time you use the device, you heat the filament to between 1200 and 2000 °C. That affects the metal in the spring and can distort the shape of the filament, [thereby] changing the starting location of the electron flow and the paths the electrons follow.”
At the same time, the core components of a Bayard–Alpert gauge can become misaligned all too easily, introducing measurement uncertainties of 10 to 20% – an unacceptably wide range of variation. “Most vacuum-chamber systems are overbuilt as a result,” noted Wüest, and the need for frequent gauge recalibration also wastes precious development time and money.
With this in mind, the 16NRM05 Ion Gauge project team set a measurement uncertainty target of 1% or less for its benchmark gauge design (when used to detect nitrogen gas). Another goal was to eliminate the need to recalibrate gas sensitivity factors for each gauge and gas species under study. The new design also needed to be unaffected by minor shocks and reproducible by multiple manufacturers.
To achieve these goals, the project team first dedicated itself to studying HV/UHV measurement. Their research encompassed a broad review of 260 relevant studies. After completing their review, the project partners selected one design that incorporates current best practice for ionization gauge design: INFICON’s IE514 extractor-type gauge. Subsequently, three project participants – at NOVA University Lisbon, CERN and INFICON – each developed their own simulation models of the IE514 design. Their results were compared to test results from a physical prototype of the IE514 gauge to ensure the accuracy of the respective models before proceeding towards an optimized gauge design.
Computing the sensitivity factor
Francesco Scuderi, an INFICON engineer who specializes in simulation, used the COMSOL Multiphysics® software to model the IE514. The model enabled analysis of thermionic electron emissions from the filament and the ionization of gas by those electrons. The model can also be used for ray-tracing the paths of generated ions toward the collector. With these simulated outputs, Scuderi could calculate an expected sensitivity factor, which is based on how many ions are detected per emitted electron – a useful metric for comparing the overall fidelity of the model with actual test results.
“After constructing the model geometry and mesh, we set boundary conditions for our simulation,” Scuderi explained. “We are looking to express the coupled relationship of electron emissions and filament temperature, which will vary from approximately 1400 to 2000 °C across the length of the filament. This variation affects the distribution of thermionically emitted electrons and the paths they will follow.”
He continued: “Once we simulate thermal conditions and the electric field, we can begin our ray-tracing simulation. The software enables us to trace the flow of electrons to the grid and the resulting coupled heating effects.”
Next, the model is used to calculate the percentage of electrons that collide with gas particles. From there, ray-tracing of the resulting ions can be performed, tracing their paths toward the collector. “We can then compare the quantity of circulating electrons with the number of ions and their positions,” noted Scuderi. “From this, we can extrapolate a value for ion current in the collector and then compute the sensitivity factor.”
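A rough sketch of that final step is shown below. It assumes the textbook relation S = (I_ion/I_e)/P, with the pressure of the simulated gas obtained from its number density via the ideal gas law; the counts and density are placeholders, not output of the INFICON model.

```python
# Hedged sketch: turning ray-tracing counts into an expected sensitivity factor,
# assuming S = (I_ion / I_e) / P and P = n * k_B * T for the simulated gas.
k_B = 1.380649e-23       # Boltzmann constant, J/K

N_electrons = 1_000_000  # electrons traced from the filament (placeholder)
N_ions = 50              # ions reaching the collector in the simulation (placeholder)
n_gas = 3.3e16           # simulated gas number density in 1/m^3 (placeholder)
T = 300.0                # gas temperature in K

P = n_gas * k_B * T                      # pressure of the simulated gas, Pa
S = (N_ions / N_electrons) / P           # expected sensitivity factor, 1/Pa
print(f"S ≈ {S:.2f} per Pa")
```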
INFICON’s model did an impressive job of generating simulated values that aligned closely with test results from the benchmark prototype. This enabled the team to observe how changes to the modelled design affected key performance metrics, including ionization energy, the paths of electrons and ions, emission and transmission current, and sensitivity.
The end-product of INFICON’s design process, the IRG080, incorporates many of the same components as existing Bayard–Alpert gauges, but key parts look quite different. For example, the new design’s filament is a solid suspended disc, not a thin wire. The grid is no longer a delicate wire cage but is instead made from stronger formed metal parts. The collector now consists of two components: a single pin or rod that attracts ions and a solid metal ring that directs electron flow away from the collector and toward a Faraday cup (to catch the charged particles in vacuum). This arrangement, refined through ray-tracing simulation with the COMSOL Multiphysics® software, improves accuracy by better separating the paths of ions and electrons.
A more precise, reproducible gauge
INFICON, for its part, built 13 prototypes for evaluation by the project consortium. Testing showed that the IRG080 achieved the goal of reducing measurement uncertainty to below 1%. As for sensitivity, the IRG080 performed eight times better than the consortium’s benchmark gauge design. Equally important, the INFICON prototype yielded consistent results during multiple testing sessions, delivering sensitivity repeatability performance that was 13 times better than that of the benchmark gauge. In all, 23 identical gauges were built and tested during the project, confirming that INFICON had created a more precise, robust and reproducible tool for measuring HV/UHV conditions.
“We consider [the IRG080] a good demonstration of [INFICON’s] capabilities,” said Wüest.
Entanglement is an extraordinary feature of quantum mechanics: if two particles are entangled, the state of one particle cannot be described independently from the other. It has been observed in a wide variety of systems, ranging from microscopic particles such as photons or atoms to macroscopic diamonds, and over distances ranging from the nanoscale to hundreds of kilometres. Until now, however, entanglement has remained largely unexplored at the high energies accessible at hadron colliders, such as the LHC.
At the TOP 2023 workshop, which took place in Michigan this week, the ATLAS collaboration reported a measurement of entanglement using top-quark pairs with one electron and one muon in the final state selected from proton–proton collision data collected during LHC Run 2 at a centre-of-mass energy of 13 TeV, opening new ways to test the fundamental properties of quantum mechanics.
Two-qubit system
The simplest system which gives rise to entanglement is a pair of qubits, as in the case of two spin-1/2 particles. Since top quarks are typically generated in top–antitop pairs (tt̄) at the LHC, they represent a unique high-energy example of such a two-qubit system. The extremely short lifetime of the top (10⁻²⁵ s, which is shorter than the timescale for hadronisation and spin decorrelation) means that its spin information is directly transferred to its decay products. Close to threshold, the tt̄ pair produced through gluon fusion is almost in a spin-singlet state, maximally entangled. By measuring the angular distributions of the tt̄ decay products close to threshold, one can therefore conclude whether the tt̄ pair is in an entangled state.
For this purpose, a single observable can be used as an entanglement witness, D. This can be measured from the distribution of cosφ, where φ is the angle between the charged-lepton directions in each of the parent top and antitop rest frames, with D = −3⟨cosφ⟩. The entanglement criterion is D = tr(C)/3 < −1/3, where tr(C) is the sum of the diagonal elements of the spin-correlation matrix C of the tt̄ pair before hadronisation effects occur. Intuitively, this criterion can be understood from the fact that tr(C) is the expectation value of the product of the spin polarizations, tr(C) = ⟨σ⋅σ̄⟩, with σ and σ̄ being the t and t̄ polarizations, respectively (classically tr(C) ≤ 1, since spin polarizations are unit vectors). D is measured in a region where the invariant mass is approximately twice the top-quark mass, 340 < m(tt̄) < 380 GeV; the measurement is performed at particle level, after hadronisation effects occur.
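The extraction of the witness itself is simple once the cosφ distribution is in hand, as the toy sketch below shows: the differential shape (1/2)(1 − D cosφ) is sampled for an injected D, and the estimator D = −3⟨cosφ⟩ is checked against the −1/3 threshold. The injected value and sample are illustrative only, not ATLAS data.

```python
# Toy extraction of the entanglement witness D = -3 * <cos(phi)>; entangled if D < -1/3.
import numpy as np

rng = np.random.default_rng(1)
D_true = -0.55                                  # injected value (toy only)
cos_phi = rng.uniform(-1.0, 1.0, 500_000)       # toy phase space
w = 0.5 * (1.0 - D_true * cos_phi)              # differential shape (1/2)(1 - D cos(phi))

D_meas = -3.0 * np.average(cos_phi, weights=w)  # witness estimator
print(f"D = {D_meas:.3f}  ->  entangled: {D_meas < -1/3}")
```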
This constitutes the first observation of entanglement between a pair of quarks and the highest-energy measurement of entanglement
The shape of the cosφ distribution is distorted by detector and event-selection effects, for which it has to be corrected. A calibration curve connecting the value of D before and after event reconstruction is extracted from simulation and used to derive D from the corresponding measurement, which is then compared to predictions from state-of-the-art Monte Carlo simulations. The measured value D = −0.547 ± 0.002 (stat.) ± 0.021 (syst.) is well beyond 5σ from the non-entanglement hypothesis. This constitutes the first-ever observation of entanglement between a pair of quarks and the highest-energy measurement of entanglement.
Apart from the intrinsic interest of testing entanglement under unprecedented conditions, this measurement paves the way to use the LHC as a novel facility to study quantum information. Prime examples are quantum discord, which is the most basic form of quantum correlations; quantum steering, which is how one subsystem can steer the state of the other one; and tests of Bell’s inequalities, which explore non-locality. Furthermore, borrowing concepts from quantum information theory inspires new approaches to search for physics beyond the Standard Model.
Electronics engineer Oscar Barbalat, who pioneered knowledge-transfer at CERN, died on 8 September 2023, aged 87.
Born in Liège, Belgium in 1935, Oscar joined CERN in 1961, working initially in the Proton Synchrotron (PS) radio-frequency (RF) group. At the time, the PS beam intensity was still below 10⁹ protons per pulse and the beam-control system was somewhat difficult to master, even though the operations consisted mainly of striking internal targets at 24 GeV/c. The control system became increasingly complex when the PS slow-resonant extraction system of Hugh Hereward was put into service. As part of a team of expert accelerator physicists that included Dieter Möhl, Werner Hardt, Pierre Lefèvre and Aymar Sörensen, Oscar wrote a substantial FORTRAN simulation program to understand how the extraction efficiency depended on its numerous correlated parameters.
In the 1970s, the PS division set out to digitise the controls of all PS subsystems (Linac, PS Booster, RF, beam transport systems, vacuum system, beam observation, etc). These subsystems used independent control systems, which were based on different computers or operated manually. Oscar was tasked with devising a structured naming scheme for all the components of the PS complex. After producing several versions, in collaboration with all the experts, the fourth iteration of his proposed scheme was adopted in 1977. To design the scheme, Oscar used the detailed knowledge he had acquired of the accelerator systems and their control needs. His respectful and friendly but tenacious way with colleagues enabled him to explore their desires and problems, which he was then able to reconcile with the needs of the automated controls. Oscar was modest. In the acknowledgements of his naming scheme, he wrote: “This proposal is the result of numerous contributions and suggestions from the many members of the division who were interested in this problem and the author is only responsible for the inconsistencies that remain.”
On Giorgio Brianti’s initiative, following the interest of the CERN Council’s finance committee, the “Bureau de Liaison pour l’Industrie et la Technologie” (BLIT) was founded, with Oscar in charge. His activity began in 1974 and ended on his retirement in 1997. His approach to this new task was typical of his and CERN’s collaborative style: low-key and constructive. He was eager to inform himself of details and he had a talent for explaining technical aspects to others. It helped that he was well educated, with broad interests in people, science, technology and languages, along with cultural and societal matters. He built a network of people who helped him and whom he convinced of the relevance of sharing technological insights beyond CERN.
After more than 20 years developing this area, he summarised the activities, successes and obstacles in Technology Transfer from Particle Physics, the CERN Experience 1974–1997. When activities began in the 1970s, few considered the usefulness of CERN technologies outside particle physics as a relevant objective. Now, CERN prominently showcases its impact on society. After his retirement, Oscar continued to be interested in CERN technology-transfer, and in 2012 he became a founding member of the international Thorium Energy Committee (iThEC), promoting R&D in thorium energy technologies.
No doubt, Oscar is the pioneer of what is now known as knowledge-transfer at CERN.
The 20th International Conference on B-Physics at Frontier Machines, Beauty 2023, was held in Clermont-Ferrand, France, from 3–7 July, hosted by the Laboratoire de Physique de Clermont (IN2P3/CNRS, Université Clermont Auvergne). It was the first in-person edition of the series since the pandemic, and attracted 75 participants from all over the world. The programme comprised 53 invited talks, of which 13 were theoretical overviews. An important element was also the Young Scientist Forum, with seven short presentations on recent results.
The key focus of the conference series is to review the latest results in heavy-flavour physics and discuss future directions. Heavy-flavour decays, in particular those of hadrons that contain b quarks, offer powerful probes of physics beyond the Standard Model (SM). Beauty 2023 took place 30 years after the opening meeting in the series. A dedicated session was devoted to reflections on the developments in flavour physics over this period, and to celebrating the life of Sheldon Stone, who passed away in October 2021. Sheldon was an inspirational figure in flavour physics as a whole, a driving force behind the CLEO, BTeV and LHCb experiments, and a long-term supporter of the Beauty conference series.
LHC results
Many important results have emerged from the LHC since the last Beauty conference. One concerns the CP-violating parameter sin2β, for which measurements by the BaBar and Belle experiments at the start of the millennium marked the dawn of the modern flavour-physics era. LHCb has now measured sin2β with a precision better than any other experiment, to match its achievement for ϕs, the analogous parameter in Bs⁰ decays, where ATLAS and CMS have also made a major contribution. Continued improvements in the knowledge of these fundamental parameters will be vital in probing for other sources of CP violation beyond the SM.
Over the past decade, the community has been intrigued by strong hints of the breakdown of lepton-flavour universality, one of the guiding tenets of the SM, in B decays. Following a recent update from LHCb, it seems that lepton universality may remain a good symmetry, at least in the class of electroweak-penguin decays such as B → K(*)ℓ⁺ℓ⁻, where much of the excitement was focused (CERN Courier January/February 2023 p7). Nonetheless, there remain puzzles to be understood in this sector of flavour physics, and anomalies are emerging elsewhere. For example, non-leptonic decays of the kind Bs → Ds⁺K⁻ show intriguing patterns through CP-violation and decay-rate information.
The July conference was noteworthy as being a showcase for the first major results to emerge from the Belle II experiment. Belle II has now collected 362 fb⁻¹ of integrated luminosity on the Υ(4S) resonance, which constitutes a dataset similar in size to that accumulated by BaBar and the original Belle experiment, and results were shown from early tranches of this sample. In some cases, these results already match or exceed in sensitivity and precision what was achieved at the first generation of B-factory experiments, or indeed elsewhere. These advances can be attributed to improved instrumentation and analysis techniques. For example, world-leading measurements of the lifetimes of several charm hadrons were presented, including the D⁰, D⁺, Ds⁺ and Λc⁺. Belle II and its accelerator, SuperKEKB, will emerge from a year-long shutdown in December with the goal to increase the dataset by a factor of 10–20 in the coming half decade.
Full of promise
The future experimental programme of flavour physics is full of promise. In addition to the upcoming riches expected from Belle II, an upgraded LHCb detector is being commissioned in order to collect significantly larger event samples over the coming decade. Upgrades to ATLAS and CMS will enhance these experiments’ capabilities in flavour physics during the High-Luminosity LHC era, for which a second upgrade to LHCb is also foreseen. Conference participants also learned of the exciting possibilities for flavour physics at the proposed future collider FCC-ee, where samples of several 10¹² Z⁰ decays will open the door to ultra-precise measurements in an analysis environment much cleaner than at the LHC. These projects will be complemented by continued exploration of the kaon sector, and studies at the charm threshold for which a high-luminosity Super Tau Charm Factory is proposed in China.
The scientific programme of Beauty 2023 was complemented by outreach events in the city, including a “Pints of Science” evening and a public lecture, as well as a variety of social events. These and the stimulating presentations made the conference a huge success, demonstrating that flavour remains a vibrant field and continues to be a key player in the search for new physics beyond the Standard Model.
Henri Navelet died on 3 July 2023 in Bordeaux, at the age of 84. Born on 28 October 1938, he studied at the École Normale Supérieure in Paris. He went on to become a specialist in strong interactions and had been a leading member of the Service de Physique Théorique (SPhT, now Institut de Physique Théorique) of CEA Saclay since its creation in 1963. Henri stood out for his theoretical rigour and remarkable computational skills, which meant a great deal to his many collaborators.
In the 1960s, Henri was a member of the famous “CoMoNav” trio with two other SPhT researchers, Gilles Cohen-Tannoudji and André Morel. The trio was famous in particular for introducing the so-called Regge-pole absorption model into the phenomenology of high-energy (at the time!) strong interactions. This model was used by many physicists to untangle the multitude of reactions studied at CERN. Henri’s other noteworthy contributions include his work with Alfred H Mueller on very-high-energy particle jets, today commonly referred to as “Mueller–Navelet jets”, which are still the subject of experimental research and theoretical calculations in quantum chromodynamics.
Henri had a great sense of humour and human qualities that were highly motivating for his colleagues and the young researchers who met him during his long career. He was not only a great theoretical physicist, but also a passionate sportsman, training the younger generations. In particular, he ran the marathon in two hours, 59 minutes and 59 seconds. A valued researcher and friend has left us.
It was with deep sadness we learned that Roger Bailey, who played a key role in the operation of CERN’s accelerators, passed away on 1 June while mountain biking in Valais, Switzerland. He was 69.
Roger began his career with a doctorate in experimental particle physics from the University of Sheffield in 1979, going on to a postdoctoral position at the Rutherford Appleton Laboratory until 1983. Throughout this time, he worked on experiments at CERN’s Super Proton Synchrotron (SPS) and was based at CERN from 1977. In 1983 he joined the SPS operations group, where he was responsible for accelerator operations until 1989. Roger then moved to the Large Electron Positron collider (LEP), coordinating the team’s efforts through the commissioning phase and subsequent operation, and became operations group leader in the late 1990s.
After LEP shut down in 2000, Roger became progressively more involved in the Large Hadron Collider (LHC), planning and building the team for commissioning with beam. He then took a leading role in the LHC’s early operation, helping to push the LHC’s performance to Higgs-discovery levels before becoming director of the CERN Accelerator School, sharing his wealth of experience and inspiring new generations of accelerator physicists.
Those of us who worked with Rog invariably counted him as a friend: it made perfect sense, given his calm confidence, his kindness and his generosity of spirit. He was straightforward but never outspoken and his well-developed common sense and pragmatism were combined with a subtle and wicked deadpan sense of humour. We had a lot of fun over the years in what were amazing times for CERN. Looking back, things he said, and did, can still make us chuckle, even in the sadness of his untimely passing. Rog had a passionate, playful eye for life’s potential and he wasn’t shy. There was an adventurous spirit at work, be it in the mountains or the streets of New York, Berlin or Chicago. His specialities were tracking down music and talking amiably to anyone.
During a service to celebrate Roger’s life on 16 June, a poem of his called It’s a Wrap was read by his daughter Ellie, revealing a physicist’s philosophical view on life and the universe. Two of his favourite quotes were on the order of service: Mae West’s “You only live once, but if you do it right, once is enough” and Einstein’s “Our death is not an end if we can live on in our children and the younger generation. For they are us, our bodies are only wilted leaves on the tree of life.” Another, by Hunter S Thompson, was mentioned in a homage given by his son, Rob: “Life should not be a journey to the grave with the intention of arriving safely in a pretty and well-preserved body, but rather to skid in broadside in a cloud of smoke, thoroughly used up, totally worn out, and loudly proclaiming ‘Wow! What a Ride!’”
Jack Heron always liked the idea of being an inventor. After completing a master’s in electronics engineering at Durham University, he spent a year in Bangalore, India as part of the “Engineers Without Borders” programme, where he designed solar-powered poverty-alleviation solutions in unelectrified slums. This sparked an interest in renewable energy, and he completed a PhD on smart grid techniques in 2020. With a passion for advanced technology and engineering at the peak of performance, he then joined the “digital twin” R&D programme of international defence company Babcock, dedicated to fault-prediction for defence assets in land, sea and air.
“The military is extremely interested in autonomous vehicles,” explains Jack. “But removing the driver from, say, a fleet of tanks, increases the number of breakdowns: many maintenance checks are triggered by the driver noticing, for example, a ‘funny noise on start-up’, or ‘a smell of oil in the cabin’.” Jack worked on trying to replicate this intuition by using very early signs in sensor signals. Such a capability permits high confidence in mission success, he adds. “It also ensures that during a mission, if circumstances change, dynamic asset information is available for reconfiguration.”
Working in defence was “exciting and fast-paced” and enabled Jack to see his research put to practical use – he got to drive a tank and attend firing tests on a naval frigate. “It’s especially interesting because the world of defence is something most people don’t have visibility on. Modern warfare is constantly evolving based on technology, but also politics and current affairs, and being on the cusp of that is really fascinating.” It also left him with a wealth of transferrable skills: “Defence is a high-performance world where product failure is not an option. This is hardcoded into the organisation from the bottom up.”
Back to his roots
For Jack, who grew up in Geneva, CERN always had a mythical status as the epitome of science and exploration. In 2022 he applied for a senior fellowship. “Just getting interviewed for this fellowship was a huge moment for me,” he says. “I was lucky enough to get interviewed in person, and when I arrived I got a visitor pass with the CERN-logo lanyards attached. Even if I didn’t get the job I was going to frame it, just to remember being interviewed at CERN!”
I love the idea of working on the frontiers of science and human understanding
Jack now works on the “availability challenge” for the proposed Future Circular Collider, FCC-ee. Availability is the percentage of scheduled physics days on which the machine is able to deliver beam (i.e. is not down for repair). To meet its physics goals, this must be 80%. The LHC – the world’s largest and most complex accelerator, but still a factor of three smaller and simpler than the FCC – had an availability of 77% during Run 2. “Modern-day energy-frontier particle colliders aren’t built to the availabilities we would need to succeed with the FCC, and that’s without considering additional technical challenges,” notes Jack. His research aims to break down this problem system by system and find solutions, beginning with the radio-frequency (RF) system.
On the back of an envelope, he says, the statistics are a concern: “The LHC has 16 superconducting RF cavities, which trip about once every five days. If we scale this up to FCC-ee numbers (136 cavities for the Z-pole energy mode and 1352 for the tt̄ threshold), this becomes problematic. Orders of magnitude greater reliability is required, and that itself is a defining technical challenge.”
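That back-of-the-envelope scaling can be reproduced directly from the figures quoted above. The naive extrapolation below assumes the per-cavity trip rate stays the same when the cavity count grows, which is precisely the assumption the reliability R&D aims to break.

```python
# Naive RF-trip-rate scaling using the numbers quoted in the text.
lhc_cavities = 16
trips_per_day_lhc = 1 / 5.0                    # "about once every five days" for the whole system

rate_per_cavity = trips_per_day_lhc / lhc_cavities

for mode, n_cavities in {"Z pole": 136, "ttbar threshold": 1352}.items():
    trips = rate_per_cavity * n_cavities       # expected trips per day at constant per-cavity rate
    print(f"FCC-ee {mode}: ~{trips:.1f} trips/day, i.e. one every {24 / trips:.1f} h")
```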
Jack’s background in defence prepared him well for this task: “Both are systems that cannot afford to fail, and therefore have extremely tight reliability requirements. One hour of down time in the LHC is extremely costly, and the FCC will be no different.”
Mirroring what he did at Babcock, one solution could be fault prediction. Others are robot maintenance, and various hardware solutions to make the RF circuit more reliable. “Generally speaking, I love the idea of working on the frontiers of science and human understanding. I find this exploration extremely exciting, and I’m delighted to be a part of it.”