Hypertriton lifetime puzzle nears resolution

Fig. 1.

Hypernuclei are bound states of nucleons and hyperons. Studying their properties is one of the best ways to investigate hyperon–nucleon interactions and offers insights into the high-density inner cores of neutron stars, which favour the creation of the exotic nuclear states. Constraining such astrophysical models requires detailed knowledge of hyperon–nucleon and three-body hyperon–nucleon–nucleon interactions. The strengths of these interactions can be determined in collider experiments by precisely measuring the lifetimes of hypernuclei.

Hypernuclei are produced in significant quantities in heavy-ion collisions at LHC energies. The lightest, the hypertriton, is a bound state of a proton, a neutron and a Λ. With a Λ-separation energy of only ~130 keV, the average distance between the Λ and the deuteron core is 10.6 fm. This relatively large separation implies only a small perturbation to the Λ wavefunction inside the hypernucleus, and therefore a hypertriton lifetime close to that of a free Λ, 263.2 ± 2.0 ps. Most calculations predict the hypertriton lifetime to be in the range 213 to 256 ps.
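For scale, a lifetime of a few hundred picoseconds corresponds to a decay length cτ of several centimetres (before the Lorentz boost of the produced particle), which is what makes the decay vertex resolvable in a collider detector. A quick check of the number quoted above:

```python
# Decay length c*tau for the free-Lambda lifetime quoted in the text.
c = 2.99792458e8        # speed of light in m/s
tau = 263.2e-12         # free-Lambda lifetime in s

print(f"c*tau = {c * tau * 100:.1f} cm")   # ~7.9 cm; boosted further by beta*gamma
```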

The measured lifetimes were systematically below theoretical predictions

The first measurements of the hypertriton lifetime were performed in the 1960s and 1970s with imaging techniques such as photographic emulsions and bubble chambers, and were based on very small event samples, leading to large statistical uncertainties. In the last decade, however, measurements have been performed using the larger data samples of heavy-ion collisions. Though compatible with theory, the measured lifetimes were systematically below theoretical predictions: thus the so-called “lifetime puzzle”.

The ALICE collaboration has recently reported a new measurement of the hypertriton lifetime using Pb–Pb collisions at √sNN = 5.02 TeV collected in 2015. The lifetime of the (anti-)hypertriton is determined by reconstructing the two-body decay channel with a charged pion, namely ³ΛH → ³He + π⁻ (and the charge-conjugate channel for the antihypertriton). The branching ratio of this decay channel, taken from theoretical calculations, is 25%. The measured lifetime is 242 +34 −38 (stat) ± 17 (syst) ps. This result has better statistical precision and a smaller systematic uncertainty than previous measurements, making it the most precise to date, and it agrees with both theoretical predictions and the free-Λ lifetime within the statistical uncertainty alone. Combining this ALICE result with previous measurements gives a weighted average of 206 +15 −13 ps (figure 1).
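The combination quoted above is, in essence, an inverse-variance weighted average of the individual lifetime measurements. The sketch below illustrates the idea with symmetrised, invented uncertainties; the published combination treats the asymmetric statistical and systematic uncertainties properly, so these numbers are not the real inputs.

```python
import numpy as np

# Hypothetical (lifetime, symmetrised total uncertainty) pairs in ps, for illustration only.
measurements = [(242.0, 40.0), (181.0, 40.0), (142.0, 24.0)]

values = np.array([v for v, _ in measurements])
errors = np.array([e for _, e in measurements])

weights = 1.0 / errors ** 2
average = np.sum(weights * values) / np.sum(weights)
uncertainty = 1.0 / np.sqrt(np.sum(weights))

print(f"weighted average = {average:.0f} +/- {uncertainty:.0f} ps")
```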

This result represents an important step forward in solving the longstanding hypertriton lifetime puzzle, since it is the first measurement with a large data sample that is close to theoretical expectations. Larger and more precise data sets are expected to be collected during LHC Runs 3 and 4, following the ongoing major upgrade of ALICE. This will allow a significant improvement in the quality of the present lifetime measurement, and the determination of the Λ binding energy with high precision. The combination of these two measurements has the potential to constrain the branching ratio for this decay, which cannot be determined directly without access to the neutral and non-mesonic decay channels. This will be a crucial step towards fully testing the theoretical description of the hypertriton and finally resolving the lifetime puzzle.

Flavour heavyweights converge on Ljubljana

The international conference devoted to b-hadron physics at frontier machines, Beauty 2019, was held in Ljubljana, Slovenia, from 30 September to 4 October. The aims of the conference series are to review the latest results in heavy-flavour physics and discuss future directions. This year’s edition, the 18th in the series, attracted around 80 scientists and featured 65 invited talks, of which 13 were theory-based.

The study of hadrons containing beauty quarks, and other heavy flavours, offers a powerful way to probe for physics beyond the Standard Model, as highlighted in the inspiring opening talk by Chris Quigg (Fermilab). In the last few years much attention has been focused on b-physics results that do not show perfect agreement with the predictions of the theory. In particular, studies by Belle, BaBar and LHCb of the processes B⁺→K⁺ℓ⁺ℓ⁻ and B⁰→K*⁰ℓ⁺ℓ⁻ (where ℓ indicates a lepton) in specific kinematic regions have yielded different decay rates for muon pairs and electron pairs, apparently violating lepton universality. For both processes the significance of the effect is around 2.5σ. Popular models to explain this and related effects include leptoquarks and new Z′ bosons; however, no firm conclusions can be drawn until more precise measurements are available, which should be the case by the time the next Beauty meeting takes place.

Indications that φs is nonzero are starting to emerge

The Bs system is an ideal laboratory for the study of CP violation, and recent results were presented by the LHC experiments for φs – the phase associated with time-dependent measurements of Bs-meson decays to CP eigenstates. Indications that φs is nonzero are starting to emerge, which is remarkable given that its magnitude in the Standard Model is less than 0.1 radians. This is great encouragement for Run 3 of the LHC, and beyond.

Heavy-flavour experiments are also well suited to the study of hadron spectroscopy. Many very recent results were shown at the conference, including the discovery of the X(3842), which is a charmonium resonance above the open-charm threshold, and new excited resonances seen in the Λbππ final state, which help map out the relatively unexplored world of b-baryons. The ATLAS collaboration presented, for the first time, an analysis of Λb→J/ψpK⁻ decays in which a structure is observed that is compatible with that of the LHCb pentaquark discovery of 2015, providing the first confirmation by another experiment of these highly exotic states.

Beyond beauty

The Beauty conference welcomes reports on flavour studies beyond b-physics, and a highlight of the week was the first conference presentation of new results on the measurement of the branching ratio of the ultra-rare decay K⁺→π⁺νν̄, by the NA62 collaboration. The impressive background suppression that the experiment has achieved left the audience in no doubt as to the sensitivity of the result that can be expected when the full data set is accumulated and analysed. Comparing the measurement with the predicted branching fraction of ~10⁻¹⁰ will be a critical test of the Standard Model in the flavour domain.

Flavour physics has a bright future. Several talks presented the first signals and results from the early running of the Belle II experiment, and precise and exciting measurements can be expected when the next meeting in the Beauty series takes place. In parallel, studies with increasing sensitivity will continue to emerge from the LHC. The meeting was updated about progress on the LHCb upgrade, which is currently being installed ready for Run 3, and will allow for an order of magnitude increase in b-hadron samples. The conference was summarised by Patrick Koppenburg (Nikhef), who emphasised the enormous potential of b-hadron studies for uncovering signs of new physics beyond the Standard Model.

The next edition of Beauty will take place in Japan, hosted by Kavli IPMU, University of Tokyo, in autumn 2020.

Debut for baryons in flavour puzzle

LHCb has launched a new offensive in the exploration of lepton-flavour universality – the principle that the weak interaction couples to electrons, muons and tau leptons equally. Following previous results that hinted that e⁺e⁻ pairs might be produced at a greater rate than μ⁺μ⁻ pairs in B-meson decays involving the b→sℓ⁺ℓ⁻ transition (ℓ = e, μ), the study brings b-baryon decays to bear on the subject for the first time.

“LHCb certainly deserves to be congratulated on this nontrivial measurement,” said Jure Zupan of the University of Cincinnati, in the US. “It is very important that LHCb is trying to measure the same quark-level transition b→sℓ⁺ℓ⁻ with as many hadronic probes as possible. Though baryon decays are more difficult to interpret, the Standard Model prediction of equal rates is very clean and any significant deviation would mean the discovery of new physics.”

We are living in exciting but somewhat confusing times

Jure Zupan

The current intrigue began in 2014, when LHCb observed the ratio of B⁺→K⁺μ⁺μ⁻ to B⁺→K⁺e⁺e⁻ decays to be 2.6σ below unity – the so-called RK anomaly. The measurement was updated this year to be closer to unity, but with reduced errors the significance of the deviation – either a muon deficit or an electron surplus – remains almost unchanged at 2.5σ. The puzzle deepened in 2017 when LHCb measured the rate of B⁰→K*⁰μ⁺μ⁻ relative to B⁰→K*⁰e⁺e⁻ to be more than 2σ below unity in two adjacent kinematic bins – the RK* anomaly. In the same period, measurements of decays to D mesons by LHCb and the B-factory experiments BaBar and Belle consistently hinted that the b→cℓν̄ transition might occur at a greater rate for tau leptons relative to electrons and muons than expected in the Standard Model.

Baryons enter the fray

Now, in a preprint published on 18 December, the LHCb collaboration reports a measurement of the ratio of branching fractions for the highly suppressed baryonic decays Λb⁰→pK⁻e⁺e⁻ and Λb⁰→pK⁻μ⁺μ⁻ to be RpK⁻¹ = 1.17 +0.18 −0.16 (stat) ± 0.07 (syst). Being the reciprocal of the ratio reported for the B-meson decays, this measurement is consistent with previous LHCb results in that it errs on the side of fewer b→sμ⁺μ⁻ than b→se⁺e⁻ transitions, though with no statistical significance for that hypothesis at the present time. The blind analysis was performed for an invariant mass squared of the lepton pairs ranging from 0.1 to 6.0 GeV² – well below contributions from resonant J/ψ→ℓ⁺ℓ⁻ decays, with observations of the latter used to drive down systematics related to the different experimental treatment of muons and electrons. J/ψ decays to μ⁺μ⁻ and e⁺e⁻ pairs are known to respect lepton universality at the 0.4% level.
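As a rough check of the statement about significance, one can compare the measured ratio with the lepton-universal expectation of unity, taking the statistical uncertainty on the side towards unity and adding the systematic uncertainty in quadrature (a crude approximation to the full likelihood treatment used in the analysis):

```python
import math

# R_pK^{-1} = 1.17 +0.18 -0.16 (stat) +/- 0.07 (syst), as quoted in the text.
value = 1.17
stat_towards_unity = 0.16
syst = 0.07

total = math.sqrt(stat_towards_unity ** 2 + syst ** 2)
print(f"deviation from unity ~ {(value - 1.0) / total:.1f} sigma")   # roughly 1 sigma
```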

“It’s very satisfying to have been able to make this lepton-flavour universality test with baryons – having access to the Run 2 data was key,” said analyst Yasmine Amhis of the Laboratoire de l’Accélérateur Linéaire in Orsay. The analysis, which also constitutes the first observation of the decay Λb⁰→pK⁻e⁺e⁻, exploits an integrated 4.7 fb⁻¹ of data collected at 7, 8 and 13 TeV. “LHCb is also working on other tests of the flavour anomalies, such as an angular analysis of B⁰→K*⁰μ⁺μ⁻, and updates of the lepton-flavour universality tests of RK and RK* to the full Run 2 dataset,” continued Amhis. “We’re excited to find out whether the pattern of anomalies stays or fades away.”

We’re excited to find out whether the pattern of anomalies stays or fades away

Yasmine Amhis

An important verification of the B-meson anomalies will be performed by the recently launched Belle II experiment, though it is not expected to weigh in on Λb⁰ decays, says Zupan. “I think it is fair to say that it is only after both Belle II and LHCb are able to confirm the anomalies that new physics will be established,” he says. “Right now, we are living in exciting but somewhat confusing times: is the neutral-current b→sℓ⁺ℓ⁻ anomaly real? Is the charged-current b→cℓν̄ anomaly real? Are they connected? Only time will tell.”

Particle physics inspires all

Collage of images from CERN Open Days 2019

“It’s a huge place full of ideas to try in case something revolutionary turns up.”

“There is high-tech science going on. You’re trying to make applications to real life, such as non-destructive testing.”

“We’re not here for the science, we’re here for the machines!”

“I thought it was all about programming, but you actually build things.”

“CERN is here to test out how particles behave and exploring the limit of the universe, like matter (which we know) and antimatter (which we don’t).”

“If you understand materials and energy at the basic scale, you have a better chance of creating new energy sources and materials for the future.”

“I read an article recently that said this was all a waste of taxpayers’ money, but now I am less sure because I have seen today that there are a lot of applications.”

Gaspar Barreira 1940–2019

Gaspar Barreira

Experimental particle physicist Gaspar Barreira, co-founder of the Portuguese Laboratory for Instrumentation and Experimental Particle Physics (LIP), passed away on 1 June. He was the Portuguese delegate to the CERN Council and to the SESAME Council, and was a strong proponent of international cooperation.

Gaspar’s life proceeded in cycles, each lived intensely with great energy and focus. He had a vision to foster progress, to change the world here and now. Each time, despite arriving as an outsider, he was able to make great impact thanks to his intelligence and capability to transmit enthusiasm. He always chose grand objectives: let’s build something that doesn’t exist at all in the country; let’s do something that was never done before. He was not afraid of dreaming, nor of obstacles.

Born in Braga, in the north of Portugal, Gaspar arrived in Lisbon at the age of 18 to study physics and mathematics. He fought against the dictatorship of Salazar, which gagged Portugal for more than 40 years until the Carnation Revolution of 25 April 1974, and was imprisoned more than once. In the early 1970s he taught himself electronics, and soon found himself at the Nuclear Physics Centre in Lisbon, saving the day for many colleagues with his ability to fix the scarce equipment or assemble non-existing parts. He also entered into pioneering collaborations with archaeologists to date ancient artefacts – a path that in 1980 led him to the International Centre for Theoretical Physics in Trieste, Italy, where he soon became director of the microprocessors laboratory.

In 1985 Gaspar returned to Portugal to get involved in the country’s accession to CERN, founding LIP with José Mariano Gago and Armando Policarpo, and building LIP’s instrumentation division. NA38 at the SPS was the first experiment in which LIP participated as an institution. He contributed greatly to establishing LIP as a reference laboratory in particle and astroparticle physics, instrumentation, technology and computing.

Gaspar was a strong believer in CERN and international cooperation. He had a fundamental role in bringing Portugal into the DELPHI experiment at LEP, and was a strong supporter of the LHC from the early days. He was a strong advocate of distributed computing, and did not spare efforts to have Portugal and LIP in the main projects in this area, at CERN and at a European level. Gaspar was responsible for the creation of the Portuguese Tier-2 in the CERN Worldwide LHC Computing Grid, and was active in several related initiatives.

From the turn of the century, Gaspar was fully involved in science policy. He was the Portuguese representative to a variety of international organisations and boards, and coordinated the Portuguese participation in the Alpha Magnetic Spectrometer for its shuttle flight in 1998. Gaspar was always particularly concerned with knowledge-transfer to society. He co-coordinated the training programmes for young Portuguese engineers at CERN, ESA and ESO, and the creation of the Portuguese language teachers programme.

Before and after the revolution of 1974, Gaspar worked towards the construction of a world where knowledge, freedom and rationality were decisive. We have lost a great friend of CERN, LIP and physics, an excellent scientist and a truly unique personality. Though departed, Gaspar leaves us an immense legacy of vision, endurance and resilience. His last big project, the installation in Portugal of a treatment and research centre for cancer therapy with protons, is not yet accomplished. For this we will strive.

Giovanni Muratori 1924–2019

Giovanni Muratori

Giovanni Muratori received a double degree in naval and mechanical engineering at the University of Genoa in 1949, after which he worked at ENI-AGIP on the construction of instruments for oil exploration. He started at CERN in August 1959 in the PS division, where he worked on the heavy-liquid bubble chamber designed to study neutrino physics. Giovanni oversaw the design of the cameras – not an easy task in view of the strong magnetic field that precluded the use of electric motors – and, after some initial setbacks, the chamber was ready for data-taking in early 1961. Finding the event rate to be insufficient, a crash programme was set in motion to improve the beam (using van der Meer’s magnetic horn) and to increase the total mass of detectors (by adding spark chambers downstream). Giovanni embarked on the design of the mechanics and optics for these spark chambers, which were operational in 1963.

At the end of 1961 he was transferred to the nuclear physics division and in April 1966 was appointed leader of the technical assistance group, which was involved in the design and construction of optical and mechanical equipment. The group developed and constructed a wide variety of detectors and associated equipment, including the R-108 experiment at the ISR where the group built a set of novel cylindrical drift chambers allowing track positions along the wire to be measured using the difference of arrival times of the signal at the ends of each wire. For NA31 the group built drift chambers installed in a helium-filled tank as well as a lightweight Kevlar window separating the helium from a vacuum tank.

Early on, the group designed and constructed an automatic machine for winding large wire spark chambers and soon became specialised in the construction of arrays for the new multiwire proportional chambers. Led by Giovanni, the group developed equipment and facilities for Cherenkov detectors, including a dry lab for handling lithium foil and methods of producing precision glass spherical mirrors with highly reflective aluminium coatings. Mirrors made using these techniques were later used in the RICH detector at LEP’s DELPHI experiment.

Towards the end of his CERN career he worked on the initial designs of the TPC detector for another LEP detector, ALEPH. He also started a collaboration with a group searching for the existence of a “fifth force” and designed and built a rotor that generated a dynamic gravitational field at around 450 Hz, which was used in the first absolute calibration of the gravitational wave detector EXPLORER at CERN.

Giovanni remained at CERN for several years after his retirement in 1986, during which time he worked on several problems including the initial design of a prototype liquid argon chamber for use in underground experiments at Gran Sasso. He was a superb engineer. His work was highly appreciated and his opinions respected. He participated actively in the design of equipment with innovative and ingenious ideas. He also loved solving machining and manufacturing problems, whether on a large or Swiss-watch scale. With his common-sense attitude and his warm and generous spirit, his advice was often sought on personal matters. Giovanni will be remembered with respect and affection by all.

Zooming in on top quarks

Fig. 1.

As the heaviest known particle, the top quark plays a unique role in the Standard Model (SM), making its presence felt in corrections to the masses of the W and Higgs bosons, and also, perhaps, in as-yet unseen physics beyond the SM. During Run 2 of the Large Hadron Collider (LHC), high-luminosity proton beams were collided at a centre-of-mass energy of 13 TeV. This allowed ATLAS to record and study an unprecedented number of collisions producing top–antitop pairs, providing ATLAS physicists with a unique opportunity to gain insights into the top quark’s properties.

ATLAS has measured the top–antitop production cross-section using events where one top quark decays to an electron, a neutrino and a bottom quark, and the other to a muon, a neutrino and a bottom quark. The striking eμ signature gives a clean and almost background-free sample, leading to a result with an uncertainty of only 2.4%, which is the most precise top-quark pair-production measurement to date. The measurement provides information on the top quark’s mass, and can be used to improve our knowledge of the parton distribution functions describing the internal structure of the proton. The kinematic distributions of the leptons produced in top-quark decays have also been precisely measured, providing a benchmark to test programs that model top-quark production and decay at the LHC (figure 1).
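Schematically, such a counting measurement reduces to σ = (N_obs − N_bkg)/(ε·L): count the selected eμ events, subtract the estimated background, and divide by the combined efficiency times the integrated luminosity. A minimal sketch with purely illustrative numbers (the real ATLAS analysis handles correlated uncertainties, fiducial corrections and the luminosity calibration in far more detail):

```python
# Purely illustrative numbers, not the ATLAS inputs.
n_observed = 1_000_000    # selected e-mu events
n_background = 70_000     # estimated background events
efficiency = 0.008        # acceptance x efficiency x branching fraction (assumed)
luminosity = 140.0e3      # integrated luminosity in pb^-1 (roughly a full Run-2 sample)

cross_section = (n_observed - n_background) / (efficiency * luminosity)
print(f"sigma(ttbar) ~ {cross_section:.0f} pb")   # ballpark of the measured 13 TeV value
```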

Fig. 2.

The mass of the top quark is a fundamental parameter of the SM, which impacts precision calculations of certain quantum corrections. It can be measured kinematically through the reconstruction of the top quark’s decay products. The top quark decays via the weak interaction as a free particle, but the resulting bottom quark interacts with other particles produced in the collision and eventually emerges as a collimated “b-jet” of hadrons. Modelling this process and calibrating the jet measurement in the detector limit the precision of many top-quark mass measurements. However, 20% of b-jets contain a muon that carries information relating to the parent bottom quark. By combining this muon with an isolated lepton from a W boson originating from the same top-quark decay, ATLAS has made a new measurement of the top-quark mass with a much-reduced dependence on jet modelling and calibration. The result is ATLAS’s most precise individual top-quark mass measurement to date: 174.48 ± 0.78 GeV.
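At its core, the observable exploited here is the invariant mass of the isolated lepton from the W decay paired with the soft muon inside the b-jet; its distribution shifts with the top-quark mass. A minimal four-vector sketch (hypothetical momenta, massless approximation):

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass of two four-vectors given as (E, px, py, pz)."""
    e = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

# Hypothetical four-vectors in GeV (massless approximation, so E = |p|).
isolated_lepton = (50.0, 30.0, 40.0, 0.0)
soft_muon = (13.0, 3.0, 4.0, 12.0)

print(f"m(lepton, soft muon) = {invariant_mass(isolated_lepton, soft_muon):.1f} GeV")
```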

Higher order QCD diagrams translate this imbalance into the charge asymmetry

At the LHC, top and antitop quarks are not produced fully symmetrically with respect to the proton-beam direction, with top antiquarks produced slightly more often at large angles to the beam, and top quarks, which receive more momentum from the colliding proton, emerging closer to the axis. Higher order QCD diagrams translate this imbalance into the so-called charge asymmetry, which the SM predicts to be small (~0.6%), but which could be enhanced, or even suppressed, by new physics processes interfering with the known production modes. Using its full Run-2 data sample, ATLAS finds evidence of charge asymmetry in top-quark pair events with a significance of four standard deviations, confidently showing that the asymmetry is indeed non-zero. The measured charge asymmetry of 0.0060 ± 0.0015 is compatible with the latest SM predictions. ATLAS also measured the charge asymmetry versus the mass of the top–antitop system, further probing the SM (figure 2).
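For reference, the charge asymmetry quoted here is conventionally defined from the difference of the absolute rapidities of the top and antitop quarks (a standard definition, stated here for convenience):

\[
A_C = \frac{N(\Delta|y|>0) - N(\Delta|y|<0)}{N(\Delta|y|>0) + N(\Delta|y|<0)},
\qquad \Delta|y| = |y_t| - |y_{\bar{t}}| .
\]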

ALICE probes extreme electromagnetic fields

When two lead nuclei collide in the LHC at an energy of a few TeV per nucleon, an extremely strong magnetic field of the order of 10¹⁴–10¹⁵ T is generated by the spectator protons, which pass by the collision zone without breaking apart in inelastic collisions. This magnetic field, the strongest yet probed experimentally, and in particular the rate at which it decays, is interesting to study because it probes unexplored properties of the quark–gluon plasma (QGP), such as its electric conductivity. In addition, chiral phenomena such as the chiral magnetic effect are expected to be induced by the strong fields. Left–right asymmetry in the production of negatively and positively charged particles relative to the collision reaction plane is one of the observables directly sensitive to electromagnetic fields. This asymmetry, called directed flow (v1), is sensitive to two main competing effects: the Lorentz force experienced by charged particles (quarks) propagating in the magnetic field, and the Faraday effect – the quark current induced by the rapidly decreasing magnetic field. Charm quarks are produced in the early stages of heavy-ion collisions and are therefore more strongly affected by the electromagnetic fields than lighter quarks.
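For reference, the directed flow v1 is the first coefficient in the Fourier expansion of the particle azimuthal distribution with respect to the reaction plane (a standard definition, stated here for convenience):

\[
\frac{dN}{d\varphi} \propto 1 + 2\sum_{n\ge 1} v_n \cos\!\big[n(\varphi - \Psi_{\mathrm{RP}})\big],
\qquad v_1 = \langle \cos(\varphi - \Psi_{\mathrm{RP}}) \rangle .
\]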

An extremely strong magnetic field of the order of 10¹⁴–10¹⁵ T is generated

The ALICE collaboration has recently probed this effect by measuring the directed flow, v1, for charged hadrons and D⁰/D̄⁰ mesons as a function of pseudorapidity (η) in mid-central lead–lead collisions at √sNN = 5.02 TeV. Head-on (most central) collisions were excluded from the analyses because in those collisions there are very few spectator nucleons (almost all nucleons interact inelastically), which leads to a weaker magnetic field.

Figure: directed flow v1 versus pseudorapidity for charged hadrons (left) and D⁰/D̄⁰ mesons (right), with the differences Δv1 shown in the lower panels.

The top-left panel of the figure shows the η dependence of v1 for charged hadrons (centrality class 5–40%). The difference Δv1 between positively and negatively charged hadrons is shown in the bottom-left panel. The η slope is found to be dΔv1/dη = [1.68 ± 0.49 (stat) ± 0.41 (syst)] × 10⁻⁴ – positive at 2.6σ significance. This measurement has a similar order of magnitude to recent model calculations of the expected effect for charged pions, but with the opposite sign.

The right-hand panels show the same analysis for the neutral charmed mesons D⁰ (cū) and D̄⁰ (c̄u) (centrality class 10–40%). The measured directed flows are found to be about three orders of magnitude larger than for the charged hadrons, reflecting the stronger fields experienced immediately after the collision, when the charm quarks are created. The slopes, which are seen to be positive for D⁰ and negative for D̄⁰, are opposite to and larger than those in the model calculations. The slope of the difference in the directed flows is dΔv1/dη = [4.9 ± 1.7 (stat) ± 0.6 (syst)] × 10⁻¹ – positive at 2.7σ significance (lower-right panel). In this case too, the sign of the observed slope is opposite to that of the model calculations, suggesting that the relative contributions of the Lorentz and Faraday effects in those calculations are not correct.
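The quoted significances follow from dividing each fitted slope by its statistical and systematic uncertainties added in quadrature, which is easy to verify:

```python
import math

def slope_significance(slope, stat, syst):
    """Number of standard deviations by which a fitted slope differs from zero."""
    return slope / math.sqrt(stat ** 2 + syst ** 2)

# Values from the text (charged hadrons in units of 1e-4, D mesons in units of 1e-1).
print(f"charged hadrons: {slope_significance(1.68, 0.49, 0.41):.1f} sigma")  # ~2.6
print(f"D mesons:        {slope_significance(4.9, 1.7, 0.6):.1f} sigma")     # ~2.7
```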

Together with recent observations at RHIC, these LHC measurements provide an intriguing first sign of the effect of the large magnetic fields experienced in heavy-ion collisions on final-state particles. Measurements with larger data samples in Run 3 will have a precision sufficient to allow the contributions of the Lorentz force and the Faraday effect to be separated.

CMS goes scouting for dark photons

Fig. 1.

One of the best strategies for searching for new physics in the TeV regime is to look for the decays of new particles. The CMS collaboration has searched in the dilepton channel for particles with masses above a few hundred GeV since the start of LHC data taking. Thanks to newly developed triggers, the searches are now being extended to the more difficult lower range of masses. A promising possible addition to the Standard Model (SM) that could exist in this mass range is the dark photon (ZD). Its coupling with SM particles and production rate depend on the value of a kinetic mixing coefficient ε, and the resulting strength of the interaction of the ZD with ordinary matter may be several orders of magnitude weaker than the electroweak interaction.

The CMS collaboration has recently presented results of a search for a narrow resonance decaying to a pair of muons in the mass range from 11.5 to 200 GeV. This search looks for a strikingly sharp peak on top of a smooth dimuon mass spectrum that arises mainly from the Drell–Yan process. At masses below approximately 40 GeV, conventional triggers are the main limitation for this analysis as the thresholds on the muon transverse momenta (pT), which are applied online to reduce the rate of events saved for offline analysis, introduce a significant kinematic acceptance loss, as evident from the red curve in figure 1.

Fig. 2.

A dedicated set of high-rate dimuon “scouting” triggers, with some additional kinematic constraints on the dimuon system and significantly lower muon pT thresholds, was deployed during Run 2 to overcome this limitation. Only a minimal amount of high-level information from the online reconstruction is stored for the selected events. The reduced event size allows significantly higher trigger rates, up to two orders of magnitude higher than the standard muon triggers. The green curve in figure 1 shows the dimuon invariant mass distribution obtained from data collected with the scouting triggers. The increase in kinematic acceptance at low masses is clearly visible.
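The acceptance gain comes almost entirely from the lower muon pT thresholds: for a light resonance decaying to two soft muons, requiring both muons above a high pT threshold rejects most of the signal. A toy illustration of the effect (the pT spectra and thresholds below are invented for the sketch and are not the CMS selections):

```python
import numpy as np

rng = np.random.default_rng(1)

def dimuon_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass of two (approximately massless) muons from pt, eta, phi."""
    return np.sqrt(2.0 * pt1 * pt2 * (np.cosh(eta1 - eta2) - np.cos(phi1 - phi2)))

# Toy muon pairs with soft pT spectra (hypothetical shapes).
n = 100_000
pt1 = 3.0 + rng.exponential(8.0, n)
pt2 = 3.0 + rng.exponential(8.0, n)
eta1, eta2 = rng.uniform(-2.4, 2.4, (2, n))
phi1, phi2 = rng.uniform(-np.pi, np.pi, (2, n))

mass = dimuon_mass(pt1, eta1, phi1, pt2, eta2, phi2)
low_mass = mass < 40.0                          # region where standard triggers struggle

standard = (pt1 > 17) & (pt2 > 8) & low_mass    # hypothetical "standard" thresholds
scouting = (pt1 > 4) & (pt2 > 4) & low_mass     # hypothetical lower "scouting" thresholds

print(f"standard-trigger acceptance (m < 40 GeV): {standard.sum() / low_mass.sum():.2f}")
print(f"scouting-trigger acceptance (m < 40 GeV): {scouting.sum() / low_mass.sum():.2f}")
```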

The full data sets collected with the muon scouting and standard dimuon triggers during Run 2 are used to probe masses below 45 GeV, and between 45 and 200 GeV, respectively, excluding the mass range from 75 to 110 GeV where Z-boson production dominates. No significant resonant peaks are observed, and limits are set on ε² at 90% confidence as a function of the ZD mass (figure 2). These are among the world’s most stringent constraints on dark photons in this mass range.

Adapting to exascale computing

The CERN data centre in 2016

It is impossible to envisage high-energy physics without its foundation of microprocessor technology, software and distributed computing. Almost as soon as CERN was founded the first contract to provide a computer was signed, but it took manufacturer Ferranti more than two years to deliver “Mercury”, our first valve-based behemoth, in 1958. So early did this machine arrive that the venerable FORTRAN language had yet to be invented! A team of about 10 people was required for operations and the I/O system was already a bottleneck. It was not long before faster and more capable machines were available at the lab. By 1963, an IBM 7090 based on transistor technology was available with a FORTRAN compiler and tape storage. This machine could analyse 300,000 frames of spark-chamber data – a big early success. By the 1970s, computers were important enough that CERN hosted its first Computing and Data Handling School. It was clear that computers were here to stay.

By the time of the LEP era in the late 1980s, CERN hosted multiple large mainframes. Workstations, to be used by individuals or small teams, had become feasible. DEC VAX systems were a big step forward in power, reliability and usability and their operating system, VMS, is still talked of warmly by older colleagues in the field. Even more economical machines, personal computers (PCs), were also reaching a threshold of having enough computing power to be useful to physicists. Moore’s law, which predicted the doubling of transistor densities every two years, was well established and PCs were riding this technological wave. More transistors meant more capable computers, and every time transistors got smaller, clock speeds could be ramped up. It was a golden age where more advanced machines, running ever faster, gave us an exponential increase in computing power.

A simulated HL-LHC collision event

Key also to the computing revolution, alongside the hardware, was the growth of open-source software. The GNU project had produced many utilities on which hackers and coders could base their own software. With the start of the Linux project to provide a kernel, humble PCs became increasingly capable machines for scientific computing. Around the same time, Tim Berners-Lee’s proposal for the World Wide Web, which began as a tool for connecting information for CERN scientists, started to take off. CERN realised the value in releasing the web as an open standard and in doing so enabled a success that today connects almost the entire planet.

LHC computing

This interconnected world was one of the cornerstones of the computing that was envisaged for the Large Hadron Collider (LHC). Mainframes were not enough, nor were local clusters. What the LHC needed was a worldwide system of interconnected computing systems: the Worldwide LHC Computing Grid (WLCG). Not only would information need to be transferred, but huge amounts of data and millions of computer jobs would need to be moved and executed, all with a reliability that would support the LHC’s physics programme. A large investment in brand new grid technologies was undertaken, and software engineers and physicists in the experiments had to develop, deploy and operate a new grid system utterly unlike anything that had gone before. Despite rapid progress in computing power, storage space and networking, it was extremely hard to make a reliable, working distributed system for particle physics out of these pieces. Yet we achieved this incredible task. During the past decade, thousands of physics results from the four LHC experiments, including the Higgs-boson discovery, were enabled by the billions of jobs executed and the petabytes of data shipped around the world.

The software that was developed to support the LHC is equally impressive. The community had made a wholesale migration from the LEP FORTRAN era to C++ and millions of lines of code were developed. Huge software efforts in every experiment produced frameworks that managed data taking and reconstruction of raw events to analysis data. In simulation, the Geant4 toolkit enabled the experiments to begin data-taking at the LHC with a fantastic level of understanding of the extraordinarily complex detectors, enabling commissioning to take place at a remarkable rate. The common ROOT foundational libraries and analysis environment allowed physicists to process the billions of events that the LHC supplied and extract the physics from them successfully at previously unheard of scales.

Changes in the wider world

While physicists were busy preparing for the LHC, the web became a pervasive part of people’s lives. Internet superpowers like Google, Amazon and Facebook grew up as the LHC was being readied and this changed the position of particle physics in the computing landscape. Where particle physics had once been a leading player in software and hardware, enjoying good terms and some generous discounts, we found ourselves increasingly dwarfed by these other players. Our data volumes, while the biggest in science, didn’t look so large next to Google; the processing power we needed, more than we had ever used before, was small beside Amazon; and our data centres, though growing, were easily outstripped by Facebook.

Graph showing the speedup of ALICE TPC tracking on GPUs

Technology, too, started to shift. Since around 2005, Moore’s law, while still largely holding, has no longer been accompanied by increases in CPU clock speeds. Programs that ran in a serial mode on a single CPU core therefore started to become constrained in their performance. Instead, performance gains would come from concurrent execution on multiple threads or from using vectorised maths, rather than from faster cores. Experiments adapted by executing more tasks in parallel – from simply running more jobs at the same time to adopting multi-process and multi-threaded processing models. This post hoc parallelism was often extremely difficult because the code and frameworks written for the LHC had assumed a serial execution model.
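To give a flavour of the change in programming style, the sketch below computes the same per-track quantity first with a serial Python loop and then as a single vectorised numpy operation over the whole batch. This is only an illustration; production frameworks achieve the same effect with multi-threaded C++ and SIMD instructions.

```python
import math
import numpy as np

# Transverse momentum pT = sqrt(px^2 + py^2) for a batch of tracks (toy data).
rng = np.random.default_rng(0)
px = rng.normal(0.0, 10.0, 100_000)
py = rng.normal(0.0, 10.0, 100_000)

# Serial style: one track at a time (slow in pure Python, mirrors a per-event loop).
pt_serial = [math.sqrt(x * x + y * y) for x, y in zip(px, py)]

# Vectorised style: one call over the full array, exploiting optimised kernels.
pt_vectorised = np.hypot(px, py)

assert np.allclose(pt_serial, pt_vectorised)
```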

The barriers being discovered for CPUs also caused hardware engineers to rethink how to exploit CMOS technology for processors. The past decade has witnessed the rise of the graphics processing unit (GPU) as an alternative way to exploit transistors on silicon. GPUs run with a different execution model: much more of the silicon is devoted to floating-point calculations, and there are many more processing cores, but each core is smaller and less powerful than a CPU. To utilise such devices effectively, algorithms often have to be entirely rethought and data layouts have to be redesigned. Much of the convenient, but slow, abstraction power of C++ has to be given up in favour of more explicit code and simpler layouts. However, this rapid evolution poses other problems for code in the long term. There is no single way to program a GPU, and vendors’ toolkits are usually quite specific to their hardware.

It is both a challenge and an opportunity to work with new scientific partners in the era of exascale science

All of this would matter less if the LHC experiments were standing still, but nothing could be further from the truth. For Run 3 of the LHC, scheduled to start in 2021, the ALICE and LHCb collaborations are installing new detectors and preparing to take far more data than ever before. Hardware triggers are being dropped in favour of full software processing systems and continuous data processing. The high-luminosity upgrade of the LHC for Run 4, from 2026, will be accompanied by new detector systems for ATLAS and CMS, much higher trigger rates and greatly increased event complexity. All of this physics needs to be supported by a radical evolution of software and computing systems, in a more challenging sociological and technological environment. Nor will the LHC be the only big scientific player in the future: facilities such as DUNE, FAIR, SKA and LSST will come online and will have to handle as much data as CERN and the WLCG, if not more. That is both a challenge and an opportunity to work with new scientific partners in the era of exascale science.

There is one solution that we know will not work: simply scaling up the money spent on software and computing. We will need to live with flat budgets, so if the event rate of an experiment increases by a factor of 10 then we have a budget per event that just shrank by the same amount! Recognising this, the HEP Software Foundation (HSF) was invited by the WLCG in 2016 to produce a roadmap for how to evolve software and computing in the 2020s – resulting in a community white paper supported by hundreds of experts in many institutions worldwide (CERN Courier April 2018 p38). In parallel, CERN open lab – a public–private partnership through which CERN collaborates with leading ICT companies and other research organisations – published a white paper setting out specific challenges that are ripe for tackling through collaborative R&D projects with leading commercial partners.

Facing the data onslaught

Since the white paper was published, the HSF and the LHC-experiment collaborations have worked hard to tackle the challenges it lays out. Understanding how event generators can be best configured to get good physics at minimum cost is a major focus, while efforts to get simulation speed-ups from classical fast techniques, as well as new machine-learning approaches, have intensified. Reconstruction algorithms have been reworked to take advantage of GPUs and accelerators, and are being seriously considered for Run 3 by CMS and LHCb, while ALICE will make even more use of GPUs following their successful deployment in Run 2. In the analysis domain, the core of ROOT is being reworked to be faster and easier for analysts to work with. Much inspiration is taken from the Python ecosystem, using Jupyter notebooks and services like SWAN.
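As a taste of the newer, more declarative analysis style, ROOT’s RDataFrame lets a selection and a histogram be expressed as a short chain of operations, with the implicit event loop run in parallel behind the scenes. The sketch below loosely follows the style of ROOT’s public dimuon tutorial; the file, tree and branch names are hypothetical:

```python
import ROOT

ROOT.ROOT.EnableImplicitMT()  # run the implicit event loop on all available cores

# Hypothetical flat ntuple with NanoAOD-style muon branches.
df = ROOT.RDataFrame("Events", "dimuon_ntuple.root")

h = (df.Filter("nMuon == 2", "exactly two muons")
       .Define("dimuon_mass",
               "InvariantMass(Muon_pt, Muon_eta, Muon_phi, Muon_mass)")
       .Histo1D(("mass", "Dimuon mass;m_{#mu#mu} [GeV];Events", 200, 0.0, 200.0),
                "dimuon_mass"))

h.Draw()  # the event loop only runs here, when the result is actually needed
```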

Graeme Stewart

These developments are firmly rooted in the new distributed models of software development based on GitHub or GitLab, with worldwide development communities, hackathons and social coding. Open source is also vital, and all of the LHC experiments have now opened up their software. In the computing domain there is intense R&D into improving data management and access, and the ATLAS-developed Rucio data-management system is being adopted by a wide range of other HEP experiments and many astronomy communities. Many of these developments got a shot in the arm from the IRIS–HEP project in the US; other European initiatives, such as IRIS in the UK and the German IDT-UM project, are helping, though much more remains to be done.

All this sets us on a good path for the future, but still, the problems remain significant, the implementation of solutions is difficult and the level of uncertainty is high. Looking back to the first computers at CERN and then imagining the same stretch of time into the future, predictions are next to impossible. Disruptive technology, like quantum computing, might even entirely revolutionise the field. However, if there is one thing that we can be sure of, it’s that the next decades of software and computing at CERN will very likely be as interesting and surprising as the ones already passed. 
