Chasing charged-lepton-flavour violation

Diagram of charged-lepton-flavour violation

Processes where the flavour of charged leptons is not conserved are undetectably rare in the Standard Model (SM). For neutral leptons, flavour violation is known to occur in neutrino oscillations, but charged-lepton-flavour violation (CLFV) is so suppressed that, if observed, it would provide indisputable evidence of physics beyond the SM.

The LHCb collaboration recently reported the results of searches for two CLFV decays, B+→ K+μ±e∓ and B0(s)→ τ±μ∓, using 3 fb–1 of data collected in 2011 and 2012. The two decays provide complementary information as their final states involve charged leptons from different families, and both represent experimental challenges for LHCb. While the detector performance is excellent for muons, it is more difficult to reconstruct electrons and taus. The difficulty with electrons is related to energy losses via bremsstrahlung radiation. Meanwhile, the short-lived tau leptons are always reconstructed from their decay products, which include at least one neutrino, and thus part of the tau’s energy is unavoidably lost. In both cases, the analyses are able to recover some of the lost information and improve the resolution by exploiting constraints on the kinematics and topology of the decay.

Neither search found a signal (figure 1), but thanks to these reconstruction techniques and the large quantity of B-meson decays recorded by the detector, LHCb has established the most stringent upper limits on the branching fractions of these decays: 9.5 × 10–9 for B+→ K+μ–e+, 8.8 × 10–9 for B+→ K+μ+e–, 1.4 × 10–5 for B0→ τ±μ∓, and 4.2 × 10–5 for Bs0→ τ±μ∓ (all at the 95% confidence level). The latter is also the first ever limit on Bs0→ τ±μ∓.

Decays of B-mesons are particularly interesting in light of recent flavour anomalies

CLFV decays of B-mesons are particularly interesting in light of recent flavour anomalies, whereby LHCb found hints that the decay rates for b → sμ+μ– and b → se+e– are not equal (CERN Courier May/June 2019 p33). While the anomalies are most suggestive of the violation of lepton flavour universality, several proposed extensions to the SM that address them also predict CLFV, with branching ratios for B+→ K+μ±e∓ and B0(s)→ τ±μ∓ within LHCb’s reach. The latest LHCb results therefore impose strong new constraints on beyond-SM models. The analyses also open the door to further LHCb tests of CLFV by demonstrating the feasibility of searches for rare processes with final-state electrons and taus.

Between desert and swampland

Ferruccio Feruglio of INFN Padova

At the 22nd edition of the Planck conference series, which took place in Granada, Spain, from 3–7 June, 170 particle physicists and cosmologists discussed the latest developments in physics beyond the Standard Model (BSM) and in ultraviolet completions of the SM within theories that unify the fundamental interactions.

Several speakers addressed the serious model-building restrictions in supersymmetry and Higgs compositeness that are imposed by the negative results of direct searches for BSM particles at ATLAS and CMS. Particular emphasis was put on the (extended) Higgs sector of the SM, where precision measurements might detect signals of BSM physics. Updates from LHCb and Belle on the flavour anomalies were also eagerly discussed, with proposed explanations including leptoquarks and additional U(1) gauge symmetries with exotic vector-like quarks. However, not all were convinced that the results signal BSM physics. On the cosmological side, delegates learned of the latest attempts to build models of WIMPs, axions, magnetic relics and dark radiation, which also include mechanisms for baryogenesis and inflation in the early universe.

Given the absence of new BSM particles so far at the LHC, theorists talk of a “desert” between the weak and Planck scales containing nothing but SM particles. Several speakers reported that phase transitions between non-trivial Higgs vacua could lead to violent phenomena in the early universe that might be tested by future gravitational-wave detectors. Within the inflationary universe these phenomena might also lead to the production of primordial black holes that could explain dark matter.

Discussions of ultraviolet (i.e. high-energy) completions of the SM encompassed the grand unification of fundamental interactions, the origin of neutrino masses, flavour symmetries and the so-called “swampland conjectures”, which characterise theories that might not be compatible with a consistent theory of quantum gravity. One might therefore hope that healthy signals of BSM physics will appear somewhere between the desert and the swampland.

Planck 2020 will be held from 8–12 June in Durham, UK.

LEP’s electroweak leap

Trailblazing events

In the early 1970s the term “Standard Model” did not yet exist – physicists used “Weinberg–Salam model” instead. But the discovery of the weak neutral current in Gargamelle at CERN in 1973, followed by the prediction and observation of particles composed of charm quarks at Brookhaven and SLAC, quickly shifted the focus of particle physicists from the strong to the electroweak interactions – a sector in which trailblazing theoretical work had quietly taken place in the previous years. Plans for an electron–positron collider at CERN were soon born, with the machine first named LEP (Large Electron Positron collider) in a 1976 CERN yellow report authored by a distinguished study group featuring, among others, John Ellis, Burt Richter, Carlo Rubbia and Jack Steinberger.

LEP’s size – four times larger than anything before it – was dictated by the need to observe W-pair production, and to check that its cross section did not diverge as a function of energy. The phenomenology of the Z-boson’s decay was to come under similar scrutiny. At the time, the number of fermion families was unknown, and it was even possible that there were so many neutrino families that the Z lineshape would be washed out. LEP’s other physics targets included the possibility of producing Higgs bosons. The mass of the Higgs boson was completely unknown and could have been anywhere from around zero to 1 TeV.

The CERN Council approved LEP in October 1981 for centre-of-mass energies up to 110 GeV. It was a remarkable vote of confidence in the Standard Model (SM), given that the W and Z bosons had not yet been directly observed. A frantic period followed, with the ALEPH, DELPHI, L3 and OPAL detectors approved in 1983. Based on similar geometric principles, they included drift chambers or TPCs for the main trackers, BGO crystals, lead–glass or lead–gas sandwich electromagnetic calorimeters, and, in most cases, an instrumented return yoke for hadron calorimetry and muon filtering. The underground caverns were finished in 1988 and the detectors were in various stages of installation by the end of spring 1989, by which time the storage ring had been installed in the 27 km-circumference tunnel (see The greatest lepton collider).

Expedition to the Z pole

The first destination was the Z pole at an energy of around 90 GeV. Its location was then known to ±300 MeV from measurements of proton–antiproton collisions at Fermilab’s Tevatron. The priority was to establish the number of light neutrino families, a number that not only closely relates to the number of elementary fermions but also impacts the chemical composition and large-scale structure of the universe. By 1989 the existence of the νe, νμ and ντ neutrinos was well established. Several model-dependent measurements from astrophysics and collider physics at the time had pointed to the number of light active neutrinos (Nν) being less than five, but the SM could, in principle, accommodate any higher number.

The OPAL logbook entry for the first Z boson seen at LEP

The initial plan to measure Nν using the total width of the Z resonance was quickly discarded in favour of the visible peak cross section, where the effect was far more prominent and, to first approximation, insensitive to possible new detectable channels. The LEP experiments were therefore thrown in at the deep end, needing to make an absolute cross-section measurement with completely new detectors in an unfamiliar environment that demanded that the triggers, tracking, calorimetry and luminosity monitors all work and acquire data in synchronisation.
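
To see why the peak height is such a sensitive probe, here is a minimal sketch using the standard Z-lineshape relations (textbook formulae, quoted for orientation rather than taken from the analyses themselves): the hadronic peak cross section scales inversely with the square of the total width, which in turn grows with the number of light neutrino species,

\[
\sigma^{0}_{\mathrm{had}} = \frac{12\pi}{m_Z^{2}}\,\frac{\Gamma_{ee}\,\Gamma_{\mathrm{had}}}{\Gamma_Z^{2}},
\qquad
\Gamma_Z = \Gamma_{\mathrm{had}} + 3\,\Gamma_{\ell\ell} + N_\nu\,\Gamma_{\nu\nu} ,
\]

so every additional neutrino family lowers the visible peak. Equivalently, Nν follows from the measured invisible width as Nν = (Γinv/Γℓℓ)(Γℓℓ/Γνν)SM.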

On the evening of 13 August, during a first low-luminosity pilot run just one month after LEP achieved first turns, OPAL reported the first observation of a Z decay (see OPAL fruits). Each experiment quickly observed a handful more. The first Z-production run took place from 18 September to 9 October, with the four experiments accumulating about 3000 visible Z decays each. They took data at the Z peak and at 1 and 2 GeV either side, improving the precision on the Z mass and allowing a measurement of the peak cross section. The results, including those from the Mark II collaboration at the SLC, SLAC’s linear electron–positron collider, were published and presented in CERN’s overflowing main auditorium on 13 October.

After only three weeks of data taking and 10,000 Z decays, the number of neutrinos was found to be three. In the following years, some 17 million Z decays were accumulated, and cross-section measurement uncertainties fell to the per-mille level. And while the final LEP number – Nν = 2.9840 ± 0.0082 – may appear to be a needlessly precise measurement of the number three (figure 1a), it today serves as by far the best high-energy constraint on the unitarity of the neutrino mixing matrix. LEP’s stash of a million clean tau pairs from Z → τ+ τ– decays also allowed the universality of the lepton–neutrino couplings to the weak charged current to be tested with unprecedented precision. The present averages are still dominated by the LEP numbers: gτ/gμ = 1.0010 ± 0.0015 and gτ/ge = 1.0029 ± 0.0015.

Diagrams showing measurements at LEP

LEP continued to carry out Z-lineshape scans until 1991, and repeated them in 1993 and 1995. Two thirds of the total luminosity was recorded at the Z pole. As statistical uncertainties on the Z’s parameters went down, the experiments were challenged to control systematic uncertainties, especially in the experimental acceptance and luminosity. Monte Carlo modelling of fragmentation and hadronisation was gradually improved by tuning to measurements in data. On the luminosity front it soon became clear that dedicated monitors would be needed to measure small-angle Bhabha scattering (e+e– → e+e–), which proceeds at a much higher rate than Z production. The trick was to design a compact electromagnetic calorimeter with sufficient position resolution to define the geometric acceptance, and to compare this to calculations of the Bhabha cross section.
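
In essence – a standard relation stated here for clarity rather than quoted from the experiments – the integrated luminosity is obtained by counting Bhabha events inside a precisely defined angular acceptance and dividing by the calculated cross section within that same acceptance,

\[
\mathcal{L}_{\mathrm{int}} = \frac{N_{\mathrm{Bhabha}}}{\sigma_{\mathrm{Bhabha}}^{\mathrm{acc}}} ,
\]

which is why both the geometric precision of the luminometers and the theoretical precision of the Bhabha calculation fed directly into the final cross-section uncertainties.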

The final ingredient for LEP’s extraordinary precision was a detailed knowledge of the beam energy, which required the four experiments to work closely with accelerator experts. Curiously, the first energy calibration was performed in 1990 by circulating protons in the LEP ring – the first protons to orbit in what would eventually become the LHC tunnel, but at a meagre energy of 20 GeV. The speed of the protons was inferred by comparing the radio-frequency electric field needed to keep protons and electrons circulating at 20 GeV on the same orbit, allowing a measurement of the total magnetic bending field on which the beam energy depends. This gave a 20 MeV uncertainty on the Z mass. To reduce this to 1.7 MeV for the final Z-pole measurement, however, required the use of resonant depolarisation routinely during data taking. First achieved in 1991, this technique uses the natural transverse spin polarisation of the beams to yield an instantaneous measurement of the beam energy to a precision of ±0.1 MeV – so precise that it revealed minute effects caused, for example, by Earth’s tides and the passage of local trains (see Tidal forces, melting ice and the TGV to Paris). The final precision was more than 10 times better than had been anticipated in pre-LEP studies.
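
The power of resonant depolarisation rests on a standard accelerator-physics relation (given here for orientation, not spelled out in the article): the spin tune νs – the number of spin precessions per turn – is fixed by the beam energy through the electron’s magnetic anomaly ae,

\[
\nu_s = a_e\,\frac{E_{\mathrm{beam}}}{m_e c^{2}}
\quad\Longrightarrow\quad
E_{\mathrm{beam}}\,[\mathrm{GeV}] \simeq 0.4406486\,\nu_s ,
\]

so locating the depolarising resonance frequency to a few parts in 10^6 pins a 45 GeV beam energy down to the quoted ±0.1 MeV.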

Electroweak working group

The LEP electroweak working group saw the ALEPH, DELPHI, L3 and OPAL collaborations work closely on combined cross-section and other key measurements – in particular the forward-backward asymmetry in lepton and b-quark production – at each energy point. By 1994, results from the SLD collaboration at SLAC were also included. Detailed negotiations were sometimes needed to agree on a common treatment of statistical correlations and systematic uncertainties, setting a precedent for future inter-experiment cooperation. Many tests of the SM were performed, including tests of lepton universality (figure 1b), adding to the tau lepton results already mentioned. Analyses also demonstrated that the couplings of leptons and quarks are consistent with the SM predictions.

The combined electroweak measurements were used to make stunning predictions of the top-quark and Higgs-boson masses, mt and mH. After the 1993 Z-pole scan, the LEP experiments were able to produce a combined measurement of the Z width with a precision of 3 MeV in time for the 1994 winter conferences, allowing the prediction mt = 177 ± 13 ± 19 GeV where the first error is experimental and the second is due to mH not being known. A month later the CDF collaboration at the Tevatron announced the possible existence of a top quark with a mass of 176 ± 16 GeV. Both CDF and its companion experiment D0 reached 5σ “discovery” significance a year later. It is a measure of the complexity of the Z-boson analyses (in particular the beam-energy measurement) that the final Z-pole results were published a full 11 years later, constraining the Higgs mass to be less than 285 GeV at 95% confidence level (figure 1c), with a best fit at 129 GeV.

From QCD to the W boson

LEP’s fame in the field tends to concern its electroweak breakthroughs. But, with several million recorded hadronic Z decays, the LEP experiments also made big advances in quantum chromodynamics (QCD). These results significantly increased knowledge of hadron production and quark and gluon dynamics, and drove theoretical and experimental methods that are still used extensively today. LEP’s advantage as a lepton collider was to have an initial state that was independent of nucleon structure functions, allowing the measurement of a single, energy-scale-dependent coupling constant. The strong coupling constant αs was determined to be 0.1195 ± 0.0034 at the Z pole, and to vary with energy – the highlight of LEP’s QCD measurements. This so-called running of αs was verified over a large energy range, from the tau mass up to 206 GeV, yielding additional experimental confirmation of QCD’s core property of asymptotic freedom (figure 2a).

Diagrams showing LEP results

Many other important QCD measurements were performed, such as the gluon self-coupling, studies of differences between quark and gluon jets, verification of the running b-quark mass, studies of hadronisation models, measurements of Bose–Einstein correlations and detailed studies of hadronic systems in two-photon scattering processes. The full set of measurements established QCD as a consistent theory that accurately describes the phenomenology of the strong interaction.

Following successful Z operations during the “LEP1” phase in 1989–1995, a second LEP era devoted to accurate studies of W-boson pair production at centre-of-mass energies above 160 GeV got under way. Away from the Z resonance, the electron–positron annihilation cross section decreases sharply; as the centre-of-mass energy reaches twice the W-boson mass and then twice the Z-boson mass, the WW and then ZZ production channels open up (figure 2b). Accessing the WW threshold required the development of superconducting radio-frequency cavities, the first of which were already installed in 1994, and they enabled a gradual increase in the centre-of-mass energy up to a maximum of 209 GeV in 2000.

The “LEP2” phase allowed the experiments to perform a signature analysis that dated back to the first conception of the machine: the measurement of the WW production cross section. Would it diverge or would electroweak diagrams interfere to suppress it? The precise measurement of the WW cross section as a function of the centre-of-mass energy was a very important test of the SM since it showed that the sum and interference of three four-fermion processes were indeed at work in WW production: the t-channel ν exchange, and the s-channel γ and Z exchange (figure 2c). LEP data proved that the γWW and ZWW triple gauge vertices are indeed present and interfere destructively with the t-channel diagram, suppressing the cross section and stopping it from diverging.

The second key LEP2 electroweak measurement was of the mass and total decay width of the W boson, which were determined by directly reconstructing the decay products of the two W bosons in the fully hadronic (W+W– → qqqq) and semi-leptonic (W+W– → qqℓν) decay channels. The combined LEP W-mass measurement from direct reconstruction data alone is 80.375 ± 0.025(stat) ± 0.022(syst) GeV, with the largest contribution to the systematic uncertainty originating from fragmentation and hadronisation modelling. The relation between the Z-pole observables, mt and mW, provides a stringent test of the SM and constrains the Higgs mass.

To the Higgs and beyond

Before LEP started, the mass of the Higgs boson was basically unknown. In the simplest version of the SM, involving a single Higgs boson, the only robust constraints were its non-observation in nuclear decays (forbidding masses below 14 MeV) and the need to maintain a sensible, calculable theory (ruling out masses above 1 TeV). In 1990, soon after the first LEP data-taking period, the full Higgs-boson mass range below 24 GeV was excluded at 95% confidence level by the LEP experiments. Above this mass the main decay of the Higgs boson, occurring 80% of the time, was predicted to be into b quark–antiquark pairs, followed by pairs of tau leptons, charm quarks or gluons, while the WW* decay mode starts to contribute at the maximum reachable masses of approximately 115 GeV. The main production process is Higgs-strahlung, whereby a Higgs is emitted by a virtual Z boson.
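
That reach follows from simple kinematics (a back-of-the-envelope relation added here for clarity): in Higgs-strahlung, e+e– → HZ, the Higgs and an on-shell Z must share the available collision energy,

\[
m_H \lesssim \sqrt{s} - m_Z \approx 209 - 91 \approx 118\ \mathrm{GeV} ,
\]

so LEP2’s maximum centre-of-mass energy of 209 GeV translated into sensitivity to Higgs masses of roughly 115 GeV once realistic production rates near the kinematic limit are taken into account.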

The combined electroweak measurements were used to make stunning predictions of the top quark and Higgs boson masses

During the full lifetime of LEP, the four experiments kept searching for neutral and charged Higgs bosons in several models, and exclusion limits continued to improve. In its last year of data taking, when the centre-of-mass energy reached 209 GeV, ALEPH reported an excess of four-jet events. It was consistent with a 114 GeV Higgs boson and had a significance that varied as the data were accumulated, peaking at around 3.9 standard deviations. The other three experiments carefully scrutinised their data to confirm or disprove ALEPH’s suggestion, but none observed any long-lasting excess in that mass region. Following many discussions, the LEP run was extended until 8 November 2000. However, it was decided not to keep running the following year so as not to impact the LHC schedule. The final LEP-wide combination excluded, at 95% confidence level, a SM Higgs boson with mass below 114.4 GeV.

The four LEP experiments carried out many other searches for novel physics that set limits on the existence of new particles. Notable cases are the searches for additional Higgs bosons in two-Higgs-doublet models and their minimal supersymmetric incarnation. Neutral scalar and pseudoscalar Higgs bosons lighter than the Z boson, and charged Higgs bosons up to the kinematic limit of their pair production, were also excluded. Supersymmetric particles suffered a similar fate, under the theoretically attractive assumption of R-parity conservation. The existence of sleptons and charginos was excluded in most of the parameter space for masses below 70–100 GeV, near the kinematic limit for their pair production. Neutralinos with masses below approximately half the Z-boson mass were also excluded in a large part of the parameter space. The LEP exclusions for several of these electroweak-produced supersymmetric particles are still the most stringent and most model-independent limits ever obtained.

It is very hard to remember how little we knew before LEP and the giant step that LEP made. It was often said that LEP discovered electroweak radiative corrections at the level of 5σ, opening up a precision era in particle physics that continues to set the standard today and offer guidance on the elusive new physics beyond the SM.

The greatest lepton collider

A quadrupole next to one of the long dipole magnets in LEP

A few minutes before midnight on a summer’s evening in July 1989, 30 or so people were crammed into a back room at CERN’s Prévessin site in the French countryside. After years of painstaking design and construction, we were charged with breathing life into the largest particle accelerator ever built. The ring was complete, the aperture finally clear and the positron beam made a full turn on our first attempt. Minutes later beams were circulating, and a month later the first Z boson event was observed. Here began a remarkable journey that firmly established the still indefatigable Standard Model of particle physics.

So, what can go wrong when you’re operating 27 kilometres of particle accelerator, with ultra-relativistic leptons whizzing around the ring 11,250 times a second? The list is long. The LEP ring was packed with magnets, power converters, a vacuum system, a cryogenics system, a cooling and ventilation system, beam instrumentation – and much more. Then there was the control system, fibres, networks, routers, gateways, software, databases, separators, kickers, beam dump, radio-frequency (RF) cavities, klystrons, high-voltage systems, interlocks, synchronisation, timing, feedback… And, of course, the experiments, the experimenters and everybody’s ability to get along in a high-pressure environment.

LEP wasn’t the only game in town. There was fierce competition from the more innovative Stanford Linear Collider (SLC) in California. But LEP was off to a fantastic start and its luminosity increase was much faster than at its relatively untested linear counterpart. A short article capturing the transatlantic rivalry appeared in the Economist on 19 August 1989. “The results from California are impressive,” the magazine reported, “especially as they come from a new and unique type of machine. They may provide a sure answer to the generation problem before LEP does. This explains the haste with which the finishing touches have been applied to LEP. The 27 km-long device, six years in the making, was transformed from inert hardware to working machine in just four weeks – a prodigious feat, unthinkable anywhere but at CERN. Even so, it was still not as quick as Carlo Rubbia, CERN’s domineering director-general, might have liked.”

Notes from the underground

LEP’s design dates from the late 1970s, the project being led by accelerator-theory group leader Eberhard Keil, RF group leader Wolfgang Schnell and C J “Kees” Zilverschoon. The first decision to be made was the circumference of the tunnel, with four options on the table: a 30 km ring that went deep into the Jura mountains, a 22 km ring that avoided them entirely, and two variants with a length of 26.7 km that grazed the outskirts of the mountains. The then director-general, Herwig Schopper, decided on a circumference of 26.7 km with an eye on a future proton collider, for which it would be “decisive to have as large a tunnel as possible” (CERN Courier July/August 2019 p39). The final design was approved on 30 October 1981 with Emilio Picasso leading the project. Construction of the tunnel started in 1983, after a standard public enquiry in France.

Blasting the LEP tunnel under the Jura mountains

LEP’s tunnel, the longest-ever attempted prior to the Channel Tunnel, which links France and Britain, was carved by three tunnel-boring machines. Disaster struck just two kilometres into the three-kilometre stretch of tunnel in the foothills of the Jura, where the rock had to be blasted because it was not suitable for boring. Water burst in and formed an underground river that took six months to eliminate (figure 1). By June 1987, however, part of the tunnel was complete and ready for the accelerator to be installed.

Just five months after the difficult excavation under the Jura, one eighth of the accelerator (octant 8) had been completely installed, and, a few minutes before midnight on 12 July 1988, four bunches of positrons made the first successful journey from the town of Meyrin in Switzerland (point 1) to the village of Sergy in France (point 2), a distance of 2.5 km. Crucially, the “octant test” revealed a significant betatron coupling between the transverse planes: a thin magnetised nickel layer inside the vacuum chambers was causing interference between the horizontal and vertical focusing of the beams. The quadrupole magnets were adjusted to prevent a resonant reinforcement of the effect each turn, and the nickel was eventually demagnetised.

Giving birth to LEP

The following months saw a huge effort to install equipment in the remaining 24 km of the tunnel – magnets, vacuum chambers and RF cavities, as well as beam instrumentation, injection equipment, electrostatic separators, electrical cabling, water cooling, ventilation and all the rest. This was followed by conditioning the cavities, baking out and leak-testing the vacuum chambers, and individual testing of each system. At the same time, a great deal of effort went into preparing the software needed to operate the collider, with limited resources.

In the late 1980s, control systems for accelerators were going through a major transition to the PC. LEP was caught up in the mess and there were many differences of opinion on how to design LEP’s control system. As July 1989 approached, the control system was not ready and a small team was recruited to implement the bare minimum controls required to inject beam and ramp up the energy. Unable to hone key parameters such as the tune and orbit corrections before beam was injected, we had two major concerns: is the beam aperture clear of all obstacles, and are there any polarity errors in the connections of the many thousands of magnetic elements? So we nominated a “Mr Polarity”, whose job was to check all polarities in the ring. This may sound trivial, but with thousands of connections it was a huge task.

Tidal forces, melting ice and the TGV to Paris

Diagrams showing variations in LEP’s beam-energy resolution

LEP’s beam-energy resolution was so precise that it was possible to observe distortion of the 27 km ring by a single millimetre, whether due to the tidal forces of the Sun and Moon, or the seasonal distortion caused by rain and meltwater from the nearby mountains filling up Lac Léman and weighing down one side of the ring. In 1993 we noticed even more peculiar random variations on the energy signal during the day – with the exception of a few hours in the middle of the night when the signal was noise free. Everybody had their own pet theory. I believed it was some sort of effect coming from planes interacting with the electrical supply cables. Some nights later I could be seen sitting in a car park on the Jura at 2 a.m., trying to prove my theory with visual observations, but it was very dark and all the planes had stopped landing several hours beforehand. Experiment inconclusive! The real culprit, the TGV (a high-speed train), was discovered by accident a few weeks later during a discussion with a railway engineer: leakage currents on the French rail track flowed through the LEP vacuum chamber with the return path via the Versoix river back to Cornavin. The noise hadn’t been evident when we first measured the beam energy as TGV workers had been on strike.

At a quarter to midnight on 14 July 1989, the aperture was free of obstacles and the beam made its first turn on our first attempt. Soon afterwards we managed to achieve a circulating beam, and we were ready to fine tune the multitude of parameters needed to prepare the beams for physics.

The goal for the first phase of LEP was electron–positron collisions at a total energy of 91 GeV – the mass of the neutral carrier of the weak force, the Z boson. LEP was to be a true Z factory, delivering millions of Zs for precision tests of the Standard Model. To mass-produce them required beams not only of high energy but also of high intensity, and delivering them required four steps. The first was to accumulate the highest possible beam current at 20 GeV – the injection energy. This was a major operation in itself, involving LEP’s purpose-built injection linac and electron–positron accumulator, the Proton Synchrotron, the Super Proton Synchrotron (SPS) and, finally, transfer lines to inject electrons and positrons in opposite directions – these curved not only horizontally but also vertically as LEP and the SPS were at different heights. The second step was to ramp up the accumulated current to the energy of the Z resonance with minimal losses. Thirdly, the beam had to be “squeezed” to improve the collision rate at the interaction regions by changing the focusing of the quadrupoles on either side of the experiments, thereby reducing the transverse cross section of the beam at the collision points. The final step was to bring the two beams into collision at the heart of the four experiments.

Following the highly successful first turn on 14 July 1989, we spent the next month preparing for the first physics run. Exactly a month later, on 13 August, the beams collided for the first time. The following 10 minutes seemed like an eternity since none of the four experiments – ALEPH, DELPHI, L3 and OPAL – reported any events. I was in the control room with Emilio Picasso and we were beginning to doubt that the beams were actually colliding when Aldo Michelini called from OPAL with the long-awaited comment: “We have the first Z0!” ALEPH and OPAL physicists had connected the Z signal to a bell that sounded on the arrival of the particle in their detectors. While OPAL’s bell rang proudly, ALEPH’s was silent, leading to a barrage of complaints before it became apparent that they were waiting for the collimators to close before turning on their subdetectors. As the luminosity rose during the subsequent period of machine studies the bells became extremely annoying and were switched off.

From the Z pole to the WW threshold

Physicists in front of the final superconducting RF-cavity module to be installed

The first physics run began on 20 September 1989, with LEP’s total energy tuned for five days to the Z mass peak at 91 GeV, providing enough integrated luminosity to generate 1400 Zs in each experiment. A second period followed, this time with the energy scanned through the width of the Z at five different centre-of-mass energies: at the peak and at 1 and 2 GeV either side, allowing the experiments to measure the width of the Z resonance. First physics results were announced on 13 October, just three months after the final testing of the accelerator’s components (see LEP’s electroweak leap).

LEP dwelt at the Z peak from 1989 to 1995, during which time the four experiments each observed approximately 4.5 million Z decays. In 1995 a major upgrade dubbed LEP2 saw the installation of 288 superconducting cavities (figure 2), enabling LEP to sit at or near the WW threshold of 161 GeV for the following five years. The maximum beam energy reached was 104.4 GeV. There was also a continuous effort to increase the luminosity by increasing the number of bunches, reducing the emittance by adjusting the focusing, and squeezing the bunches more tightly at the interaction points, with LEP’s performance ultimately limited by the nonlinear forces of the beam–beam interaction – the perturbations of the beams as they cross the opposing beam. LEP surpassed every one of its design parameters (figure 3).
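
A rough guide to why those particular knobs matter is the standard luminosity relation for head-on collisions of equal Gaussian beams (a textbook formula, not taken from the LEP reports):

\[
\mathcal{L} = \frac{k_b\,N^{2}\,f_{\mathrm{rev}}}{4\pi\,\sigma_x^{*}\,\sigma_y^{*}} ,
\]

where kb is the number of bunches per beam, N the number of particles per bunch, frev the revolution frequency and σ*x,y the transverse beam sizes at the collision point. More bunches, higher bunch currents, smaller emittance and a tighter squeeze all raise the luminosity – until the beam–beam interaction sets the limit.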

Life as a LEP accelerator physicist

Being an accelerator physicist at LEP took heart as well as brains. The Sisyphean daily task of coaxing the seemingly temperamental machine to optimal performance even led us to develop an emotional attachment to it. Challenges were unpredictable, such as for the engineers dispatched on a fact-finding mission to ascertain the cause of an electrical short circuit, only to discover two deer, “Romeo and Juliet”, locked in a lovers’ embrace having bitten through a cable, or the discovery of sabotage with beer bottles (see The bizarre episode of the bottles in the beampipe). The aim, however, was clear: inject as much current as possible into both beams, ramp the energy up to 45 GeV, squeeze the beam size down at the collision points, collide and then spend a few hours delivering events to the experiments. The reality was hours of furious concentration, optimisation, and, in the early days, frustrating disappointment.

Diagram showing LEP’s integrated luminosity

In the early years, filling LEP was a delicate hour-long process of parameter adjustment, tweaking and coaxing the beam into the machine. On a good day we would see the beam wobble alarmingly on the UV telescopes, lose a bit and watch the rest struggle up the ramp. On a bad day, futile attempt after futile attempt, most of the beam would disappear without warning in the first few seconds of the ramp. The process used to last minutes and there was nothing you could do. We would stand there, watching the lifetime buck and dip, and the painstakingly injected beam would either slowly or quickly drift out of the machine. The price of failure was a turnaround and refill. Success brought the opportunity to chance the squeeze – an equally hazardous manoeuvre whereby the interaction-point focusing magnets were adjusted to reduce the beam size – and then perhaps a physics fill, and a period of relative calm. At this stage the focus would move to the experimental particle physicists on shift at the four experiments. Each had their own particular collective character, and their own way of dealing with us. We veered between being accommodating, belligerent, maverick, dedicated, professional and very occasionally hopelessly amateur – sometimes all within the span of a single shift, depending on the attendant pressures.

Table showing LEP

The experiment teams paraded their operational efficiency numbers – plus complaints or congratulations – at twice-weekly scheduling meetings. Well run and disciplined, ALEPH almost always had the highest efficiency figures; their appearances at scheduling meetings were nearly always a simple statement of 97.8% or thereabouts. This was enlivened in later years by the repeated appearance of their coordinator Bolek Pietrzyk, who congratulated us each time we stepped up in energy or luminosity with a strong, Polish-accented, “Congratulations! You have achieved the highest energy electron–positron collisions in the universe!”, which was always gratifying. Equally professional, but more relaxed, was OPAL, which had a strong British and German contingent. These guys understood human nature. Quite simply, they bribed us. Every time we passed a luminosity target or hit a new energy record they’d turn up in the control room with champagne or crates of German beer. Naturally we’d do anything for them, happily moving heaven and earth to resolve their problems. L3 and DELPHI had their own quirks. DELPHI, for example, ran their detector as a “state machine”, whose status changed automatically based on signals from the accelerator control room. All well and good, but they depended on us to change the mode to “dump beam” at the end of a fill, something that was occasionally skipped, leaving DELPHI’s subdetectors on and them ringing us desperately for a mode change. Baffled DELPHI students on shift would ask what was going on. Filling and ramping were demanding periods during the operational sequence and a lot of concentration was required. The experiment teams did well not to ring and make too many demands at this stage – requests were occasionally rebuffed with a brusque response.

On the verge of a great discovery?

LEP’s days were never fated to dwindle. Early on, CERN had a plan to install the LHC in the same tunnel, in a bid to scan ever higher energies and be the first to discover the Higgs boson. However, on 14 June 2000, in LEP’s final year of scheduled running, the ALEPH experiment reported a possible Higgs event during operations at a centre-of-mass energy of 206.7 GeV. It was consistent with “Higgs-strahlung”, whereby a Z radiates a Higgs boson, which was expected to dominate Higgs-boson production in e+e– collisions at LEP2 energies. On 31 July and 21 August ALEPH reported second and third events corresponding to a putative reconstructed Higgs mass in the range 114–115 GeV.

The bizarre episode of the bottles in the beampipe

The bottles in the beampipe

The story of the sabotage of LEP has grown in the retelling, but I was there in June 1996, hurrying back early from a conference to help the machine operators, who had been struggling to circulate a beam for several days. After exhausting other possibilities, it became clear that there was an obstruction in the vacuum pipe, and we detected the location using the beam position system. It appeared to be around point 1 (where ATLAS now sits), so we opened the vacuum seal and took a look inside the beampipe using mirrors and endoscopes. Not seeing anything, I frustratedly squeezed my head between the vacuum flanges and peered down inside the pipe. In the distance was something resembling a green concave lens. “This looks like the bottom of a beer bottle,” I thought, restraining myself from uttering a word to anyone in the vicinity. I went to the opposite open end of the vacuum section and peered into the vacuum pipe again: a green circular disk this time, but again, not a word. Someone got a long pole to poke out the offending article – out it came, and my guess was correct: it was a Heineken beer bottle, which had indeed refreshed the parts no other beer could reach, as the slogan ran. A hasty search revealed a second bottle. Upon closer inspection it was clear that the control room operators had almost succeeded in making the beam circulate despite the obstacles: there was a scorch burn along the label, indicating that they had almost managed to steer the beam past the bottles. If there had only been one they may have succeeded. The Swiss police interviewed me concerning this act of sabotage, but the culprit was never unmasked.

LEP was scheduled to stop in mid-September with two weeks of reserve time granted to the LEP experiments to see if new Higgs-like events would appear. After the reserve weeks, ALEPH requested two months more running to double its integrated luminosity. One was granted, yielding a 50% increase in the accumulated data, and ALEPH presented an update of their results on 10 October: the signal excess had increased to 2.6σ. Things were really heating up, and on 16 October L3 announced a missing-energy candidate. By now the accelerator team was pushing LEP to its limits, to squeeze out every ounce of physics data in the service of the experiments’ search for the elusive Higgs. At the LEP committee meeting on 3 November, ALEPH presented new data that confirmed their excess once again – it had now grown to 2.9σ. A request to extend LEP running by one year was made to the LEPC. There was gridlock, and no unanimous recommendation could be made.

All of CERN was discussing the proposed running of LEP in 2001 to get final evidence of a possible discovery of the Higgs boson. Arguments against included delays to the start of the LHC of up to three years. There was also concern that Fermilab’s Tevatron would beat the LHC to the discovery of the Higgs, and mundane but practical arguments about the transfer of human resources to the LHC and the impact on the materials budget, including electricity costs. The impending closure of LEP, when many of us thought we were about to discover the Higgs, was perceived by most of the LEP-ers as the death of a dear friend. After each of the public debates on the subject a group of us would meet in some local pub, drink a few beers, curse the disbelievers and cry on each other’s shoulders. This was the only “civil war” that I saw in my 43 years at CERN.

LEP’s final moments before being decommissioned and replaced by the LHC

The CERN research board met again on 7 November and again there was deadlock, with the vote split eight votes to eight. The next day, the then director-general Luciano Maiani announced that LEP had closed for the last time. It was a deeply unpopular decision, but history has shown it to be correct: the Higgs was discovered at the LHC 12 years later, with a mass of not 115 but 125 GeV. LEP’s closure allowed a massive redeployment of skilled staff, and the experience gained in running such a large accelerator for the first time went on to prove essential to the safe and efficient operation of the LHC.

When LEP was finally laid to rest we met one last time for an official wake (figure 4). After the machine was dismantled, requiring the removal to the surface of around 30,000 tonnes of material, some of the magnets and RF units were shipped to other labs for use in new projects. Today, LEP’s concrete magnet casings can still be seen scattered around CERN as shielding units for antimatter and fixed-target experiments, and even as road barriers.

LEP was the highest-energy e+e– collider ever built. Its legacy was and is extremely important for present and future colliders. The quality and precision of its physics data – in luminosity, energy and energy calibration – remain unsurpassed. It is the reference for any future e+e– ring-collider design.

Tunnelling for physics

New service tunnels

In 2012 the CERN management asked a question: what is the largest circular machine that could be feasibly constructed in the Geneva region from a civil-engineering perspective? Teams quickly embarked on an extensive investigation of the geological, environmental and technical constraints in pursuit of the world’s largest accelerator. Such a machine would be the next logical step in exploring the universe at ever smaller scales.

Since construction of the 27 km-circumference Large Hadron Collider (LHC) was completed in 2005, CERN has been looking at potential layouts for the tunnels that will house the next generation of particle accelerators. The Compact Linear Collider (CLIC) and the Future Circular Collider (FCC) are the two largest projects under consideration. With a circumference of 100 km, the FCC would require one of the world’s longest tunnels – almost twice as long as the recently completed 57 km Gotthard Base Tunnel in the Swiss Alps. Designing large infrastructure like the FCC tunnel requires the collection and interpretation of numerous data, which have to be balanced to find the optimum compromise between risk, cost and project requirements.

The first and most important task in designing tunnels is to understand the needs and requirements of the users. For road or rail tunnels, this is relatively straightforward. For a cutting-edge scientific experiment, multi-disciplinary working groups are needed to identify the key criteria. The diameter of a new tunnel depends on what components would be inside – ventilation systems, magnets, lighting, transport corridors, etc – so they can fit in like a jigsaw.

Bespoke designs

Unlike other tunnelling projects, there are no standard rules or guidance for the design of particle-accelerator tunnels, meaning each design is, to a large extent, bespoke. One reason for this is the sensitivity of the equipment inside. Digging a 5.6 m-diameter hole disturbs rock that has been there for millennia, causing it to relax and to move. Modern tunnelling techniques can control these movements and get a tunnel to within a few centimetres of its intended design. For example, the two ends of the 27 km LEP ring came together with just 1 cm of error. It would be impossible to achieve the nanometre-level tolerances that the beamline requires, so the sensitive equipment installed in a completed accelerator tunnel must incorporate adjustable alignment systems into its design.

The scale of the proposed CLIC and FCC projects

The city of Geneva sits on a large plateau between the Jura and Prealps mountains. The bedrock of the plateau is a competent (resistant to deformation) sedimentary rock, called molasse, which formed when eroded material was deposited and consolidated in a basin as the Alps lifted up. On top of the molasse sits a softer soil, called the moraines, which is made up of more recent, unconsolidated glacial deposits. The Jura itself is made of limestone rock, which, while competent, is soluble and can form a network of underground voids, known as karsts.

We can never fully understand the ground before we start tunnelling and there is always the risk of encountering something unexpected, such as water, faults or obstructions. These cost money to overcome and/or delay the project; in the worst cases, they may even cause the tunnel to collapse. To help mitigate these risks and provide technical information for the tunnel design, we investigate the ground in the early stages of the project by drilling boreholes and testing ground samples. Like most things in civil engineering, however, there is a balance between the cost of the investigations versus the risks they mitigate. No boreholes have been sunk specifically for FCC yet, but we have access to a substantial amount of data from the LHC and from the Swiss and French authorities.

The answer to CERN’s question in 2012 was that a (quasi-)circular tunnel up to 100 km long could be built near Geneva (figure 1). This will be confirmed with further site investigations to verify the design assumptions and optimise a layout for the new machine. The FCC study considers two potential high-energy accelerators: hadron–hadron and electron–positron, and the FCC would consist of a series of arcs and straight sections (figure 2). Depending on the choice of a future collider, civil-engineering designs for FCC and/or CLIC will need to be developed further. Although the challenges between the two studies differ, the processes and tools used will be similar.

Optimising the alignment

Having determined the FCC’s feasibility, CERN’s civil engineers started designing the optimal route of the tunnel. Geology and topography are the key constraints on the tunnel position. Two alignment options were under consideration in 2012, both 80 km long, one located under the Jura Mountains and the other in the Geneva basin. When the FCC study officially kicked off in 2014, they were reviewed alongside a 47 km-circumference option fully excavated in the molasse.

Diagram of the FCC

Experience of tunnelling through Jura limestone during construction of the Large Electron Positron collider (LEP; from which the LHC inherited many of its tunnels) convinced civil engineers to discard the Jura option. Mining through the karstic limestone caused several delays and costly repairs after water and sediment flowed into the tunnel (see The greatest lepton collider). To this day, intensive maintenance works are needed between sectors 3 and 4 of the LHC tunnel and this has led to machine shutdowns lasting as long as two weeks.

By 2016, the proposed length of the FCC had increased to between 80 and 100 km to achieve higher energies, with two alignments under consideration: intersecting (which crosses the LHC in plan view) and non-intersecting. The former is the current baseline design. The tunnel is located primarily in the competent molasse rock and avoids the problematic Jura limestone and the Prealps. However, it does pass through the Mandallaz limestone formation and also has to cross under Lake Geneva. To deal with the wealth of topographical, geological and environmental data relevant for a 100 km ring, CERN embarked on the development of an innovative tunnel optimisation tool (TOT) that would let us assess a multitude of alignment options in a fraction of the time (see CERN’s tunnel optimisation tool).

CERN’s tunnel optimisation tool

The tunnel optimisation tool

In 2014, with the help of UK-based engineering consultancy Arup, CERN developed the tunnel optimisation tool (TOT) to integrate project requirements and data into a geospatial model. The web-based tool allows the user to digitally move the FCC tunnel, change its size, shape and depth and see, in real time, the impacts of the changes on the design. Geology, surface constraints and environmentally protected areas are visualised, and parameters such as plane inclinations and tunnel depth can be changed at the click of a mouse. The tool warns users if certain limits are exceeded or obstacles are encountered – for example, if a shaft is in the middle of Lake Geneva! When it was built, TOT was the first of its kind within the industry. It has cut the cost of the civil-engineering design and has provided us with the flexibility to meet changing requirements to ultimately deliver a better project. The success of TOT led to its replication for CLIC and the International Linear Collider (ILC) under consideration in Japan. Recently, a TOT was built by Arup to quickly and cheaply assess a range of alignments for a 3 km tunnel under the ancient Stonehenge heritage site in the UK.
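
As a purely illustrative sketch of the kind of geometry such a tool has to evaluate – the terrain model, parameter names and numbers below are invented for demonstration and are not CERN’s actual TOT code – an alignment check boils down to computing the tunnel’s depth below the surface around the ring for a chosen plane inclination and flagging any point that violates a constraint:

import math

# Purely illustrative toy alignment check; terrain model and numbers are invented.
RING_CIRCUMFERENCE_M = 100_000.0
RADIUS_M = RING_CIRCUMFERENCE_M / (2 * math.pi)

def surface_elevation_m(x_m, y_m):
    """Toy terrain: a plateau at 420 m with a circular 'lake' patch at 372 m."""
    in_lake = (x_m - 8_000.0) ** 2 + (y_m - 6_000.0) ** 2 < 4_000.0 ** 2
    return 372.0 if in_lake else 420.0

def ring_point(theta_rad, centre_elev_m, dip_deg, dip_dir_rad):
    """Coordinates and elevation of the ring at azimuth theta for a tilted planar alignment."""
    x = RADIUS_M * math.cos(theta_rad)
    y = RADIUS_M * math.sin(theta_rad)
    # Height offset of a plane dipping by dip_deg towards azimuth dip_dir_rad.
    dip = math.tan(math.radians(dip_deg))
    z = centre_elev_m + dip * (x * math.cos(dip_dir_rad) + y * math.sin(dip_dir_rad))
    return x, y, z

def check_alignment(centre_elev_m=120.0, dip_deg=0.3, dip_dir_rad=0.0,
                    min_cover_m=50.0, max_shaft_depth_m=400.0, n_points=360):
    """Walk around the ring and flag points that violate the depth constraints."""
    warnings = []
    for i in range(n_points):
        x, y, z_tunnel = ring_point(2 * math.pi * i / n_points,
                                    centre_elev_m, dip_deg, dip_dir_rad)
        depth = surface_elevation_m(x, y) - z_tunnel   # rock cover / shaft depth here
        if depth < min_cover_m:
            warnings.append(f"{i:3d} deg: only {depth:.0f} m of cover")
        elif depth > max_shaft_depth_m:
            warnings.append(f"{i:3d} deg: a shaft here would be {depth:.0f} m deep")
    return warnings

if __name__ == "__main__":
    for w in check_alignment(dip_deg=0.5):   # a steeper plane pushes the far side deeper
        print(w)

The real tool works with detailed geological, surface and environmental data layers rather than a toy terrain, but the principle is the same: every time the user tilts or moves the plane, depths and constraint violations are recomputed and displayed immediately.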

The alignment of the FCC tunnel has been optimised based on three key criteria at this stage: geology (building in the competent molasse rock wherever possible); shaft depth (minimising the depth of shafts); and surface sites (choosing locations that minimise disruption to residents and the environment).

Despite the best efforts to avoid the risky Jura Mountains, the geology is not perfect. The Prealps region has complex, faulted geology and it is uncertain which layers the tunnel will cross. Cracks or faults, caused by tectonic movements of the Alps and Jura, can occur in the molasse and limestone. Excavation through the Mandallaz limestone could lead to issues similar to those encountered during LEP’s construction. Large, high-pressure inflows can be difficult and expensive to remedy, and can create delays in the programme.

Options for tunnelling under Lake Geneva

To minimise the depth of the shafts, the entire FCC ring sits in an inclined plane, giving different heights above sea level around the tunnel. After modelling a range of alignment options at different locations and with different tunnel inclinations, constrained by the spacing requirements of the experiments, it turned out that one shaft in the baseline design was 558 m deep. The team therefore decided to replace the vertical shaft with an inclined tunnel (15% slope) that pops out of the side of the mountain.

The presence of Lake Geneva influences the overall depth of the FCC, and the tunnel optimisation tool tells us that it isn’t possible to avoid tunnelling under the lake within the study boundary. Modern tunnelling techniques open up different options for crossing the lake, instead of simply digging deeper until we reach the rock (figure 3). Several options were considered, even including an option to build a hybrid particle accelerator-road tunnel in an immersed tube tunnel (which was later scrapped because of potential vibrations caused by traffic disrupting the beamline). The current design compromises on a mid-depth tunnel passing through the permeable moraines on the lake bed.

At the bottom of some of the FCC shafts are large experimental caverns with spans of up to 35 m. To determine the best arrangement for experimental and service caverns, Amberg Engineering carried out a stress analysis (figure 4). Although for data-acquisition purposes it is often desirable to have the two caverns as close as possible to each other, the analysis showed that it would be prohibitively expensive to build a 10 m concrete wall between the caverns. The cheaper option is to use the existing rock as a natural pillar, which would require a minimum spacing of 45 m.

Stress analysis of separated vs adjacent service and experimental caverns

Tunnelling inevitably disturbs the surrounding area. The beamline of the LHC is incredibly sensitive and can detect even the smallest vibrations from the outside world. This was a potential issue for the construction works currently taking place for the High-Luminosity LHC project. The contractor had to improvise, fitting a standard diesel excavator with an electric motor to eliminate vibrations from the engine. The programme was also adapted so that only the shafts were constructed during operation of the LHC, leaving the more disruptive cavern construction until the start of the current shutdown.

Securing the future

CERN currently has 83 km of underground structures. The FCC would add over 100 km of tunnels, 3720 m of shafts, 26 caverns (not including junction caverns) and 66 alcoves, with up to 30 km between the Meyrin campus and the furthest site. The civil-engineering cost estimate for the FCC (carried out by ILF Consulting Engineers) is approximately 6 billion Swiss francs – 45% for tunnels and the rest for shafts, caverns and surface facilities – and benefits from significant advances in tunnelling technology since the LEP-tunnel days (see Advances in civil engineering since the LEP days).

Advances in civil engineering since the LEP days

Herrenknecht’s Mixshield TBM

It has been almost 35 years since three tunnel boring machines (TBMs) set off to carve out the 27 km-long hole that would house LEP and, later, the LHC. Contrary to the recent claims of tech entrepreneur Elon Musk, the technology used to construct modern tunnels has been quietly and rapidly advancing since the construction of LEP, providing a faster, safer and more versatile way to build tunnels. A TBM acts as a mobile factory that simultaneously excavates rock from the face and builds a tunnel lining from prefabricated segments behind it. The outer shield of the machine protects workers from falling rock, making sure they are never working in unsupported ground.

One of the main advances in TBM technology is their ability to cope with variable ground conditions. Most of the LEP tunnels were constructed in dry, competent rock, meaning the excavation face needed little support to stand up. Underneath the Jura Mountains, however, pockets of water and soil form where the limestone dissolves into karsts. When a TBM hits such a pocket, water can flow into the tunnel, causing flooding and, at worst, tunnel collapse. Modern TBMs come with a variety of face-support measures, including earth-pressure-balance machines that use the excavated soil to push back against the face for support. Herrenknecht’s Mixshield TBM (above) could be used to tunnel the FCC under Lake Geneva, where water-bearing moraines are encountered.

Segmental linings can be constructed off-site in a factory, improving quality, speed and safety. The segments are assembled in the rear of the TBM immediately after excavation. The segments can be fitted with a rubber gasket, which provides a waterproof seal, eliminating the need for the traditional secondary lining. Across the 100 km of the FCC, this will lead to substantial cost savings.

Seismic and sonic scanners can be mounted on the front of the TBM, allowing operators to detect voids or obstacles up to 40 m ahead and adjust their approach accordingly. Probe drilling and pre-support measures can also be implemented from within the machine, keeping the mining crew safe and minimising delays to the construction programme.

For vertical shafts, the vertical shaft sinking machine and shaft boring machine are the latest technological breakthroughs, taking all the technology of a TBM and standing it on its end. The giant rig hangs off a crane and excavates below the platform, whilst building a lining above it. The machine can even work underwater to stabilise the shafts during construction.

Traditional tunnelling techniques, which are useful for creating non-standard shapes or smaller tunnels like the experimental caverns in FCC, have come a long way, too. These aren’t the normal sticks of dynamite you see in films or cartoons – highly stable explosives are slotted precisely in holes using a giant rig with multiple arms for speed. The electric detonators can be configured to the millisecond for complex patterns of explosions that give tunnellers precise control of the shape, speed and quality of the excavation.

The integrity of the underground areas is critical to the safe and continued operation of the experiments, and CERN has developed advanced tools to inspect the structures – some of which are more than 60 years old. Manually inspecting the condition of structures on the scale of the FCC would be extremely challenging. We are therefore developing new technologies that will allow us to monitor the condition of the tunnels remotely. Currently, teams are testing how fibre-optic cables can be attached to the concrete linings to measure movements over time, and developing and training algorithms to spot and characterise faults in the tunnel lining. In the future, the software will be able to measure these faults and compare the changes with previous inspections to assess how they have progressed. To capture the necessary images, a Tunnel Inspection Machine, which runs on the monorail in the roof of the LHC tunnel, and a floor-roving inspection robot have both been tested; they can collect images and data even when the tunnel is not safe for humans. These images can be rebuilt in a 3D environment and viewed through a virtual-reality headset.

Projects like the FCC and CLIC are not just exciting for physicists. For civil engineers they represent challenges that demand new ideas and technology. At the annual World Tunnel Congress, attended by more than 2000 leading tunnel and underground-space experts, CERN’s FCC has already generated great interest. If approved, it would require one of the largest construction projects science has ever seen, bequeathing a tunnel that would serve fundamental exploration into the next century.

Muon g−2 collaboration prepares for first results

The muon g−2 collaboration

The annual “g-2 physics week”, which took place on Elba Island in Italy from 27 May to 1 June, saw almost 100 physicists discuss the latest progress at the muon g−2 experiment at Fermilab. The muon magnetic anomaly, aμ, is one of the few cases where there is a hint of a discrepancy between a Standard Model (SM) prediction and an experimental measurement. Almost 20 years ago, in a sequence of increasingly precise measurements, the E821 collaboration at Brookhaven National Laboratory (BNL) determined aμ = (g–2)/2 with a relative precision of 0.54 parts per million (ppm), providing a rigorous test of the SM. Impressive as it was, the result was limited by statistical uncertainties.

A new muon g−2 experiment currently taking data at Fermilab, called E989, aims to improve the experimental error on aμ by a factor of four. The collaboration took its first dataset in 2018, integrating 40% more statistics than the BNL experiment, and is now coming to the end of a second run that will yield a combined dataset more than three times larger.

A thorough review of the many analysis efforts during the first data run has been conducted. The muon magnetic anomaly is determined from the ratio of the muon and proton precession frequencies in the same magnetic field. The ultimate aim of experiment E989 is to measure both of these frequencies with a precision of 0.1 ppm by employing techniques and expertise from particle-physics experimentation (straw tracking detectors and calorimetry), nuclear physics (nuclear magnetic resonance) and accelerator science. These frequencies are independently measured by several analysis groups with different methodologies and different susceptibilities to systematic effects.
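
For reference, the standard relation behind this ratio method (general knowledge rather than a detail spelled out above) expresses the anomaly in terms of the measured ratio R of the muon anomalous-precession frequency ωa to the proton precession frequency ωp, together with the independently known muon-to-proton magnetic-moment ratio λ = μμ/μp:

\[
a_\mu = \frac{R}{\lambda - R},
\qquad
R = \frac{\omega_a}{\omega_p},
\qquad
\lambda = \frac{\mu_\mu}{\mu_p} \approx 3.183 ,
\]

which is why both frequencies must be measured at the 0.1 ppm level quoted above.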

A recent relative unblinding of a subset of the data with a statistical precision of 1.3 ppm showed excellent agreement across the analyses in both frequencies. The absolute values of the two frequencies are still subject to a ~25 ppm hardware blinding offset, so no physics conclusion can yet be drawn. But the exercise has shown that the collaboration is well on the way to publishing its first result, with a precision better than that of E821, towards the end of the year.

ICTP announces next director

Atish Dabholkar

Atish Dabholkar, a theorist from India, has been appointed the next director of the International Centre for Theoretical Physics (ICTP) in Trieste, Italy. Currently head of ICTP’s high-energy, cosmology and astroparticle physics section, Dabholkar will take up his new position in November. He will succeed Fernando Quevedo, who has led the centre since 2009. Dabholkar’s research has focused on string theory and quantum black holes, and his appointment comes at a time of expansion for ICTP. Over the past 10 years, the centre has hired more researchers and created new research initiatives in quantitative life sciences, high-performance computing, renewable energies and quantum technology. In addition, ICTP has increased its presence with the opening of four partner institutes in Brazil, China, Mexico and Rwanda. “Directing ICTP is a once-in-a-lifetime opportunity due to its unique mission and its big impact in developing countries. I am glad that when I leave in November the institute will be in very good hands,” says Quevedo.

Guido Altarelli Award 2019

Jonathan Gaunt
Josh Bendavid

The fourth edition of the Guido Altarelli Award, which recognises exceptional achievements by young scientists in the field of deep inelastic scattering and related subjects, was awarded during the DIS2019 workshop in Torino, Italy, on 8 April. Jonathan Gaunt of CERN was recognised for his pioneering contributions to the theory and phenomenology of double and multiple parton scattering. Josh Bendavid, also of CERN and a member of the CMS collaboration, received the award for his innovative contributions, with original tools, to Higgs physics and to proton parton density functions at the LHC. The brother of the late Guido Altarelli, Massimo Altarelli, was present at the ceremony and handed the certificates to the two winners.

2019 Dirac Medal and Prize

Viatcheslav Mukhanov, Alexei Starobinsky and Rashid Sunyaev

The International Centre for Theoretical Physics (ICTP) in Italy has awarded its 2019 Dirac Medal and Prize to three physicists whose research has made a profound impact on modern cosmology. Viatcheslav Mukhanov (Ludwig Maximilian University of Munich), Alexei Starobinsky (Landau Institute for Theoretical Physics) and Rashid Sunyaev (Max Planck Institute for Astrophysics) share the prize for “their outstanding contributions to the physics of the cosmic microwave background with experimentally tested implications that have helped to transform cosmology into a precision scientific discipline by combining microscopic physics with the large-scale structure of the universe”.

Julius Wess Award 2018

Sally Dawson

Sally Dawson of Brookhaven National Laboratory has been granted the 2018 Julius Wess Award by the KIT Elementary Particle and Astroparticle Physics Center of Karlsruhe Institute of Technology. She is recognised for her outstanding scientific contributions to the theoretical description and in-depth understanding of processes in hadron colliders, in particular her work relating to the physics of the Higgs boson and top quark. The Julius Wess Award is endowed with €10,000 and is granted annually to elementary particle and astroparticle physicists for exceptional experimental or theoretical scientific achievements.
