The LHC completed its Run 2 operations in December 2018, delivering a large dataset of proton–proton collisions at a centre-of-mass energy of 13 TeV. The ATLAS detector maintained a high level of readiness and performance throughout Run 2, resulting in 139 fb⁻¹ of data for physics analyses.
The Run 2 data are yielding an increasingly consistent picture of the properties of the Higgs boson, thanks to a wide range of measurements and, in particular, the establishment of its couplings to third-generation quarks following the observations of the H → bb decay and of associated ttH production.
The H → γγ and H → ZZ* → 4ℓ final states, where 4ℓ denotes 4e, 2e2μ or 4μ, provide clean experimental signatures that played a leading role in the discovery of the Higgs boson, and are ideal for precision measurements that could reveal subtle effects from new physics. ATLAS presented updated results for these two channels using the full Run 2 dataset at the 2019 summer conferences.
Using improved identification and energy calibration of leptons, photons and jets, and new analysis techniques, samples of about 210 H → ZZ* → 4ℓ signal events (figure 1) and 6550 H → γγ signal events were selected to perform a series of measurements. The properties of the Higgs boson are investigated by measuring inclusive, differential and per-production-mode cross sections that are sensitive to different modelling aspects.
In the 4ℓ channel, differential cross-section measurements are performed as a function of the transverse momentum of the Higgs boson and the number of jets produced in association with it. The different production mechanisms of the Higgs boson are measured inclusively and in various regions of kinematic phase space, which are cleanly separated by neural networks.
In the high-statistics γγ channel, differential cross sections are measured for a set of variables related to the Higgs boson kinematics, as well as the kinematics and multiplicity of jets produced in association with the Higgs boson. The measured distributions are used to constrain modified interactions of the Higgs boson with SM particles.
The measurements in both channels are found to be well described by the SM predictions. Their combination yields a total Higgs-production cross section of 55.4 ± 4.3 pb, in agreement with the SM prediction of 55.6 ± 2.5 pb. The combined measurement of the transverse-momentum differential cross section (figure 2) has significantly improved in precision compared to earlier results. It is sensitive to the virtual processes governing the dominant Higgs-boson production through gluon fusion and to direct contributions from new physics.
Achieving 8% precision on the Higgs cross section is a significant step towards studying the electroweak symmetry breaking mechanism. Numerous additional measurements are being pursued by ATLAS in the Higgs-boson sector with the full Run 2 dataset to perform detailed tests of SM predictions and hunt for new phenomena.
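The quoted precision and SM compatibility can be checked with a few lines of arithmetic. This is an illustrative sketch using only the numbers given above, not an official ATLAS computation; the quadrature combination of uncertainties is a naive simplification.

```python
# Illustrative check of the combined Higgs cross-section result quoted in
# the text (not an official ATLAS computation).
sigma_meas, err_meas = 55.4, 4.3   # pb, combined measurement
sigma_sm, err_sm = 55.6, 2.5      # pb, SM prediction

rel_precision = err_meas / sigma_meas  # ~0.078, i.e. the quoted ~8%

# Naive significance of the deviation, adding uncertainties in quadrature
pull = abs(sigma_meas - sigma_sm) / (err_meas**2 + err_sm**2) ** 0.5

print(f"relative precision: {rel_precision:.1%}")  # ~7.8%
print(f"deviation from SM:  {pull:.2f} sigma")     # well below 1 sigma
```

The deviation of roughly 0.04σ confirms the "in agreement with the SM prediction" statement above.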
The Humboldt Kolleg conference Discoveries and Open Puzzles in Particle Physics and Gravitation took place at Kitzbühel in the Austrian Alps from 24 to 28 June, bringing Humboldt prize winners, professors and research-fellow alumni together with prospective future fellows. The meeting was sponsored by the Humboldt Foundation, based in Bonn, whose mission is to promote cooperation between scientists in Germany and elsewhere. The programme focused on connections between particle physics and the large-scale cosmological structure of the universe.
The most recent LHC experimental results were presented by Karl Jakobs (Freiburg and ATLAS spokesperson), confirming the status of the Standard Model (SM). A key discussion topic, raised by Fred Jegerlehner (DESY-Zeuthen), was whether the SM’s symmetries might be “emergent” at the relatively low energies of current experiments: in contrast to unification models that exhibit maximal symmetry at the highest energies, the gauge symmetries could emerge in the infrared, but “dissolve” in the extreme ultraviolet. Consider the analogy of a carpet: it looks flat and invariant under translations when viewed from a distance, but this smoothness dissolves when we look at it close up, e.g. as perceived by an ant crawling on it. A critical system close to the Planck scale – the scale where quantum-gravity effects should be important – could behave similarly: the only modes that can exist as long-range correlations, e.g. light-mass particles, self-organise into multiplets with a small number of particles, just as they do in the SM. The vector modes become the gauge bosons of U(1), SU(2) and SU(3); low-energy symmetries such as baryon- and lepton-number conservation would all be violated close to the Planck scale.
Ideas connecting particle physics and quantum computing were also discussed by Peter Zoller (Innsbruck) and Erez Zohar (MPQ, Munich). Here, one takes a lattice field theory that is theoretically difficult to solve and maps it onto a fully controllable quantum system such as an optical lattice that can be programmed in experiments to do calculations – a quantum simulator. First promising results with up to 20 qubits have been obtained for the Schwinger model (QED in 1+1 dimensions). This model exhibits dynamical mass generation and is a first prototype before looking at more complicated theories like QCD.
A key puzzle concerns the hierarchies of scales: the small ratio of the Higgs-boson mass to the Planck scale plus the very small cosmological constant that drives the accelerating expansion of the universe. Might these be related? The cosmological constant is related to the vacuum energy density, which is in turn connected to possible phase transitions in the early universe. Future gravitational-wave experiments with LISA were discussed by Stefano Vitale (Trento) and are expected to be sensitive to the effects of these phase transitions.
A main purpose of Humboldt Kolleg is the promotion of young scientists from the central European region. Student poster prizes sponsored by the Kitzbühel mayor Klaus Winkler were awarded to Janina Krzysiak (IFJ PAN, Krakow) and Jui-Lin Kuo (HEPHY, Vienna).
Almost 750 high-energy physicists met from 10–17 July in Ghent, Belgium, for the 2019 edition of EPS-HEP. The full scope of the field was put under a microscope by more than 500 parallel and plenary talks and a vibrant poster session. The ongoing update of the European Strategy for Particle Physics (ESPP) was a strong focus, and the conference began with a session jointly organised by the European Committee for Future Accelerators to seek further input from the community ahead of the publication of the ESPP briefing book in September.
The accepted view, explained ESPP secretary Halina Abramowicz, is that an electron–positron collider should succeed the Large Hadron Collider (LHC). The question is whether to build a linear collider that is extendable to higher energies, or a circular collider whose infrastructure could later be reused for a hadron collider. DESY’s Christophe Grojean weighed up the merits of a Large Electron Positron collider (LEP)-style Z-pole run at a high-luminosity circular machine – a “tera-Z factory” – against the advantages of the polarised beams proposed at linear facilities, and questioned the value of polarisation to measurements of the Higgs boson at energies above 250 GeV. Furthermore, he said, sensitivities should be evaluated in light of the expected performance of the high-luminosity LHC (HL-LHC).
Blue skies required
Presentations on accelerator and detector challenges emphasised the importance of sharing development between competing projects: while detector technology for an electron–positron machine could begin production within about five years, proposed hadron colliders require a technological leap in both radiation hardness and readout speed. CERN’s Ariella Cattai expressed concern about excessive utilitarianism in detector development, with only 5% of R&D being blue-sky despite the historical success of this approach in developing TPC, RICH and silicon-strip detectors, among others. She also pointed out that although 80% of R&D specialists believe their work has potential social outcomes, less than a third feel adequately supported to engage in technology transfer. Delegates agreed on the need for more recognition for those who undertake this crucial work. CERN’s Graeme Stewart highlighted the similar plight of theorists developing event generators, whose work is often not adequately rewarded or supported. The field also needs to keep pace with computing developments outside the field, he said, by designing data models and code that are optimised for graphics-processing units (GPUs) rather than central-processing units (CPUs).
The accepted view is that an electron–positron collider should succeed the LHC
The beginning of the main EPS conference was dominated by impressive new results from ATLAS and CMS, as they begin to probe Higgs couplings to second-generation fermions, and as the experiments continue to search for new phenomena and rare processes. Several speakers noted that the LHC even has the potential to exceed LEP in precision electroweak physics: although the hadronic environment increases systematic uncertainties, deviations arising from beyond-Standard Model (SM) phenomena are expected to scale with the centre-of-mass energy squared. Giulia Zanderighi of the Max Planck Institute and Claude Duhr of CERN also highlighted the need to improve the precision of theoretical calculations if they are to match experimental precision by the end of the HL-LHC’s run, showcasing work to extend next-to-next-to-leading-order (NNLO) calculations to 2 → 3 processes, and the latest moves to N3LO calculations.
The flavour-physics scene was updated with new SM-consistent constraints from Belle on the ratios R(D) and R(D*), somewhat lessening the suggestion of lepton-universality violation in B-meson decays. With the advent of Belle II, and the impending analysis of LHCb’s full Run 2 dataset, the flavour anomalies should soon be confirmed or resolved. LHCb also presented new measurements of the gamma angle of the unitarity triangle, which show a mild 2σ tension between the values obtained from B+ and Bs0 decays. Meanwhile, long-baseline neutrino-oscillation experiments provided tantalising information on leptonic CP violation, with T2K data excluding CP conservation at 2σ irrespective of the neutrino mass hierarchy, and NOvA disfavouring an inverted hierarchy of neutrino mass eigenstates at 1.9σ.
Background checks
A refrain common to both collider and non-collider searches for dark-matter candidates was the need to eliminate backgrounds. A succession of talks scaled the 90 orders of magnitude in mass that dark-matter candidates might occupy. CERN’s Kfir Blum explained: “The problem with gravity is that it doesn’t matter if you’re a neutrino or a rhinoceros – if you sit on a geodesic you’re going to move in the same way,” making it difficult to infer the nature of dark matter with cosmological arguments. Nevertheless, he reported work on the recent black-hole image from the Event Horizon Telescope, which excludes some models of ultra-light dark matter. Above this, helioscopes such as CAST continue to encroach on the parameter space of QCD axions, while more novel haloscopes cut thin swathes down to low couplings in the 20 orders of magnitude of mass explored by searches for axion-like particles. Meanwhile, searches for WIMPs are sensitive to masses just beyond this, from 1 to 1000 GeV/c². Carlos de los Heros of Uppsala University explained that experiments such as XENON1T are pushing close to the so-called neutrino floor, and advocated for the development of directional detection methods that can distinguish solar neutrinos from WIMPs, and plunge into what is rather a neutrino “swamp”.
An exciting synergy between heavy-ion physics and gravitational waves was in evidence, with the two disparate approaches both now able to probe the equation of state of nuclear matter. Particular emphasis was placed on the need to marry the successful hydrodynamical and statistical description of ion–ion collisions with that used to describe proton–proton collisions, especially in the tricky proton-ion regime. These efforts are already bearing fruit in jet modelling. On the cosmological side, speakers reflected on the enduring success of the ΛCDM model to describe the universe in just six parameters, with François Bouchet of the Institut d’Astrophysique de Paris declaring that “the magic of the cosmic microwave background is not dead”, and explaining that Planck data have ruled out several models of inflation. Interdisciplinarity was also on display in reports on multi-messenger astronomy, with particular excitement reserved for the proposed European-led Einstein Telescope gravitational-wave observatory, which Marek Kowalski of DESY reported will most likely be built in either Italy or the Netherlands, and that will boast 10-times better sensitivity than current instruments.
This year’s EPS prize ceremony rewarded the CDF and D0 collaborations for the discovery of the top quark, and the WMAP and Planck collaborations for their outstanding contributions to astroparticle physics and cosmology. Today’s challenges are arguably even greater, and the spirit of EPS-HEP 2019 was to reject a false equivalence between physics being “new” and being beyond the SM. Participants’ hunger for the technological innovation required to answer the many remaining open questions was matched by an openness to reconsider theoretical thinking on fine tuning and naturalness, and how these principles inform the further exploration of the field.
EPS-HEP 2021 will take place in Hamburg from 21–28 July.
More than 400 researchers convened in Brussels from 24 to 28 June for the annual meeting of the Future Circular Collider (FCC) study. In addition to innovations in superconductivity, high-field magnets, superconducting radio-frequency systems and civil-engineering studies, discussions sought to clarify issues surrounding the physics research topics that FCC can address.
The meeting also marked the final event of the Horizon 2020 EuroCirCol project – a European Union project to produce a conceptual design study for a post-LHC research infrastructure based on an energy-frontier 100 TeV circular hadron collider. Since June 2015 the project has produced a wealth of results in high-tech domains via the collaborative efforts of partners in Europe and other countries such as the US, Japan, Korea and Russia. These include impressive progress toward 16 T magnets and in the performance of superconducting wires. Breakthroughs in both fields, such as a first accelerator-type magnet exceeding 14 T (see Advanced dipole sets high-field record) and an increase in the critical current density of Nb3Sn wire, promise to significantly reduce the costs of exploring the high-energy frontier and could find practical applications outside particle physics.
The four-volume FCC conceptual design report was also presented. Authored by 1350 people from 150 institutes, the report “underlines the global attractiveness of the FCC and documents the far-reaching benefits that the project can have for Europe and future generations,” said Frédérick Bordry, CERN director for accelerators and technologies.
A wide range of talks focused on a future circular lepton collider (FCC-ee) as the first step of the FCC programme, followed by an energy-frontier proton collider (FCC-hh). Results testify to the technological readiness of the FCC-ee, which could be operational by the end of the 2030s and therefore allow time to develop the novel technologies required for a 100 TeV proton–proton collider.
In his keynote talk, Nima Arkani-Hamed of the Institute for Advanced Study highlighted the importance of scrutinising the Higgs boson at a post-LHC machine. Speakers also stressed the complementarity between the different FCC options in searching for dark-matter candidate particles and other new physics. Finally, the potential for studying the strong interaction with heavy-ion collisions, and for detailing parton distribution functions with a proton–electron interaction point, was demonstrated.
The sustainability of research infrastructures and the assessment of their societal impact were other highlights of FCC week 2019, as discussed at a special “Economics of Science” workshop. Experts from the field of economics shared lessons learned with representatives from CERN and other research organisations, including SKA, ESA and ESS, demonstrating the many benefits beyond physics that major international projects bring.
The XVIII International Conference on Strangeness in Quark Matter (SQM 2019) was held from 10 to 15 June in Bari, Italy. With 270 delegates from 32 countries, the largest participation ever for the SQM series, the conference focused on the role of strange and heavy-flavour quarks in heavy-ion collisions and astrophysics. The scientific programme consisted of 50 invited plenary talks, 76 contributed parallel talks and a rich poster session with more than 60 contributions.
A state-of-the-art session opened the conference, including a tribute to the late Roy Glauber entitled “The Glauber model in high-energy nucleus–nucleus collisions”. Subsequent sessions were dedicated to highlights from theory and experiment, and included reports on results from low- and high-energy collisions, as well as on hyperon interactions in lattice QCD and thermal models. Representatives from all major collaborations at CERN’s LHC and SPS, Brookhaven’s RHIC, the Heavy Ion Synchrotron SIS at GSI Darmstadt and the NICA project at JINR Dubna made special efforts to release new results at SQM 2019.
Among the highlights were reports that particle-yield measurements are close to determining where phenomena such as strangeness enhancement are localised in phase space. Collective behaviour in small systems was also a much-discussed topic, with new results from the PHENIX experiment showing that p-Au, d-Au and 3He-Au collisions exhibit elliptic flow coefficients consistent with expectations regarding their initial collision geometry. Results from ALICE, CMS and STAR consistently corroborate the presence of elliptic flow in small systems.
There is also increasing interest in transverse-momentum differential baryon-to-meson ratios in the heavy-flavour sector. Recent results from pp and Pb-Pb collisions from both ALICE and CMS suggest that the same dynamics observed in the ratio Λ/K0S may be present in Λc/D, despite the fact that strange and charm quarks are thought to be created in different stages of the system’s evolution. Further studies and future measurements may be needed.
A promising new perspective for the LHC data is to use high-energy pp and p-Pb collisions as factories of identified hadrons created by a source of finite radius, and then to measure the ensuing interactions between these hadrons using femtoscopy. This technique has allowed the ALICE collaboration to probe interactions that had not previously been measured, such as the p-Ξ and p-Ω interaction potentials. These results provide fundamental constraints for the QCD community and are significant in the context of astrophysics.
New results on the onset of deconfinement were shown by the NA61/SHINE collaboration. First results on strangeness production at low energy from HADES and BM@N also enriched the discussion at SQM 2019.
Presentations at the final session showed good prospects for future measurements at FAIR (GSI Darmstadt), NICA (JINR Dubna), the Heavy-Ion Project (J-PARC) and at CERN, given ongoing detector upgrades, the high-luminosity programme and possible next-generation colliders. Perspectives for QCD measurements at future electron–ion colliders were also presented. On the theory side, new developments and strong research efforts are bringing a better understanding of strangeness production and open heavy-flavour dynamics in heavy-ion collisions.
Young scientist prizes sponsored by the Nuclear Physics European Collaboration Committee were awarded to Bong-Hwi Lim of Pusan National University, Korea, and to Olga Soloveva of Goethe University, Frankfurt, for their poster contributions. The inaugural Andre Mischke Award (established at SQM 2019) for the young scientist with the best experimental parallel talk was given to Erin Frances Gauger of the University of Texas, Austin.
The next edition of SQM will take place in Busan, Korea, in May 2021.
Processes where the flavour of charged leptons is not conserved are undetectably rare in the Standard Model (SM). For neutral leptons, flavour violation is known to occur in neutrino oscillations, but charged-lepton-flavour violation (CLFV) is so suppressed that, if observed, it would provide indisputable evidence of physics beyond the SM.
The LHCb collaboration recently reported the results of searches for two CLFV decays, B+ → K+μ±e∓ and B(s)0 → τ±μ∓, using 3 fb⁻¹ of data collected in 2011 and 2012. The two decays provide complementary information as their final states involve charged leptons from different families, and both represent experimental challenges for LHCb. While the detector performance is excellent for muons, it is more difficult to reconstruct electrons and taus. The difficulty with electrons is related to energy losses via bremsstrahlung radiation. Meanwhile, the short-lived tau leptons are always reconstructed from their decay products, which include at least one neutrino, and thus part of the tau’s energy is unavoidably lost. In both cases, the analyses are able to recover some of the lost information and improve the resolution by exploiting constraints on the kinematics and topology of the decay.
Neither search found a signal (figure 1), but thanks to these reconstruction techniques and the large quantity of B-meson decays recorded by the detector, LHCb has established the most stringent upper limits on the branching fractions of these decays: 9.5 × 10⁻⁹ for B+ → K+μ–e+, 8.8 × 10⁻⁹ for B+ → K+μ+e–, 1.4 × 10⁻⁵ for B0 → τ±μ∓, and 4.2 × 10⁻⁵ for Bs0 → τ±μ∓ (all at the 95% confidence level). The latter is also the first ever limit on Bs0 → τ±μ∓.
Decays of B-mesons are particularly interesting in light of recent flavour anomalies
CLFV decays of B-mesons are particularly interesting in light of recent flavour anomalies, whereby LHCb found hints that the decay rates for b → sμ+μ– and b → se+e– are not equal (CERN Courier May/June 2019 p33). While the anomalies are most suggestive of the violation of lepton-flavour universality, several proposed extensions to the SM that address them also predict CLFV, with branching ratios for B+ → K+μ±e∓ and B(s)0 → τ±μ∓ within LHCb’s reach. The latest LHCb results therefore impose strong new constraints on beyond-SM models. The analyses also open the door to further LHCb tests of CLFV by demonstrating the feasibility of searches for rare processes with final-state electrons and taus.
At the 22nd edition of the Planck conference series, which took place in Granada, Spain, from 3–7 June, 170 particle physicists and cosmologists discussed the latest in beyond the Standard Model (BSM) physics and ultraviolet completions of the SM within theories that unify the fundamental interactions.
Several speakers addressed the serious model-building restrictions in supersymmetry and Higgs compositeness that are imposed by the negative results of direct searches for BSM particles at ATLAS and CMS. Particular emphasis was put on the (extended) Higgs sector of the SM, where precision measurements might detect signals of BSM physics. Updates from LHCb and Belle on the flavour anomalies were also eagerly discussed, with proposed explanations including leptoquarks and additional U(1) gauge symmetries with exotic vector-like quarks. However, not all were convinced that the results signal BSM physics. On the cosmological side, delegates learned of the latest attempts to build models of WIMPs, axions, magnetic relics and dark radiation, which also include mechanisms for baryogenesis and inflation in the early universe.
Given the absence of new BSM particles so far at the LHC, theorists talk of a “desert” between the weak and Planck scales containing nothing but SM particles. Several speakers reported that phase transitions between non-trivial Higgs vacua could lead to violent phenomena in the early universe that might be tested by future gravitational-wave detectors. Within the inflationary universe these phenomena might also lead to the production of primordial black holes that could explain dark matter.
Discussions of ultraviolet (i.e. high-energy) completions of the SM encompassed the grand unification of fundamental interactions, the origin of neutrino masses, flavour symmetries and the so-called “swampland conjectures”, which characterise theories that might not be compatible with a consistent theory of quantum gravity. Therefore, one might hope that healthy signals of BSM physics might appear somewhere between the desert and the swampland.
Planck 2020 will be held from 8–12 June in Durham, UK.
In the early 1970s the term “Standard Model” did not yet exist – physicists used “Weinberg–Salam model” instead. But the discovery of the weak neutral current in Gargamelle at CERN in 1973, followed by the prediction and observation of particles composed of charm quarks at Brookhaven and SLAC, quickly shifted the focus of particle physicists from the strong to the electroweak interactions – a sector in which trailblazing theoretical work had quietly taken place in the previous years. Plans for an electron–positron collider at CERN were soon born, with the machine first named LEP (Large Electron Positron collider) in a 1976 CERN yellow report authored by a distinguished study group featuring, among others, John Ellis, Burt Richter, Carlo Rubbia and Jack Steinberger.
LEP’s size – four times larger than anything before it – was dictated by the need to observe W-pair production, and to check that its cross section did not diverge with energy. The phenomenology of the Z-boson’s decay was to come under similar scrutiny. At the time, the number of fermion families was unknown, and it was even possible that there were so many neutrino families that the Z lineshape would be washed out. LEP’s other physics targets included the possibility of producing Higgs bosons, whose mass was completely unknown and could have been anywhere from around zero to 1 TeV.
The CERN Council approved LEP in October 1981 for centre-of-mass energies up to 110 GeV. It was a remarkable vote of confidence in the Standard Model (SM), given that the W and Z bosons had not yet been directly observed. A frantic period followed, with the ALEPH, DELPHI, L3 and OPAL detectors approved in 1983. Based on similar geometric principles, they included drift chambers or TPCs for the main trackers, BGO crystals, lead–glass or lead–gas sandwich electromagnetic calorimeters, and, in most cases, an instrumented return yoke for hadron calorimetry and muon filtering. The underground caverns were finished in 1988 and the detectors were in various stages of installation by the end of spring 1989, by which time the storage ring had been installed in the 27 km-circumference tunnel (see The greatest lepton collider).
Expedition to the Z pole
The first destination was the Z pole at an energy of around 90 GeV. Its location was then known to ±300 MeV from measurements of proton–antiproton collisions at Fermilab’s Tevatron. The priority was to establish the number of light neutrino families, a number that not only closely relates to the number of elementary fermions but also impacts the chemical composition and large-scale structure of the universe. By 1989 the existence of the νe, νμ and ντ neutrinos was well established. Several model-dependent measurements from astrophysics and collider physics at the time had pointed to the number of light active neutrinos (Nν) being less than five, but the SM could, in principle, accommodate any higher number.
The initial plan to measure Nν using the total width of the Z resonance was quickly discarded in favour of the visible peak cross section, where the effect was far more prominent and, to first approximation, insensitive to possible new detectable channels. The LEP experiments were therefore thrown in at the deep end: making an absolute cross-section measurement with completely new detectors, in an unfamiliar environment, demanded that triggers, tracking, calorimetry and luminosity monitors all work and acquire data in synchronisation.
On the evening of 13 August, during a first low-luminosity pilot run just one month after LEP achieved first turns, OPAL reported the first observation of a Z decay (see OPAL fruits). Each experiment quickly observed a handful more. The first Z-production run took place from 18 September to 9 October, with the four experiments accumulating about 3000 visible Z decays each. They took data at the Z peak and at 1 and 2 GeV either side, improving the precision on the Z mass and allowing a measurement of the peak cross section. The results, including those from the Mark II collaboration at SLAC’s SLC linear electron–positron collider, were published and presented in CERN’s overflowing main auditorium on 13 October.
After only three weeks of data taking and 10,000 Z decays, the number of neutrinos was found to be three. In the following years, some 17 million Z decays were accumulated, and cross-section measurement uncertainties fell to the per-mille level. And while the final LEP number – Nν = 2.9840 ± 0.0082 – may appear to be a needlessly precise measurement of the number three (figure 1a), it today serves as by far the best high-energy constraint on the unitarity of the neutrino mixing matrix. LEP’s stash of a million clean tau pairs from Z → τ+τ– decays also allowed the universality of the lepton–neutrino couplings to the weak charged current to be tested with unprecedented precision. The present averages are still dominated by the LEP numbers: gτ/gμ = 1.0010 ± 0.0015 and gτ/ge = 1.0029 ± 0.0015.
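The logic of the neutrino count can be sketched numerically: the invisible width of the Z, inferred from the peak cross section and the visible partial widths, is divided by the SM prediction for a single νν̄ channel. The width values below are indicative PDG-style numbers inserted for illustration, not figures taken from the article.

```python
# Sketch of the LEP neutrino-counting logic. The partial widths below are
# indicative PDG-style values (assumptions, not from the article).
gamma_inv = 499.0      # MeV, measured invisible width of the Z (assumed)
gamma_nunu_sm = 167.2  # MeV, SM width for one Z -> nu nubar flavour (assumed)

n_nu = gamma_inv / gamma_nunu_sm
print(f"N_nu = {n_nu:.3f}")  # close to 3, as in the LEP result
```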
LEP continued to carry out Z-lineshape scans until 1991, and repeated them in 1993 and 1995. Two thirds of the total luminosity was recorded at the Z pole. As statistical uncertainties on the Z’s parameters went down, the experiments were challenged to control systematic uncertainties, especially in the experimental acceptance and luminosity. Monte Carlo modelling of fragmentation and hadronisation was gradually improved by tuning to measurements in data. On the luminosity front it soon became clear that dedicated monitors would be needed to measure small-angle Bhabha scattering (e+e– → e+e–), which proceeds at a much higher rate than Z production. The trick was to design a compact electromagnetic calorimeter with sufficient position resolution to define the geometric acceptance, and to compare this to calculations of the Bhabha cross section.
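The accepted cross section behind such a luminometer can be estimated with the standard small-angle approximation: Bhabha scattering is dominated by t-channel photon exchange, with dσ/dt ≈ 4πα²/t² and t ≈ −(s/4)θ². The angular acceptance below is illustrative, not that of any actual LEP monitor.

```python
import math

# Accepted small-angle Bhabha cross section from the t-channel
# approximation; the angular limits are illustrative only.
ALPHA = 1 / 137.036
GEV2_TO_NB = 0.3894e6  # 1 GeV^-2 = 0.3894 mb = 3.894e5 nb

def bhabha_accepted_xsec(sqrt_s_gev, theta_min, theta_max):
    """Accepted Bhabha cross section (nb) between theta_min and theta_max (rad)."""
    s = sqrt_s_gev**2
    sigma = (16 * math.pi * ALPHA**2 / s) * (1 / theta_min**2 - 1 / theta_max**2)
    return sigma * GEV2_TO_NB

# 30-50 mrad acceptance at the Z pole (illustrative): of order 100 nb
sigma_nb = bhabha_accepted_xsec(91.19, 0.030, 0.050)
print(f"accepted Bhabha cross section: {sigma_nb:.0f} nb")
```

The result, some tens of nb, sits comfortably above the peak hadronic Z cross section, consistent with the remark that Bhabha scattering proceeds at a much higher rate than Z production.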
The final ingredient for LEP’s extraordinary precision was a detailed knowledge of the beam energy, which required the four experiments to work closely with accelerator experts. Curiously, the first energy calibration was performed in 1990 by circulating protons in the LEP ring – the first protons to orbit in what would eventually become the LHC tunnel, but at a meagre energy of 20 GeV. The speed of the protons was inferred by comparing the radio-frequency electric field needed to keep protons and electrons circulating at 20 GeV on the same orbit, allowing a measurement of the total magnetic bending field on which the beam energy depends. This gave a 20 MeV uncertainty on the Z mass. To reduce this to 1.7 MeV for the final Z-pole measurement, however, required the use of resonant depolarisation routinely during data taking. First achieved in 1991, this technique uses the natural transverse spin polarisation of the beams to yield an instantaneous measurement of the beam energy to a precision of ±0.1 MeV – so precise that it revealed minute effects caused, for example, by Earth’s tides and the passage of local trains (see Tidal forces, melting ice and the TGV to Paris). The final precision was more than 10 times better than had been anticipated in pre-LEP studies.
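The resonant-depolarisation technique rests on a simple relation: the spin tune (the number of spin precessions per turn) is proportional to the beam energy, νs = E/440.6486 MeV, where 440.6486 MeV = mₑc²/aₑ with aₑ the electron anomalous magnetic moment. A minimal sketch, with an illustrative (not measured) spin-tune value near the Z pole:

```python
# Beam energy from the spin tune; the spin-tune value below is
# illustrative, not an actual LEP measurement.
M_E = 0.51099895        # MeV, electron mass
A_E = 0.00115965218     # electron anomalous magnetic moment (g-2)/2
MEV_PER_SPIN_TUNE = M_E / A_E  # ~440.6486 MeV per unit of spin tune

def beam_energy_gev(spin_tune):
    """Beam energy in GeV for a measured (non-integer) spin tune."""
    return spin_tune * MEV_PER_SPIN_TUNE / 1000.0

e = beam_energy_gev(103.5)  # spin tune near the Z pole (illustrative)
print(f"E_beam = {e:.4f} GeV")  # around 45.6 GeV, half the Z mass
```

Because the depolarising resonance pins down νs to a tiny fraction of a unit, the ±0.1 MeV energy precision quoted above follows directly from this proportionality.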
Electroweak working group
The LEP electroweak working group saw the ALEPH, DELPHI, L3 and OPAL collaborations work closely on combined cross-section and other key measurements – in particular the forward-backward asymmetry in lepton and b-quark production – at each energy point. By 1994, results from the SLD collaboration at SLAC were also included. Detailed negotiations were sometimes needed to agree on a common treatment of statistical correlations and systematic uncertainties, setting a precedent for future inter-experiment cooperation. Many tests of the SM were performed, including tests of lepton universality (figure 1b), adding to the tau lepton results already mentioned. Analyses also demonstrated that the couplings of leptons and quarks are consistent with the SM predictions.
The combined electroweak measurements were used to make stunning predictions of the top-quark and Higgs-boson masses, mt and mH. After the 1993 Z-pole scan, the LEP experiments were able to produce a combined measurement of the Z width with a precision of 3 MeV in time for the 1994 winter conferences, allowing the prediction mt = 177 ± 13 ± 19 GeV where the first error is experimental and the second is due to mH not being known. A month later the CDF collaboration at the Tevatron announced the possible existence of a top quark with a mass of 176 ± 16 GeV. Both CDF and its companion experiment D0 reached 5σ “discovery” significance a year later. It is a measure of the complexity of the Z-boson analyses (in particular the beam-energy measurement) that the final Z-pole results were published a full 11 years later, constraining the Higgs mass to be less than 285 GeV at 95% confidence level (figure 1c), with a best fit at 129 GeV.
From QCD to the W boson
LEP’s fame in the field tends to concern its electroweak breakthroughs. But, with several million recorded hadronic Z decays, the LEP experiments also made big advances in quantum chromodynamics (QCD). These results significantly increased knowledge of hadron production and quark and gluon dynamics, and drove theoretical and experimental methods that are still used extensively today. LEP’s advantage as a lepton collider was to have an initial state that was independent of nucleon structure functions, allowing the measurement of a single, energy-scale-dependent coupling constant. The strong coupling constant αs was determined to be 0.1195 ± 0.0034 at the Z pole, and to vary with energy – the highlight of LEP’s QCD measurements. This so-called running of αs was verified over a large energy range, from the tau mass up to 206 GeV, yielding additional experimental confirmation of QCD’s core property of asymptotic freedom (figure 2a).
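The logarithmic running described above can be sketched with the one-loop renormalisation-group formula for the strong coupling. This is a deliberate simplification of what the LEP analyses actually did – a full treatment uses higher loop orders and changes the number of active flavours nf across quark-mass thresholds – and the function name below is ours:

```python
import math

def alpha_s_one_loop(q, alpha_mz=0.1195, mz=91.19, nf=5):
    """One-loop running of the strong coupling from the Z pole to scale q (GeV).

    alpha_mz is the LEP value quoted in the text; nf is the number of
    active quark flavours, held fixed here as a simplification.
    """
    b0 = (33 - 2 * nf) / (12 * math.pi)  # one-loop beta-function coefficient
    return alpha_mz / (1 + alpha_mz * b0 * math.log(q**2 / mz**2))

# Asymptotic freedom: the coupling falls as the energy scale rises
print(alpha_s_one_loop(1.777, nf=3))  # near the tau mass (nf = 3 there)
print(alpha_s_one_loop(91.19))        # Z pole: recovers the input value
print(alpha_s_one_loop(206.0))        # LEP2's highest scan energies
```

Even this crude formula reproduces the qualitative behaviour verified at LEP: a coupling of order 0.3 near the tau mass falling to about 0.1 at LEP2 energies.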
Many other important QCD measurements were performed, such as the gluon self-coupling, studies of differences between quark and gluon jets, verification of the running b-quark mass, studies of hadronisation models, measurements of Bose–Einstein correlations and detailed studies of hadronic systems in two-photon scattering processes. The full set of measurements established QCD as a consistent theory that accurately describes the phenomenology of the strong interaction.
Following successful Z operations during the “LEP1” phase in 1989–1995, a second LEP era devoted to accurate studies of W-boson pair production at centre-of-mass energies above 160 GeV got under way. Away from the Z resonance, the electron–positron annihilation cross section decreases sharply; once the centre-of-mass energy reaches twice the W-boson mass, and then twice the Z-boson mass, the WW and ZZ production channels open up in turn (figure 2b). Accessing the WW threshold required the development of superconducting radio-frequency cavities, the first of which were installed as early as 1994; they enabled a gradual increase in the centre-of-mass energy up to a maximum of 209 GeV in 2000.
The “LEP2” phase allowed the experiments to perform a signature analysis that dated back to the first conception of the machine: the measurement of the WW production cross section. Would it diverge, or would electroweak diagrams interfere to suppress it? The precise measurement of the WW cross section as a function of the centre-of-mass energy was a very important test of the SM, since it showed that three four-fermion processes do indeed contribute to and interfere in WW production: t-channel ν exchange, and s-channel γ and Z exchange (figure 2c). LEP data proved that the γWW and ZWW triple gauge vertices are indeed present and interfere destructively with the t-channel diagram, suppressing the cross section and stopping it from diverging.
The second key LEP2 electroweak measurement was of the mass and total decay width of the W boson, which were determined by directly reconstructing the decay products of the two W bosons in the fully hadronic (W+W–→ qqqq) and semi-leptonic (W+W–→ qqℓνℓ) decay channels. The combined LEP W-mass measurement from direct reconstruction data alone is 80.375 ± 0.025(stat) ± 0.022(syst) GeV, with the largest contribution to the systematic uncertainty originating from fragmentation and hadronisation. The relation between the Z-pole observables, mt and mW, provides a stringent test of the SM and constrains the Higgs mass.
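When a single total uncertainty is wanted, independent statistical and systematic errors are conventionally added in quadrature. A minimal sketch using the W-mass numbers quoted above (the variable names are ours):

```python
import math

# LEP combined W mass from direct reconstruction (GeV), as quoted in the text
m_w, stat, syst = 80.375, 0.025, 0.022

# Independent statistical and systematic uncertainties combine in quadrature
total = math.sqrt(stat**2 + syst**2)
print(f"m_W = {m_w:.3f} +/- {total:.3f} GeV")
```

The quadrature sum is appropriate only because the two error sources are treated as uncorrelated; correlated systematics across experiments required the more careful combination procedures mentioned earlier.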
To the Higgs and beyond
Before LEP started, the mass of the Higgs boson was basically unknown. In the simplest version of the SM, involving a single Higgs boson, the only robust constraints were its non-observation in nuclear decays (forbidding masses below 14 MeV) and the need to maintain a sensible, calculable theory (ruling out masses above 1 TeV). In 1990, soon after the first LEP data-taking period, the full Higgs-boson mass range below 24 GeV was excluded at 95% confidence level by the LEP experiments. Above this mass the main decay of the Higgs boson, occurring about 80% of the time, was predicted to be into b quark–antiquark pairs, followed by pairs of tau leptons, charm quarks or gluons, while the WW* decay mode starts to contribute only at the maximum reachable masses of approximately 115 GeV. The main production process is Higgs-strahlung, whereby a Higgs is emitted by a virtual Z boson.
During the full lifetime of LEP, the four experiments kept searching for neutral and charged Higgs bosons in several models and exclusion limits continued to improve. In its last year of data taking, when the centre-of-mass energy reached 209 GeV, ALEPH reported an excess of four-jet events. It was consistent with a 114 GeV Higgs boson and had a significance that varied as the data were accumulated, peaking at an instantaneous significance of around 3.9 standard deviations. The other three experiments carefully scrutinised their data to confirm or disprove ALEPH’s suggestion, but none observed any long-lasting excess in that mass region. Following many discussions, the LEP run was extended until 8 November 2000. However, it was decided not to keep running the following year so as not to impact the LHC schedule. The final LEP-wide combination excluded, at 95% confidence level, a SM Higgs boson with mass below 114.4 GeV.
The four LEP experiments carried out many other searches for novel physics that set limits on the existence of new particles. Notable cases are the searches for additional Higgs bosons in two-Higgs-doublet models and their minimal supersymmetric incarnation. Neutral scalar and pseudoscalar Higgs bosons lighter than the Z boson and charged Higgs bosons up to the kinematic limit of their pair production were also excluded. Supersymmetric particles suffered a similar fate, in the theoretically attractive assumption of R-parity conservation. The existence of sleptons and charginos was excluded in the largest part of the parameter space for masses below 70–100 GeV, near the kinematic limit for their pair production. Neutralinos with masses below approximately half the Z-boson mass were also excluded in a large part of the parameter space. The LEP exclusions for several of these electroweak-produced supersymmetric particles are still the most stringent and most model-independent limits ever obtained.
It is hard now to remember how little we knew before LEP, and what a giant step LEP represented. It was often said that LEP discovered electroweak radiative corrections at the level of 5σ, opening up a precision era in particle physics that continues to set the standard today and to offer guidance on the elusive new physics beyond the SM.
A few minutes before midnight on a summer’s evening in July 1989, 30 or so people were crammed into a back room at CERN’s Prévessin site in the French countryside. After years of painstaking design and construction, we were charged with breathing life into the largest particle accelerator ever built. The ring was complete, the aperture finally clear and the positron beam made a full turn on our first attempt. Minutes later beams were circulating, and a month later the first Z boson event was observed. Here began a remarkable journey that firmly established the still indefatigable Standard Model of particle physics.
So, what can go wrong when you’re operating 27 kilometres of particle accelerator, with ultra-relativistic leptons whizzing around the ring 11,250 times a second? The list is long. The LEP ring was packed with magnets, power converters, a vacuum system, a cryogenics system, a cooling and ventilation system, beam instrumentation – and much more. Then there was the control system – fibres, networks, routers, gateways, software, databases – plus separators, kickers, the beam dump, radio-frequency (RF) cavities, klystrons, high-voltage systems, interlocks, synchronisation, timing, feedback… And, of course, the experiments, the experimenters and everybody’s ability to get along in a high-pressure environment.
LEP wasn’t the only game in town. There was fierce competition from the more innovative Stanford Linear Collider (SLC) in California. But LEP was off to a fantastic start and its luminosity increase was much faster than at its relatively untested linear counterpart. A short article capturing the transatlantic rivalry appeared in the Economist on 19 August 1989. “The results from California are impressive,” the magazine reported, “especially as they come from a new and unique type of machine. They may provide a sure answer to the generation problem before LEP does. This explains the haste with which the finishing touches have been applied to LEP. The 27 km-long device, six years in the making, was transformed from inert hardware to working machine in just four weeks – a prodigious feat, unthinkable anywhere but at CERN. Even so, it was still not as quick as Carlo Rubbia, CERN’s domineering director-general might have liked.”
Notes from the underground
LEP’s design dates from the late 1970s, the project being led by accelerator-theory group leader Eberhard Keil, RF group leader Wolfgang Schnell and C J “Kees” Zilverschoon. The first decision to be made was the circumference of the tunnel, with four options on the table: a 30 km ring that went deep into the Jura mountains, a 22 km ring that avoided them entirely, and two variants with a length of 26.7 km that grazed the outskirts of the mountains. Then director-general Herwig Schopper decided on a circumference of 26.7 km with an eye on a future proton collider for which it would be “decisive to have as large a tunnel as possible” (CERN Courier July/August 2019 p39). The final design was approved on 30 October 1981 with Emilio Picasso leading the project. Construction of the tunnel started in 1983, after a standard public enquiry in France.
LEP’s tunnel, the longest-ever attempted prior to the Channel Tunnel, which links France and Britain, was carved by three tunnel-boring machines. Disaster struck just two kilometres into the three-kilometre stretch of tunnel in the foothills of the Jura, where the rock had to be blasted because it was not suitable for boring. Water burst in and formed an underground river that took six months to eliminate (figure 1). By June 1987, however, part of the tunnel was complete and ready for the accelerator to be installed.
Just five months after the difficult excavation under the Jura, one eighth of the accelerator (octant 8) had been completely installed, and, a few minutes before midnight on 12 July 1988, four bunches of positrons made the first successful journey from the town of Meyrin in Switzerland (point 1) to the village of Sergy in France (point 2), a distance of 2.5 km. Crucially, the “octant test” revealed a significant betatron coupling between the transverse planes: a thin magnetised nickel layer inside the vacuum chambers was causing interference between the horizontal and vertical focusing of the beams. The quadrupole magnets were adjusted to prevent a resonant reinforcement of the effect each turn, and the nickel was eventually demagnetised.
Giving birth to LEP
The following months saw a huge effort to install equipment in the remaining 24 km of the tunnel – magnets, vacuum chambers and RF cavities, as well as beam instrumentation, injection equipment, electrostatic separators, electrical cabling, water cooling, ventilation and all the rest. This was followed by conditioning the cavities, baking out and leak-testing the vacuum chambers, and individual testing. At the same time, a great deal of effort went into preparing the software needed to operate the collider, with only limited resources available.
In the late 1980s, control systems for accelerators were going through a major transition to the PC. LEP was caught up in the mess and there were many differences of opinion on how to design LEP’s control system. As July 1989 approached, the control system was not ready and a small team was recruited to implement the bare minimum controls required to inject beam and ramp up the energy. Unable to hone key parameters such as the tune and orbit corrections before beam was injected, we had two major concerns: is the beam aperture clear of all obstacles, and are there any polarity errors in the connections of the many thousand magnetic elements? So we nominated a “Mr Polarity”, whose job was to check all polarities in the ring. This may sound trivial, but with thousands of connections it was a huge task.
LEP’s beam-energy resolution was so precise that it was possible to observe distortion of the 27 km ring by a single millimetre, whether due to the tidal forces of the Sun and Moon, or the seasonal distortion caused by rain and meltwater from the nearby mountains filling up Lac Léman and weighing down one side of the ring. In 1993 we noticed even more peculiar random variations on the energy signal during the day – with the exception of a few hours in the middle of the night when the signal was noise free. Everybody had their own pet theory. I believed it was some sort of effect coming from planes interacting with the electrical supply cables. Some nights later I could be seen sitting in a car park on the Jura at 2 a.m., trying to prove my theory with visual observations, but it was very dark and all the planes had stopped landing several hours beforehand. Experiment inconclusive! The real culprit, the TGV (a high-speed train), was discovered by accident a few weeks later during a discussion with a railway engineer: leakage currents on the French rail track flowed through the LEP vacuum chamber with the return path via the Versoix river back to Cornavin. The noise hadn’t been evident when we first measured the beam energy as TGV workers had been on strike.
At a quarter to midnight on 14 July 1989, the aperture was free of obstacles and the beam made its first turn on our first attempt. Soon afterwards we managed to achieve a circulating beam, and we were ready to fine tune the multitude of parameters needed to prepare the beams for physics.
The goal for the first phase of LEP was electron–positron collisions at a total energy of 91 GeV – the mass of the neutral carrier of the weak force, the Z boson. LEP was to be a true Z factory, delivering millions of Zs for precision tests of the Standard Model. To mass-produce them required beams not only of high energy but also of high intensity, and delivering them required four steps. The first was to accumulate the highest possible beam current at 20 GeV – the injection energy. This was a major operation in itself, involving LEP’s purpose-built injection linac and electron–positron accumulator, the Proton Synchrotron, the Super Proton Synchrotron (SPS) and, finally, transfer lines to inject electrons and positrons in opposite directions – these curved not only horizontally but also vertically as LEP and the SPS were at different heights. The second step was to ramp up the accumulated current to the energy of the Z resonance with minimal losses. Thirdly, the beam had to be “squeezed” to improve the collision rate at the interaction regions by changing the focusing of the quadrupoles on either side of the experiments, thereby reducing the transverse cross section of the beam at the collision points. The fourth and final step was to bring the two beams into collision at the interaction points.
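Why the squeeze matters can be seen from the standard formula for the luminosity of two head-on Gaussian beams, L = f·k·N²/(4π·σx·σy): shrinking the transverse beam sizes at the collision point directly raises the event rate. The sketch below uses illustrative, roughly LEP-like orders of magnitude, not measured machine parameters:

```python
import math

def luminosity(f_rev, k_bunches, n_per_bunch, sigma_x, sigma_y):
    """Instantaneous luminosity (cm^-2 s^-1) for head-on Gaussian beams.

    f_rev: revolution frequency (Hz); k_bunches: bunches per beam;
    n_per_bunch: particles per bunch (equal beams assumed);
    sigma_x, sigma_y: transverse beam sizes at the collision point (cm).
    """
    return f_rev * k_bunches * n_per_bunch**2 / (4 * math.pi * sigma_x * sigma_y)

# Illustrative placeholder numbers: ~11,245 turns/s, 4 bunches,
# 4e11 particles per bunch, sigma_x = 200 um, sigma_y = 4 um
base = luminosity(11_245, 4, 4e11, 200e-4, 4e-4)
squeezed = luminosity(11_245, 4, 4e11, 200e-4, 2e-4)  # halve sigma_y
print(f"{base:.2e} -> {squeezed:.2e}")  # halving sigma_y doubles L
```

This inverse dependence on the beam cross section is also why the beam–beam interaction mentioned later ultimately limited performance: the denser the beams at the collision point, the stronger the perturbation each beam exerts on the other.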
Following the highly successful first turn on 14 July 1989, we spent the next month preparing for the first physics run. Exactly a month later, on 13 August, the beams collided for the first time. The following 10 minutes seemed like an eternity since none of the four experiments – ALEPH, DELPHI, L3 and OPAL – reported any events. I was in the control room with Emilio Picasso and we were beginning to doubt that the beams were actually colliding when Aldo Michelini called from OPAL with the long-awaited comment: “We have the first Z0!” ALEPH and OPAL physicists had connected the Z signal to a bell that sounded on the arrival of the particle in their detectors. While OPAL’s bell rang proudly, ALEPH’s was silent, leading to a barrage of complaints before it became apparent that they were waiting for the collimators to close before turning on their subdetectors. As the luminosity rose during the subsequent period of machine studies the bells became extremely annoying and were switched off.
The first physics run began on 20 September 1989, with LEP’s total energy tuned for five days to the Z mass peak at 91 GeV, providing enough integrated luminosity to generate 1400 Zs in each experiment. A second period followed, this time with the energy scanned through the width of the Z at five different beam energies: at the peak and ±1 GeV and ±2 GeV to either side, allowing the experiments to measure the width of the Z resonance. First physics results were announced on 13 October, just three months after the final testing of the accelerator’s components (see LEP’s electroweak leap).
LEP dwelt at the Z peak from 1989 to 1995, during which time the four experiments each observed approximately 4.5 million Z decays. In 1995 a major upgrade dubbed LEP2 saw the installation of 288 superconducting cavities (figure 2), enabling LEP to sit at or near the WW threshold of 161 GeV for the following five years. The maximum beam energy reached was 104.4 GeV. There was also a continuous effort to increase the luminosity by increasing the number of bunches, reducing the emittance by adjusting the focusing, and squeezing the bunches more tightly at the interaction points, with LEP’s performance ultimately limited by the nonlinear forces of the beam–beam interaction – the perturbations of the beams as they cross the opposing beam. LEP surpassed every one of its design parameters (figure 3).
Life as a LEP accelerator physicist
Being an accelerator physicist at LEP took heart as well as brains. The Sisyphean daily task of coaxing the seemingly temperamental machine to optimal performance even led us to develop an emotional attachment to it. Challenges were unpredictable, such as for the engineers dispatched on a fact-finding mission to ascertain the cause of an electrical short circuit, only to discover two deer, “Romeo and Juliet”, locked in a lovers’ embrace having bitten through a cable, or the discovery of sabotage with beer bottles (see The bizarre episode of the bottles in the beampipe). The aim, however, was clear: inject as much current as possible into both beams, ramp the energy up to 45 GeV, squeeze the beam size down at the collision points, collide and then spend a few hours delivering events to the experiments. The reality was hours of furious concentration, optimisation, and, in the early days, frustrating disappointment.
In the early years, filling LEP was a delicate hour-long process of parameter adjustment, tweaking and coaxing the beam into the machine. On a good day we would see the beam wobble alarmingly on the UV telescopes, lose a bit and watch the rest struggle up the ramp. On a bad day, futile attempt after futile attempt, most of the beam would disappear without warning in the first few seconds of the ramp. The process used to last minutes and there was nothing you could do. We would stand there, watching the lifetime buck and dip, and the painstakingly injected beam would either slowly or quickly drift out of the machine. The price of failure was a turnaround and refill. Success brought the opportunity to chance the squeeze – an equally hazardous manoeuvre whereby the interaction-point focusing magnets were adjusted to reduce the beam size – and then perhaps a physics fill, and a period of relative calm. At this stage the focus would move to the experimental particle physicists on shift at the four experiments. Each had their own particular collective character, and their own way of dealing with us. We verged between being accommodating, belligerent, maverick, dedicated, professional and very occasionally hopelessly amateur – sometimes all within the span of a single shift, depending on the attendant pressures.
The experiment teams paraded their operational efficiency numbers – plus complaints or congratulations – at twice-weekly scheduling meetings. Well run and disciplined, ALEPH almost always had the highest efficiency figures, their appearances at scheduling meetings nearly always amounting to a simple statement of 97.8% or thereabouts. This was enlivened in later years by the repeated appearance of their coordinator Bolek Pietrzyk, who congratulated us each time we stepped up in energy or luminosity with a strong, Polish-accented, “Congratulations! You have achieved the highest energy electron–positron collisions in the universe!”, which was always gratifying. Equally professional, but more relaxed, was OPAL, which had a strong British and German contingent. These guys understood human nature. Quite simply, they bribed us. Every time we passed a luminosity target or hit a new energy record they’d turn up in the control room with champagne or crates of German beer. Naturally we’d do anything for them, happily moving heaven and earth to resolve their problems. L3 and DELPHI had their own quirks. DELPHI, for example, ran their detector as a “state machine”, whose status changed automatically based on signals from the accelerator control room. All well and good, but they depended on us to change the mode to “dump beam” at the end of a fill, something that was occasionally skipped, leaving DELPHI’s subdetectors on and their team ringing us desperately for a mode change. Baffled DELPHI students on shift would ask what was going on. Filling and ramping were demanding periods during the operational sequence and a lot of concentration was required. The experiment teams did well not to ring and make too many demands at this stage – requests were occasionally rebuffed with a brusque response.
On the verge of a great discovery?
LEP’s days were never fated to dwindle. Early on, CERN had a plan to install the LHC in the same tunnel, in a bid to scan ever higher energies and be the first to discover the Higgs boson. However, on 14 June 2000, LEP’s final year of scheduled running, the ALEPH experiment reported a possible Higgs event during operations at a centre-of-mass energy of 206.7 GeV. It was consistent with “Higgs-strahlung”, whereby a Z radiates a Higgs boson, which was expected to dominate Higgs-boson production in e+e– collisions at LEP2 energies. On 31 July and 21 August ALEPH reported second and third events corresponding to a putative reconstructed Higgs mass in the range 114–115 GeV.
The bizarre episode of the bottles in the beampipe
The story of the sabotage of LEP has grown in the retelling, but I was there in June 1996, hurrying back early from a conference to help the machine operators, who had been struggling to circulate a beam for several days. After exhausting other possibilities, it became clear that there was an obstruction in the vacuum pipe, and we detected the location using the beam position system. It appeared to be around point 1 (where ATLAS now sits), so we opened the vacuum seal and took a look inside the beampipe using mirrors and endoscopes. Not seeing anything, I frustratedly squeezed my head between the vacuum flanges and peered down inside the pipe. In the distance was something resembling a green concave lens. “This looks like the bottom of a beer bottle,” I thought, restraining myself from uttering a word to anyone in the vicinity. I went to the opposite open end of the vacuum section and peered into the vacuum pipe again: a green circular disk this time, but again, not a word. Someone got a long pole to poke out the offending article – out it came, and my guess was correct: it was a Heineken beer bottle, which had indeed refreshed the parts no other beer could reach, as the slogan ran. A hasty search revealed a second bottle. Upon closer inspection it was clear that the control room operators had almost succeeded in making the beam circulate despite the obstacles: there was a scorch burn along the label, indicating that they had almost managed to steer the beam past the bottles. If there had only been one they may have succeeded. The Swiss police interviewed me concerning this act of sabotage, but the culprit was never unmasked.
LEP was scheduled to stop in mid-September with two weeks of reserve time granted to the LEP experiments to see if new Higgs-like events would appear. After the reserve weeks, ALEPH requested two months more running to double its integrated luminosity. One was granted, yielding a 50% increase in the accumulated data, and ALEPH presented an update of their results on 10 October: the signal excess had increased to 2.6σ. Things were really heating up, and on 16 October L3 announced a missing-energy candidate. By now the accelerator team was pushing LEP to its limits, to squeeze out every ounce of physics data in the service of the experiments’ search for the elusive Higgs. At the LEP committee meeting on 3 November, ALEPH presented new data that confirmed their excess once again – it had now grown to 2.9σ. A request to extend LEP running by one year was made to the LEPC. There was gridlock, and no unanimous recommendation could be made.
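The significance figures quoted throughout this episode – 2.6σ, 2.9σ, and the conventional 5σ discovery threshold – correspond to one-sided Gaussian tail probabilities. A small sketch of the standard conversion (the function name is ours):

```python
import math

def p_value(n_sigma):
    """One-sided Gaussian tail probability for an excess of n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# ALEPH's evolving excess versus the conventional discovery threshold
for z in (2.6, 2.9, 5.0):
    print(f"{z} sigma -> p = {p_value(z):.2e}")
```

A 2.9σ excess corresponds to a chance probability of roughly 2 in 1000 – intriguing, but well short of the one-in-several-million standard demanded for a discovery, which is why the excess alone could not settle the debate over extending the run.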
All of CERN was discussing the proposed running of LEP in 2001 to get final evidence of a possible discovery of the Higgs boson. Arguments against included delays to the start of the LHC of up to three years. There was also concern that Fermilab’s Tevatron would beat the LHC to the discovery of the Higgs, and mundane but practical arguments about the transfer of human resources to the LHC and the impact on the materials budget, including electricity costs. The impending closure of LEP, when many of us thought we were about to discover the Higgs, was perceived as the death of a dear friend by most of the LEP-ers. After each of the public debates on the subject a group of us would meet in some local pub, drink a few beers, curse the disbelievers and cry on each other’s shoulders. This was the only “civil war” that I saw in my 43 years at CERN.
The CERN research board met again on 7 November and again there was deadlock, with the vote split eight votes to eight. The next day, then director-general Luciano Maiani announced that LEP had closed for the last time. It was a deeply unpopular decision, but history has shown it to be correct: the Higgs was discovered at the LHC 12 years later, with a mass of not 115 but 125 GeV. LEP’s closure allowed a massive redeployment of skilled staff, and the experience gained in running such a large accelerator went on to prove essential to the safe and efficient operation of the LHC.
When LEP was finally laid to rest we met one last time for an official wake (figure 4). After the machine was dismantled, requiring the removal to the surface of around 30,000 tonnes of material, some of the magnets and RF units were shipped to other labs for use in new projects. Today, LEP’s concrete magnet casings can still be seen scattered around CERN as shielding units for antimatter and fixed-target experiments, and even as road barriers.
LEP was the highest-energy e+e– collider ever built. Its legacy was, and remains, extremely important for present and future colliders: its luminosity, energy reach and energy calibration remain unsurpassed for an e+e– machine, and the quality and precision of its physics data make it the reference for any future e+e– ring-collider design.
In 2012 the CERN management asked a question: what is the largest circular machine that could be feasibly constructed in the Geneva region from a civil-engineering perspective? Teams quickly embarked on an extensive investigation of the geological, environmental and technical constraints in pursuit of the world’s largest accelerator. Such a machine would be the next logical step in exploring the universe at ever smaller scales.
Since construction of the 27 km circumference Large Hadron Collider (LHC) was completed in 2005, CERN has been looking at potential layouts for the tunnels that could house the next generation of particle accelerators. The Compact Linear Collider (CLIC) and the Future Circular Collider (FCC) are the two largest projects under consideration. With a circumference of 100 km, the FCC would require one of the world’s largest tunnels – almost twice as long as the recently completed 57 km Gotthard Base Tunnel in the Swiss Alps. Designing large infrastructure like the FCC tunnel requires the collection and interpretation of numerous data, which have to be balanced for the optimum level of risk, cost and project requirements.
The first and most important task in designing tunnels is to understand the needs and requirements of the users. For road or rail tunnels, this is relatively straightforward. For a cutting-edge scientific experiment, multi-disciplinary working groups are needed to identify the key criteria. The diameter of a new tunnel depends on what components would be inside – ventilation systems, magnets, lighting, transport corridors, etc – so they can fit in like a jigsaw.
Bespoke designs
Unlike other tunnelling projects, there are no standard rules or guidance for the design of particle-accelerator tunnels, meaning each design is, to a large extent, bespoke. One reason for this is the sensitivity of the equipment inside. Digging a 5.6 m-diameter hole disturbs rock that has been there for millennia, causing it to relax and to move. Modern tunnelling techniques can control these movements and get a tunnel to within a few centimetres of its intended design. For example, the two ends of the 27 km LEP ring came together with just 1 cm of error. It would be impossible for civil engineering alone to achieve the tolerances that the beamline requires, so the sensitive equipment installed in a completed accelerator tunnel must incorporate adjustable alignment systems into its design.
The city of Geneva sits on a large plateau between the Jura and Prealps mountains. The bedrock of the plateau is a competent (resistant to deformation) sedimentary rock, called molasse, which formed when eroded material was deposited and consolidated in a basin as the Alps lifted up. On top of the molasse sits a softer soil, called the moraines, which is made up of more recent, unconsolidated glacial deposits. The Jura itself is made of limestone rock, which while competent, is soluble and can form a network of underground voids, known as karsts.
We can never fully understand the ground before we start tunnelling and there is always the risk of encountering something unexpected, such as water, faults or obstructions. These cost money to overcome and/or delay the project; in the worst cases, they may even cause the tunnel to collapse. To help mitigate these risks and provide technical information for the tunnel design, we investigate the ground in the early stages of the project by drilling boreholes and testing ground samples. Like most things in civil engineering, however, there is a balance between the cost of the investigations versus the risks they mitigate. No boreholes have been sunk specifically for FCC yet, but we have access to a substantial amount of data from the LHC and from the Swiss and French authorities.
When CERN first posed the question in 2012, the answer was that a (quasi-)circular tunnel up to 100 km long could be built near Geneva (figure 1). This will need to be confirmed with further site investigations to verify the design assumptions and optimise a layout for the new machine. The FCC study considers two potential high-energy colliders, hadron–hadron and electron–positron, and the FCC would consist of a series of arcs and straight sections (figure 2). Depending on the choice of a future collider, civil-engineering designs for the FCC and/or CLIC will need to be developed further. Although the challenges of the two studies differ, the processes and tools used will be similar.
Optimising the alignment
Having determined the FCC’s feasibility, CERN’s civil engineers started designing the optimal route of the tunnel. Geology and topography are the key constraints on the tunnel position. Two alignment options were under consideration in 2012, both 80 km long, one located under the Jura Mountains and the other in the Geneva basin. When the FCC study officially kicked off in 2014, they were reviewed alongside a 47 km-circumference option fully excavated in the molasse.
Experience of tunnelling through Jura limestone during construction of the Large Electron Positron collider (LEP; from which the LHC inherited many of its tunnels) convinced civil engineers to discard the Jura option. Mining through the karstic limestone caused several delays and costly repairs after water and sediment flowed into the tunnel (see The greatest lepton collider). To this day, intensive maintenance works are needed between sectors 3 and 4 of the LHC tunnel and this has led to machine shutdowns lasting as long as two weeks.
By 2016, the proposed length of the FCC had increased to between 80 and 100 km to achieve higher energies, with two alignments under consideration: intersecting (which crosses the LHC in plan view) and non-intersecting. The former is the current baseline design. The tunnel is located primarily in the competent molasse rock and avoids the problematic Jura limestone and the Prealps. However, it does pass through the Mandallaz limestone formation and also has to cross under Lake Geneva. To deal with the wealth of topographical, geological and environmental data relevant for a 100 km ring, CERN developed an innovative tunnel optimisation tool (TOT) that lets us assess a multitude of alignment options in a fraction of the time that manual studies would take (see CERN’s tunnel optimisation tool).
In 2014, with the help of UK-based engineering consultancy Arup, CERN developed the tunnel optimisation tool (TOT) to integrate project requirements and data into a geospatial model. The web-based tool allows the user to digitally move the FCC tunnel, change its size, shape and depth and see, in real-time, the impacts of the changes on the design. Geology, surface constraints and environmentally protected areas are visualised, and parameters such as plane inclinations and tunnel depth can be changed at the click of a mouse. The tool warns users if certain limits are exceeded or obstacles are encountered, for example, if a shaft is in the middle of Lake Geneva! When it was built, TOT was the first of its kind within the industry. It has cut the cost of the civil-engineering design and has provided us with the flexibility to meet changing requirements to ultimately deliver a better project. The success of TOT led to its replication for CLIC and the International Linear Collider (ILC) under consideration in Japan. Recently, a TOT was built by Arup to quickly and cheaply assess a range of alignments for a 3 km tunnel under the ancient Stonehenge heritage site in the UK.
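The kind of real-time warning the tool raises, such as a shaft landing in the middle of Lake Geneva, can be sketched in miniature. The following toy example (not CERN's actual tool; the coordinates, polygon and shaft names are entirely hypothetical) flags any proposed shaft position that falls inside an exclusion zone, using a standard ray-casting point-in-polygon test:

```python
# Toy sketch of one TOT-style check: flag proposed shaft surface positions
# that fall inside an exclusion zone such as a lake. All coordinates and
# names below are hypothetical, for illustration only.

def point_in_polygon(pt, poly):
    """Ray-casting test: True if pt (x, y) lies inside the polygon."""
    x, y = pt
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        # Count crossings of a horizontal ray extending to the right of pt.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical lake outline (local coordinates, km) and candidate shafts.
lake = [(2.0, 1.0), (8.0, 1.0), (8.0, 4.0), (2.0, 4.0)]
shafts = {"PA": (0.5, 0.5), "PB": (5.0, 2.5), "PC": (9.0, 3.0)}

warnings = [name for name, pos in shafts.items() if point_in_polygon(pos, lake)]
print(warnings)  # → ['PB']
```

A production tool would of course test against real survey polygons for geology, protected areas and surface constraints, but each warning reduces to simple geometric queries like this one, which is why the checks can run interactively as the user drags the alignment around.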
The alignment of the FCC tunnel has been optimised based on three key criteria at this stage: geology (building in the competent molasse rock wherever possible); shaft depth (minimising the depth of shafts); and surface sites (choosing locations that minimise disruption to residents and the environment).
Despite the best efforts to avoid the risky Jura Mountains, the geology is not perfect. The Prealps region has complex, faulted geology and it is uncertain which layers the tunnel will cross. Cracks or faults, caused by tectonic movements of the Alps and Jura, can occur in the molasse and limestone. Excavation through the Mandallaz limestone can lead to issues similar to those encountered during LEP’s construction. Large, high-pressure water inflows are difficult and expensive to remedy, and can delay the programme.
To minimise the depth of the shafts, the entire FCC ring sits in an inclined plane, so its height above sea level varies around the ring. Modelling a range of alignment options at different locations and with different tunnel inclinations, constrained by the spacing requirements of the experiments, revealed that one shaft would be 558 m deep in the baseline design. The team therefore decided to replace the vertical shaft with an inclined tunnel (15% slope) that pops out of the side of the mountain.
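The geometry behind this trade-off can be sketched with a toy model. Assuming a perfectly circular ring lying in a tilted plane under a (hypothetical) flat ground surface, the shaft depth at each point is just surface elevation minus tunnel elevation, and tilting the plane shifts depth from one side of the ring to the other. None of the numbers below are real FCC parameters:

```python
import math

# Toy model (not the actual FCC design tool): a circular ring of radius
# radius_m lying in a plane tilted by dip_deg, under a flat surface.
# All parameter values are illustrative only.

def tunnel_elevation(theta, base_elev_m, dip_deg, radius_m):
    """Elevation of the point at ring angle theta on the tilted plane."""
    # On a tilted plane, height varies sinusoidally around the ring.
    return base_elev_m + radius_m * math.tan(math.radians(dip_deg)) * math.cos(theta)

def shaft_depths(surface_elev_m, base_elev_m, dip_deg, radius_m, n=360):
    """Depth below a flat surface at n equally spaced points on the ring."""
    return [surface_elev_m - tunnel_elevation(2 * math.pi * i / n, base_elev_m, dip_deg, radius_m)
            for i in range(n)]

# A ~100 km ring (16 km radius), plane centred 300 m below a flat surface,
# tilted by just 1 degree: the deepest shaft grows to well over 500 m.
depths = shaft_depths(surface_elev_m=400.0, base_elev_m=100.0, dip_deg=1.0, radius_m=16000.0)
print(round(min(depths)), round(max(depths)))  # → 21 579
```

Even a one-degree tilt swings the depth by several hundred metres over a 16 km radius, which is why the optimisation pushes the deep side towards terrain where an inclined access tunnel can replace an impractically deep vertical shaft.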
The presence of Lake Geneva influences the overall depth of the FCC, and the tunnel optimisation tool tells us that it isn’t possible to avoid tunnelling under the lake within the study boundary. Modern tunnelling techniques open up different options for crossing the lake, instead of simply digging deeper until we reach the rock (figure 3). Several options were considered, even including an option to build a hybrid particle accelerator-road tunnel in an immersed tube tunnel (which was later scrapped because of potential vibrations caused by traffic disrupting the beamline). The current design compromises on a mid-depth tunnel passing through the permeable moraines on the lake bed.
At the bottom of some of the FCC shafts are large experimental caverns with spans of up to 35 m. To determine the best arrangement for experimental and service caverns, Amberg Engineering carried out a stress analysis (figure 4). Although for data-acquisition purposes it is often desirable to have the two caverns as close as possible to each other, the analysis showed that it would be prohibitively expensive to build a 10 m concrete wall between the caverns. The cheaper option is to use the existing rock as a natural pillar, which would require a minimum spacing of 45 m.
Tunnelling inevitably disturbs the surrounding area. The beamline of the LHC is incredibly sensitive and can detect even the smallest vibrations from the outside world. This was a potential issue for the construction works currently under way for the High-Luminosity LHC project: the contractor had to improvise, replacing the diesel engine of a standard excavator with an electric motor to eliminate engine vibrations. The programme was also adapted so that only the shafts were constructed during operation of the LHC, leaving the more disruptive cavern construction until the start of the current shutdown.
Securing the future
CERN currently has 83 km of underground structures. The FCC would add over 100 km of tunnels, 3720 m of shafts, 26 caverns (not including junction caverns) and 66 alcoves, with up to 30 km between the Meyrin campus and the furthest site. The civil-engineering cost estimate for the FCC (carried out by ILF Consulting Engineers) is approximately 6 billion Swiss francs – 45% for tunnels and the rest for shafts, caverns and surface facilities – and benefits from significant advances in tunnelling technology since the LEP-tunnel days (see Advances in civil engineering since the LEP days).
It has been almost 35 years since three tunnel boring machines (TBMs) set off to carve out the 27 km-long hole that would house LEP and, later, the LHC. Contrary to the recent claims of tech entrepreneur Elon Musk, the technology used to construct modern tunnels has been quietly and rapidly advancing since the construction of LEP, providing a faster, safer and more versatile way to build tunnels. TBMs act as a mobile factory that simultaneously excavates rock from the face and builds a tunnel lining from prefabricated segments behind it. The outer shield of the machine protects workers from falling rock, making sure they are never working in unsupported ground.
One of the main advances in TBM technology is their ability to cope with variable ground conditions. Most of the LEP tunnels were constructed in dry, competent rock, meaning the excavation face needed little support to stand up. Underneath the Jura Mountains, however, pockets of water and soil form where the limestone dissolves into karsts. When a TBM hits these, the water can flow into the tunnels, causing flooding and, at worst, tunnel collapse. Modern TBMs come with a variety of face-support measures, including earth-pressure-balance machines that use the excavated soil to press back against the tunnel face for support. Herrenknecht’s Mixshield TBM (above) could be used to tunnel the FCC under Lake Geneva, where water-bearing moraines are encountered.
Segmental linings can be constructed off-site in a factory, improving quality, speed and safety. The segments are assembled in the rear of the TBM immediately after excavation. The segments can be fitted with a rubber gasket, which provides a waterproof seal, eliminating the need for the traditional secondary lining. Across the 100 km of the FCC, this will lead to substantial cost savings.
Seismic and sonic scanners can be mounted to the front of the TBM, allowing operators to detect voids or obstacles up to 40 m ahead and adjust their approach accordingly. Probe drilling and pre-support measures can also be implemented from within the machine, keeping the mining crew safe and minimising delays to the construction programme.
For vertical shafts, the vertical shaft sinking machine and shaft boring machine are the latest technological breakthroughs, taking all the technology of a TBM and standing it on its end. The giant rig hangs off a crane and excavates below the platform, whilst building a lining above it. The machine can even work underwater to stabilise the shafts during construction.
Traditional drill-and-blast tunnelling, which is useful for creating non-standard shapes or smaller tunnels like the FCC’s experimental caverns, has come a long way, too. The explosives aren’t the normal sticks of dynamite you see in films or cartoons: highly stable charges are slotted precisely into holes drilled by a giant rig with multiple arms for speed. The electric detonators can be configured to the millisecond for complex patterns of explosions that give tunnellers precise control of the shape, speed and quality of the excavation.
The safety of the underground areas is critical to ensure the safe and continued operation of the experiments, and CERN has developed advanced tools to inspect the structures – some of which are more than 60 years old. Manually inspecting the condition of the structures on the scale of the FCC will become extremely challenging. We are therefore developing new technologies that will allow us to monitor the condition of the tunnels remotely. Currently, teams are testing out how fibre-optic cables can be attached to the concrete linings to measure movements over time, and developing and training algorithms to be able to spot and characterise faults in the tunnel lining. In the future, the software will be able to measure these faults and compare the changes with previous inspections to assess how they have progressed. To capture these images, a Tunnel Inspection Machine, which runs on the monorail in the roof of the LHC, and a floor-roving inspection robot have both been tested to collect images and data, even when the tunnel is not safe for humans. These images can be rebuilt in a 3D environment and viewed through a virtual-reality headset.
Projects like the FCC and CLIC are not just exciting for physicists. For civil engineers they represent challenges that demand new ideas and technology. At the annual World Tunnel Congress, attended by more than 2000 leading tunnel and underground-space experts, CERN’s FCC has already generated great interest. If approved, it would be one of the largest construction projects science has ever seen, bequeathing a tunnel that would serve fundamental exploration into the next century.