The sixth Cosmology and the Quantum Vacuum conference attracted about 60 theoreticians to the Institute of Space Sciences in Barcelona from 5 to 7 March. This year the conference marked Spanish theorist Emilio Elizalde's 70th birthday. He is a well-known specialist in mathematical physics, field theory and gravity, with over 300 publications and three monographs on the Casimir effect and zeta regularisation, and he has co-authored notable works on viable theories of modified gravity that unify inflation with dark energy.
These meetings bring together researchers who study theoretical cosmology and various aspects of the quantum vacuum, such as the Casimir effect. This quantum effect manifests itself as an attractive force between uncharged plates placed extremely close to each other. Being rooted in the quantum vacuum, it is expected to be important in cosmology as well, giving a kind of effective induced cosmological constant. Manuel Asorey (Zaragoza), Mike Bordag (Leipzig) and Aram Saharian (Erevan) discussed various aspects of the Casimir effect for scalar fields and for gauge theories. Joseph Buchbinder reviewed the one-loop effective action in supersymmetric gauge theories. Conformal quantum gravity and quantum electrodynamics in de Sitter space were presented by Enrique Alvarez (Madrid) and Drazen Glavan (Brussels), respectively.
Enrique Gaztanaga argued for two early inflationary periods
Even more attention was paid to theoretical cosmology. The evolution of the early and/or late universe in different theories of modified gravity was discussed by several delegates, with Enrique Gaztanaga (Barcelona) expressing an interesting point of view on the inflationary universe, arguing for two early inflationary periods.
Martiros Khurshyadyan and I discussed modified-gravity cosmology with the unification of inflation and dark energy, as well as wormholes, building on work with Emilio Elizalde. Wormholes are usually associated with exotic matter; in alternative gravity, however, they may instead arise from modifications to the gravitational equations of motion. Iver Brevik (Trondheim) gave an excellent introduction to viscosity in cosmology. Rather exotic wormholes were presented by Sergey Sushkov (Kazan), while black holes in modified gravity were discussed by Gamal Nashed (Cairo). A fluid approach to the dark-energy epoch was described by Diego Saez (Valladolid), while Mariam Bouhmadi-Lopez (Bilbao) discussed the addition of four-forms (antisymmetric tensor fields with four indices) to late-universe evolution. Novel aspects of non-standard quintessential inflation were presented by Jaime Haro (Barcelona).
Many interesting talks were given by young participants at this meeting. The exchange of ideas between cosmologists on the one side and quantum-field-theory specialists on the other will surely help in the further development of rigorous approaches to the construction of quantum gravity. It also opens the window onto a much better account of quantum effects in the history of the universe.
In a former newspaper printing plant in the southern Dutch town of Maastricht, the future of gravitational-wave detection is taking shape. In a huge hall, known to locals as the "big black box", construction of a facility called ETpathfinder has just got under way, with the first experiments due to start as soon as next year. ETpathfinder will be a testing ground for the new technologies needed to detect gravitational waves in frequency ranges that the present generation of detectors cannot cover. At the same time, plans are being developed for a full-scale gravitational-wave detector, the Einstein Telescope (ET), in the Dutch–Belgian–German border region. Related activities are taking place 1500 km to the south, in the heart of Sardinia, Italy. In 2023, one of these two sites (shortlisted from a total of six possible European locations) will be chosen as the location of the proposed ET.
In 2015, the Laser Interferometer Gravitational-Wave Observatory (LIGO), which is based at two sites in the US, made the first direct detection of a gravitational wave. The Virgo observatory near Pisa in Italy came online soon afterwards, and the KAGRA observatory in Japan is about to become the third major gravitational-wave observatory in operation. All are L-shaped laser interferometers that detect relative differences in light paths between mirrors spaced far apart (4 km in LIGO; 3 km in Virgo and KAGRA) at the ends of two perpendicular vacuum tubes. A passing gravitational wave changes the relative path lengths by as little as one part in 10²¹, which is detectable via the interference between the two light paths. Since 2015, dozens of gravitational waves have been detected from various sources, providing a new window onto the universe. One event has already been linked to astronomical observations in other channels, marking a major step forward in multi-messenger astronomy (CERN Courier December 2017 p17).
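For a sense of scale (an illustrative back-of-the-envelope number using the round figures above, not the observatories' detailed noise budgets), a strain of one part in 10²¹ over a 4 km arm corresponds to a length change far smaller than an atomic nucleus:

```latex
% Illustrative strain-to-displacement conversion for a LIGO-length arm
\Delta L \;=\; h\,L \;\approx\; 10^{-21}\times 4\,\mathrm{km} \;\approx\; 4\times10^{-18}\,\mathrm{m}
```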
Back in time
The ET would be at least 10 times more sensitive than Advanced LIGO and Advanced Virgo, extending the scope for detections and enabling physicists to look back much further in cosmological time. To reach this sensitivity, the interferometer has to be built at least 200 m underground in a geologically stable area, its mirrors have to operate in cryogenic conditions to reduce thermal disturbance, and they have to be larger and heavier to accommodate a larger and more powerful laser beam. The ET would be a triangular laser interferometer with 10 km sides and four ultra-high-vacuum tubes per tunnel. The triangular configuration is equivalent to three overlapping interferometers with two arms each, allowing sources to be pinpointed in the sky via triangulation from a single location, rather than from the several locations needed by existing observatories. First proposed more than a decade ago and estimated to cost close to €2 billion, the ET, if approved, is expected to start looking at the sky sometime in the 2030s.
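A commonly quoted geometric aside (stated here as a rough guide, not a figure taken from the ET design study): the strain response of a two-arm interferometer scales with the sine of the angle between its arms, so each 60° arm pair of the triangle retains most of the sensitivity of a right-angled "L", while the three overlapping pairs together recover the rest and add the pointing information mentioned above.

```latex
% Relative response of a 60-degree arm pair versus a 90-degree L-shaped interferometer
\frac{\mathcal{R}(60^{\circ})}{\mathcal{R}(90^{\circ})} \;=\; \frac{\sin 60^{\circ}}{\sin 90^{\circ}} \;\approx\; 0.87
```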
“In the next decade we will implement new technologies in Advanced Virgo and Advanced LIGO, which will enable about a factor-two increase in sensitivity, gaining in detection volume too, but we are reaching the limits of the infrastructure hosting the detectors, and it is clear that at a certain point these will strongly limit the progress you can make by installing new technologies,” explains Michele Punturo of INFN Perugia, who is co-chair of the international committee preparing the ET proposal. “The ET idea and its starting point is to have a new infrastructure capable of hosting further and further evolutions of the detectors for decades.”
Belgian, Dutch and German universities are investing heavily in the ETpathfinder project, which is also funded by European Union budgets for interregional development, and are considering a bid for the ET in the rolling green hills of the border region around Vaals between Maastricht (Netherlands) and Luik (Belgium). A geological study in September 2019 concluded that the area has a soft-soil top layer that provides very good environmental noise isolation for a detector built in granite-like layers 200 m below. Economic studies also show a net benefit, both regional and national, from the high-tech infrastructure the ET would need. But even if ET is not built there, ETpathfinder will still be essential to future gravitational-wave detection, stresses project leader Stefan Hild of Maastricht University. "This will become the testing ground for the disruptive technologies we will need in this field anyway," he says.
ET in search of home
ETpathfinder is a research infrastructure, not a scale model of the future ET. Its short arms mean that it is not intended to detect gravitational waves itself. The L-shaped apparatus ("Triangulating for the future" image) has two arms about 20 m long, each with two large steel suspension towers containing the mirrors. The arms meet in a fifth, central steel optical tower, and one of the tubes extends behind the central tower, ending in a sixth tower. The whole facility will be housed in a new climate-controlled clean room inside the hall, on a new low-vibration concrete floor. ETpathfinder is not a single interferometer but two separate research facilities joined at one point to share instrumentation and support systems. The two arms could be used to test different mirrors, suspensions, temperatures or laser frequencies independently. Those are the parameters Hild and his team are focusing on to further reduce noise in the interferometers and enhance their sensitivity.
Deep-cooling the mirrors is one way to beat noise, says Hild, but it also brings huge new challenges. One is that the thermal conductivity of silica glass is poor at deep cryogenic temperatures, leading to deformations caused by local laser heating. Pure silicon has to be used instead, but silicon is not transparent to the conventional 1064 nm laser light used to detect gravitational waves and to align the optical systems in the detector. Instead, a whole new laser technology at 1550 nm will have to be developed and tested, including fibre-laser sources, beam control and manipulation, and specialised low-noise sensors. "All these key technologies and more need testing before they can be scaled up to the 10 km scales of the future ET," says Hild. Metre-sized mirrors in pure silicon have never been built, he points out, nor have silicon wire suspensions for cryogenic payloads of more than half a tonne. Optoelectronics and sensors at 1550 nm with the noise performance required for gravitational-wave detectors are also non-standard.
On paper, the new super-low noise detection technologies to be investigated by ETpathfinder will provide stunning new ways of looking at the universe with the ET. The sensitivity at low frequencies will enable researchers to actually hear the rumblings of space–time hours before spiralling black holes or neutron stars coalesce and merge. Instead of astronomers struggling to point their telescopes at the point in the sky indicated by millisecond chirps in LIGO and Virgo, they will be poised to catch the light from cosmic collisions many billions of light years away.
The Archimedes experiment, which will be situated under 200 m of rock at the Sar-Grav laboratory in the Sos Enattos mine in Sardinia, was conceived in 2002 to investigate the interaction between the gravitational field and vacuum fluctuations. Supported by a group of about 25 physicists from Italian institutes and the European Gravitational Observatory, it is also intended as a “bridge” between present- and next-generation interferometers. A separate project in the Netherlands, ETpathfinder, is performing a similar function (see main text).
Quantum mechanics predicts that the vacuum is a sea of virtual particles which contribute an energy density – although one that is tens of orders of magnitude larger than what is observed. Archimedes will attempt to shed light on the puzzle by clarifying whether virtual photons gravitate or not, essentially testing the equivalent of Archimedes’ principle in vacuum. “If the virtual photons do gravitate then they must follow the gravitational field around the Earth,” explains principal investigator Enrico Calloni of the University of Naples Federico II. “If we imagine removing part of them from a certain volume, creating a bubble, there will be a lack of weight (and pressure differences) in that volume, and the bubble will sense a force directed upwards, similar to the Archimedes force in a fluid. Otherwise, if they do not gravitate, the bubble will not experience any variation in the force even being immersed in the gravitational field.”
The experiment (pictured) will use a Casimir cavity comprising two metallic plates placed a short distance apart so that virtual photons that have too large a wavelength cannot survive and are expelled, enabling Archimedes to measure a variation of the “weight” of the quantum vacuum. Since the force is so tiny, the measurement must be modulated and performed at a frequency where noise is low, says Calloni. This will be achieved by modulating the vacuum energy contained in the cavity using plates made from a high-temperature superconductor, which exhibits transitions from a semiconducting to superconducting state and in doing so alters the reflectivity of the plates. The first prototype is ready and in March the experiment is scheduled to begin six years of data-taking. “Archimedes is a sort of spin-off of Virgo, in the sense that it uses many of the technologies learned with Virgo: low frequency, sensors. And it has a lot of requirements in common with third-generation interferometers like ET: cryogenics and low seismic noise, first and foremost,” explains Calloni. “Being able to rely on an existing lab with the right infrastructure is a very strong asset for the choice of a site for ET.”
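A back-of-the-envelope estimate (not the collaboration's own analysis; it assumes ideal plates and an illustrative separation of a = 1 μm) shows why the signal must be modulated rather than measured statically: the textbook Casimir energy per unit area, "weighed" via E = mc², corresponds to an almost absurdly small force per unit area.

```latex
% Ideal-plate Casimir energy per unit area and its naive gravitational "weight"
\frac{E_C}{A} \;=\; -\,\frac{\pi^{2}\hbar c}{720\,a^{3}} \;\approx\; -4\times10^{-10}\ \mathrm{J\,m^{-2}}
\quad (a = 1\,\mu\mathrm{m}),
\qquad
\frac{|F|}{A} \;\sim\; \frac{g\,|E_C|}{c^{2}A} \;\approx\; 5\times10^{-26}\ \mathrm{N\,m^{-2}}
```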
Sardinian adventure
The Sos Enattos mine is situated in the wild and mountainous heart of Sardinia, an hour's drive from the Mediterranean coast. More than 2000 years ago, the Romans (who, having had a hard time conquering the land, christened the region "Barbaria") excavated around 50 km of underground tunnels to extract lead for their aqueduct pipes. Until it ceased activity in 1996, the mine was for decades the only alternative to livestock-rearing in this area. Today, the locals are hoping that Sos Enattos will be chosen as the site to host the ET. Since 2010, several underground measurement campaigns have been carried out to characterise the site in terms of environmental noise. The regional government of Sardinia is supporting the development of the "Sar-Grav" underground laboratory and its infrastructure with approximately €3.5 million, while the Italian government is supporting the upgrade of Advanced Virgo and the characterisation of the Sos Enattos site with about €17 million, as part of a strategy to make Sardinia a possible site for the ET.
Sar-Grav’s control room was completed late last year, and its first experiment – Archimedes – will soon begin (see “Archimedes weighs in on the quantum vacuum” panel), with others expected to follow. Archimedes will measure the effect of quantum interactions with gravity via the Casimir effect and, at the same time, provide a testbed to verify the technologies needed by a third-generation gravitational-wave interferometer such as the ET. “Archimedes has the same requirements as an underground interferometer: extreme silence, extreme cooling with liquid nitrogen, and the ensuing safety requirements,” explains Domenico D’Urso, a physicist from the University of Sassari and INFN.
Follow the noise
Sardinia is the oldest land in Italy and the only part of the country without significant seismic risk. The island also has a very low population density and thus little human activity. The Sos Enattos mine has very low seismic noise and some of the most resistant granitic rock, which was used until the 1980s to build the skyscrapers of Manhattan. Walking along the mine's underground tunnels – past the Archimedes cavern, amid veins of schist, quartz, gypsum and granite, ancient mining machines and giant portraits of miners bearing witness to a glorious past – an array of instruments can be seen measuring seismic noise, some of them so sensitive that they record the sound of waves washing against the shores of the Tyrrhenian sea. "We are talking about really small sensitivities," continues D'Urso. "An interferometer needs to be able to perform measurements at the level of 10⁻²¹, otherwise you cannot detect a gravitational wave. You have to know exactly what your system is doing, follow the noise and learn how to remove it."
With the Einstein Telescope, we have 50 years of history ahead
The open European ET collaboration will spend the next two years characterising both the Sardinian and Netherlands sites, and then choosing which best matches the required parameters. In the current schedule, a technical design report for the ET would be completed in 2025 and, if approved, construction would take place from 2026 with first data-taking during the 2030s. "As of then, wherever it is built, ET will be our facility for decades, because its noise will be so low that any new technology that at present we cannot even imagine could be implemented and not be limited," says Punturo, emphasising the scientific step-change. Current detectors can see black-hole mergers occurring at a redshift of around one, when the universe was six billion years old, Punturo explains, while at their final sensitivity they will reach a redshift of around two, corresponding to three billion years after the Big Bang. "But we want to observe the universe in its dark age, before stars existed. To do so, we need to increase sensitivity to a redshift tenfold and more," he says. "With ET, we have 50 years of history ahead. It will study events from the entire universe. Gravitational waves will become a common tool just like conventional astronomy has been for the past four centuries."
Ten years have passed since the first high-energy proton–proton collisions took place at the Large Hadron Collider (LHC). Almost 20 more are foreseen for the completion of the full LHC programme. The data collected so far, from approximately 150 fb⁻¹ of integrated luminosity over two runs (Run 1 at centre-of-mass energies of 7 and 8 TeV, and Run 2 at 13 TeV), represent a mere 5% of the anticipated 3000 fb⁻¹ that will eventually be recorded. But already their impact has been monumental.
Three major conclusions can be drawn from these first 10 years. First and foremost, Run 1 has shown that the Higgs boson – the previously missing, last ingredient of the Standard Model (SM) – exists. Secondly, the exploration of energy scales as high as several TeV has further consolidated the robustness of the SM, providing no compelling evidence for phenomena beyond the SM (BSM). Nevertheless, several discoveries of new phenomena within the SM have emerged, underscoring the power of the LHC to extend and deepen our understanding of SM dynamics, and showing the unparalleled diversity of phenomena that the LHC can probe with unprecedented precision.
Exceeding expectations
Last but not least, 10 years of LHC operations, data-taking and data interpretation have overwhelmingly surpassed all of our most optimistic expectations. The accelerator has delivered a larger-than-expected luminosity, and the experiments have been able to operate at the top of their ideal performance and efficiency. Computing, in particular via the Worldwide LHC Computing Grid, has been another crucial driver of the LHC's success. Key ingredients of precision measurements, such as the determination of the LHC luminosity, of detection efficiencies and of backgrounds, have been obtained with novel and powerful data-driven techniques, to a precision beyond anyone's expectations. The LHC has also successfully provided a variety of beam and optics configurations, matching the needs of different experiments and supporting a broad research programme. In addition to the core high-energy goals of the ATLAS and CMS experiments, this has enabled new studies of flavour physics and hadron spectroscopy, of forward-particle production and of total hadronic cross sections. The operations with beams of heavy nuclei have reached a degree of virtuosity that made it possible to collide not only the anticipated lead beams, but also beams of xenon, as well as proton–lead, photon–lead and photon–photon combinations, opening the way to a new generation of studies of matter at high density.
Theoretical calculations have evolved in parallel with the experimental progress. Calculations that were deemed impossibly complex before the start of the LHC have matured and become reality. Next-to-leading-order (NLO) theoretical predictions are routinely used by the experiments, thanks to a new generation of automated tools. The next frontier, next-to-next-to-leading order (NNLO), has been attained for many important processes, reaching in a few cases the next-to-next-to-next-to-leading order (N3LO), and more is coming.
Aside from having made these first 10 years an unconditional success, all these ingredients are the premise for confident extrapolations of the physics reach of the LHC programme to come.
To date, more than 2700 peer-reviewed physics papers have been published by the seven running LHC experiments (ALICE, ATLAS, CMS, LHCb, LHCf, MoEDAL and TOTEM). Approximately 10% of these are related to the Higgs boson, and 30% to searches for BSM phenomena. The remaining 1600 or so report measurements of SM particles and interactions, enriching our knowledge of the proton structure and of the dynamics of strong interactions, of electroweak (EW) interactions, of flavour properties, and more. In most cases, the variety, depth and precision of these measurements surpass those obtained by previous experiments using dedicated facilities. The multi-purpose nature of the LHC complex is unique, and encompasses scores of independent research directions. Here it is only possible to highlight a fraction of the milestone results from the LHC’s expedition so far.
Entering the Higgs world
The discovery by ATLAS and CMS of a new scalar boson in July 2012, just two years into LHC physics operations, was a crowning early success. Not only did it mark the end of a decades-long search, but it opened a new vista of exploration. At the time of the discovery, very little was known about the properties and interactions of the new boson. Eight years on, the picture has come into much sharper focus.
The structure of the Higgs-boson interactions revealed by the LHC experiments is still incomplete. Its couplings to the gauge bosons (W, Z, photon and gluons) and to the heavy third-generation fermions (bottom and top quarks, and tau leptons) have been detected, with a precision at best in the range of 5–10%. But the LHC findings so far have been key to establishing that this new particle correctly embodies the main observational properties of the Higgs boson, as specified by the Brout–Englert–Guralnik–Hagen–Higgs–Kibble EW-symmetry-breaking mechanism, referred to hereafter as "BEH", a cornerstone of the SM. To start with, the measured couplings to the W and Z bosons reflect the Higgs' EW charges and are proportional to the W and Z masses, consistent with the properties of a scalar field breaking the SM EW symmetry. The mass dependence of the Higgs interactions with the SM fermions is confirmed by the recent ATLAS and CMS observations of the H → bb and H → ττ decays, and of the associated production of a Higgs boson together with a tt quark pair (see figure 1).
These measurements, which during Run 2 of the LHC surpassed the five-sigma significance level, provide the second critical confirmation that the Higgs fulfils the role envisaged by the BEH mechanism. The Higgs couplings to the photon and the gluon (g), which the LHC experiments have probed via the H → γγ decay and gg → H production, provide a third, subtler test. These couplings arise from a combination of loop-level interactions with several SM particles, whose interplay could be modified by the presence of BSM particles or interactions. The current agreement with data provides a strong validation of the SM scenario, while leaving open the possibility that small deviations could emerge with higher statistics in the future.
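As a reminder of what these coupling measurements actually test (standard tree-level SM relations, quoted here with the vacuum expectation value v ≈ 246 GeV; they are not new LHC results), the Higgs couplings are fixed entirely by the particle masses:

```latex
% SM tree-level Higgs couplings, with v \simeq 246 GeV
y_f \;=\; \frac{\sqrt{2}\,m_f}{v}, \qquad
g_{HWW} \;=\; \frac{2\,m_W^{2}}{v}, \qquad
g_{HZZ} \;=\; \frac{2\,m_Z^{2}}{v}
```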
The process of firmly establishing the identification of the particle discovered in 2012 with the Higgs boson goes hand-in-hand with two research directions pioneered by the LHC: seeking the deep origin of the Higgs field and using the Higgs boson as a probe of BSM phenomena.
The breaking of the EW symmetry is a fact of nature, requiring the existence of a mechanism like BEH. But, if we aim beyond a merely anthropic justification for this mechanism (i.e. that, without it, physicists wouldn’t be here to ask why), there is no reason to assume that nature chose its minimal implementation, namely the SM Higgs field. In other words: where does the Higgs boson detected at the LHC come from? This summarises many questions raised by the possibility that the Higgs boson is not just “put in by hand” in the SM, but emerges from a larger sector of new particles, whose dynamics induces the breaking of the EW symmetry. Is the Higgs elementary, or a composite state resulting from new confining forces? What generates its mass and self-interaction? More generally, is the existence of the Higgs boson related to other mysteries, such as the origin of dark matter (DM), of neutrino masses or of flavour phenomena?
The Higgs boson is becoming an increasingly powerful exploratory tool to probe the origin of the Higgs itself
Ever since the Higgs-boson discovery, the LHC experiments have been searching for clues to address these questions, exploring a large number of observables. All of the dominant production channels (gg fusion, associated production with vector bosons and with top quarks, and vector-boson fusion) have been discovered, and the decay rates to WW, ZZ, γγ, bb and ττ have been measured. A theoretical framework (effective field theory, EFT) has been developed to interpret all these measurements in a global fashion, setting strong constraints on possible deviations from the SM. With the larger data set accumulated during Run 2, the production properties of the Higgs have been studied in greater detail, simultaneously testing the accuracy of theoretical calculations and the resilience of SM predictions.
To explore the nature of the Higgs boson, what has not yet been seen can be as important as what has. For example, the lack of evidence for Higgs decays to fermions of the first and second generation is consistent with the SM prediction that these should be very rare. The H → μμ decay rate is expected to be about 3 × 10⁻³ times that of H → ττ; the current sensitivity falls short of this by about a factor of two, and ATLAS and CMS hope to observe this decay for the first time during the forthcoming Run 3, testing the couplings of the Higgs boson to second-generation fermions. The SM Higgs boson is expected to conserve flavour, making decays such as H → μτ, H → eτ or t → Hc too small to be seen. Their observation would be a major revolution in physics, but no evidence has shown up in the data so far. Decays of the Higgs to invisible particles could be a signal of DM candidates, and the constraints set by the LHC experiments are complementary to those from standard DM searches. Several BSM theories predict the existence of heavy particles decaying to a Higgs boson. For example, heavy top partners, T, could decay as T → Ht, and heavy bosons X as X → HV (V = W, Z). Heavy scalar partners of the Higgs, such as charged Higgs states, are expected in theories such as supersymmetry. Extensive and thorough searches for all these phenomena have been carried out, setting strong constraints on SM extensions.
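The factor of about 3 × 10⁻³ quoted above follows directly from the mass-proportional Higgs couplings discussed earlier; the following is a quick consistency check using the known lepton masses (small phase-space and radiative corrections neglected), not a precision prediction:

```latex
\frac{\Gamma(H\to\mu\mu)}{\Gamma(H\to\tau\tau)} \;\simeq\; \frac{m_\mu^{2}}{m_\tau^{2}}
\;=\; \left(\frac{0.106\ \mathrm{GeV}}{1.777\ \mathrm{GeV}}\right)^{\!2} \;\approx\; 3.5\times10^{-3}
```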
As the programme of characterising the Higgs properties continues, with new challenging goals such as the measurement of the Higgs self-coupling through the observation of Higgs pair production, the Higgs boson is becoming an increasingly powerful exploratory tool to probe the origin of the Higgs itself, as well as a variety of solutions to other mysteries of particle physics.
Interactions weak and strong
The vast majority of LHC processes are controlled by strong interactions, described by the quantum-chromodynamics (QCD) sector of the SM. The predictions of production rates for particles like the Higgs or gauge bosons, top quarks or BSM states rely on our understanding of the proton structure, in particular of the energy distribution of its quark and gluon components (the parton distribution functions, PDFs). The evolution of the final states, the internal structure of the jets emerging from quarks and gluons, and the kinematic correlations between different objects are all governed by QCD. LHC measurements have been critical, not only to consolidate our understanding of QCD in all its dynamical domains, but also to improve the precision of the theoretical interpretation of data, and to increase the sensitivity to new phenomena and to the production of BSM particles.
Collisions galore
Approximately 10⁹ proton–proton (pp) collisions take place each second inside the LHC detectors. Most of them bear no obvious direct interest for the search for BSM phenomena, but even simple elastic collisions, pp → pp, which account for about 30% of this rate, have so far defied a full description based on first-principles QCD calculations. The ATLAS ALFA spectrometer and the TOTEM detector have studied these high-rate processes, measuring the total and elastic pp cross sections at the various beam energies provided by the LHC. The energy dependence of the relation between the real and imaginary parts of the pp forward-scattering amplitude has revealed new features, possibly described by the exchange of the so-called odderon, a coherent state of three gluons predicted in the 1970s.
The structure of the final states in generic pp collisions, aside from defining the large background of particles that are superimposed on the rarer LHC processes, is of potential interest to understand cosmic-ray (CR) interactions in the atmosphere. The LHCf detector measured the forward production of the most energetic particles from the collision, those driving the development of the CR air showers. These data are a unique benchmark to tune the CR event generators, reducing the systematics in the determination of the nature of the highest-energy CR constituents (protons or heavy nuclei?), a step towards solving the puzzle of their origin.
On the opposite end of the spectrum, rare dijet events with masses up to 9 TeV have been observed by ATLAS and CMS. The study of their angular distribution, a Rutherford-like scattering experiment, has confirmed the point-like nature of quarks down to 10⁻¹⁸ cm. The overall set of production studies, including gauge bosons, jets and top quarks, underpins countless analyses. Huge samples of top-quark pairs, produced at 15 Hz, enable the surgical scrutiny of this mysteriously heavy quark through its production and decays. New reactions, unobservable before the LHC, have been detected for the first time. Gauge-boson scattering (e.g. W⁺W⁺ → W⁺W⁺), a key probe of electroweak symmetry breaking proposed in the 1970s, is just one example. By and large, all data show an extraordinary agreement with theoretical predictions resulting from decades of innovative work (figure 2). Global fits to these data refine the proton PDFs, improving the predictions for the production of Higgs bosons or BSM particles.
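The distance scale quoted for the quark-substructure test can be recovered from a simple uncertainty-principle estimate (illustrative, using ħc ≈ 0.197 GeV fm and the 9 TeV dijet masses mentioned above, rather than the experiments' full statistical analysis):

```latex
\lambda \;\sim\; \frac{\hbar c}{E} \;\approx\; \frac{0.197\ \mathrm{GeV\,fm}}{9000\ \mathrm{GeV}}
\;\approx\; 2\times10^{-5}\ \mathrm{fm} \;=\; 2\times10^{-18}\ \mathrm{cm}
```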
The cross sections σ of W and Z bosons provide the most precise QCD measurements, reaching a 2% systematic uncertainty, dominated by the luminosity uncertainty. Ratios such as σ(W⁺)/σ(W⁻) or σ(W)/σ(Z), and the shapes of differential distributions, are known to a few parts in 1000. These data challenge the accuracy of the theoretical calculations, and caution is required to assess whether small discrepancies are due to PDF effects, new physics, or still-imprecise QCD calculations.
Precision is the keystone to consolidate our description of nature
As already mentioned, the success of the LHC owes a lot to its variety of beam and experimental conditions. In this context, the data at the different centre-of-mass energies provided in the two runs are a huge bonus, since the theoretical prediction for the energy-dependence of rates can be used to improve the PDF extraction, or to assess possible BSM interpretations. The LHCb data, furthermore, cover a forward kinematical region complementary to that of ATLAS and CMS, adding precious information.
The precise determination of the W and Z production and decay kinematics has also allowed new measurements of fundamental parameters of the weak interaction: the W mass (mW) and the weak mixing angle (sinθW). The measurement of sinθW is now approaching the precision inherited from the LEP experiments and SLD, and will soon improve to shed light on the outstanding discrepancy between those two measurements. The mW precision obtained by the ATLAS experiment, ΔmW = 19 MeV, is the best worldwide, and further improvements are certain. The combination with the ATLAS and CMS measurements of the Higgs boson mass (ΔmH ≅ 200 MeV) and of the top quark mass (Δmtop ≲ 500 MeV), provides a strong validation of the SM predictions (see figure 3). For both mW and sinθW the limiting source of systematic uncertainty is the knowledge of the PDFs, which future data will improve, underscoring the profound interplay among the different components of the LHC programme.
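The logic behind this consistency test is the standard structure of the electroweak fit (textbook relations, recalled here only schematically; the experiments' actual fit is far more sophisticated): at tree level the W mass is tied to the Z mass and the weak mixing angle, while quantum corrections bring in the top-quark mass quadratically and the Higgs-boson mass logarithmically, so precise values of all these quantities over-constrain the SM.

```latex
% Tree-level and one-loop structure of the m_W prediction (on-shell scheme);
% \Delta r contains terms \propto G_F m_t^2 and \propto \ln(m_H/m_Z)
m_W \;=\; m_Z\cos\theta_W, \qquad
m_W^{2}\Bigl(1-\frac{m_W^{2}}{m_Z^{2}}\Bigr) \;=\; \frac{\pi\alpha}{\sqrt{2}\,G_F}\,\bigl(1+\Delta r\bigr)
```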
QCD matters
The understanding of the forms and phases that QCD matter can acquire is a fascinating, broad and theoretically challenging research topic, which has witnessed great progress in recent years. Exotic multi-quark bound states, beyond the usual mesons (qq) and baryons (qqq), were initially discovered at e⁺e⁻ colliders. The LHCb experiment, with its large rates of identified charm and bottom final states, is at the forefront of these studies, notably with the first discovery of heavy pentaquarks (qqqcc) and with discoveries of tetraquark candidates in the charm sector (qccq), accompanied by determinations of their quantum numbers and properties. These findings have opened a new playground for theoretical research, stimulating work in lattice QCD, and forcing a rethinking of established lore.
The study of QCD matter at high density is the core task of the heavy-ion programme. While this programme was initially tailored to the ALICE experiment, all active LHC experiments have since joined the effort. The creation of a quark–gluon plasma (QGP) led to astonishing visual evidence for jet quenching, with 1 TeV jets shattered into fragments as they struggle their way out of the dense QGP volume. The thermodynamics and fluctuations of the QGP have been probed in multiple ways, indicating that the QGP behaves as an almost perfect fluid, the least viscous fluid known in nature. The ability to explore the plasma interactions of charm and bottom quarks is a unique asset of the LHC, thanks to the large production rates, which unveiled new phenomena such as the recombination of charm quarks and the sequential melting of bb bound states.
While several of the qualitative features of high-density QCD were anticipated, the quantitative accuracy, multitude and range of the LHC measurements have no match. Examples include ALICE's precise determination of dynamical parameters such as the QGP shear-viscosity-to-entropy-density ratio, or the higher harmonics of particles' azimuthal correlations. A revolution ensued in the sophistication of the required theoretical modelling. Unexpected surprises were also discovered, particularly in the comparison of high-density states in PbPb collisions with those occasionally generated by smaller systems such as pp and pPb. The long-range correlations, various collective phenomena and increased strange-baryon abundance seen in the latter (figure 4) resemble behaviour typical of the QGP. Their deep origin is a mysterious property of QCD, still lacking an explanation. The number of new challenging questions raised by the LHC data is almost as large as the number of new answers obtained!
Flavour physics
Understanding the structure and the origin of flavour phenomena in the quark sector is one of the big open challenges of particle physics. The search for new sources of CP violation, beyond those present in the CKM mixing matrix, underlies the efforts to explain the baryon asymmetry of the universe. In addition to flavour studies with Higgs bosons and top quarks, more than 10¹⁴ charm and bottom quarks have been produced so far by the LHC, and the recorded subset has led to landmark discoveries and measurements. The rare Bs → μμ decay, with a minuscule rate of approximately 3 × 10⁻⁹, has been discovered by the LHCb, CMS and ATLAS experiments. The rarer Bd → μμ decay is still unobserved, but its expected rate of ~10⁻¹⁰ is within reach. These two results alone have had a big impact on constraining the parameter space of several BSM theories, notably supersymmetry, and their precision and BSM sensitivity will continue to improve. LHCb has discovered D-meson mixing and the long-elusive CP violation in D-meson decays, a first for up-type quarks (figure 5). Large hadronic non-perturbative uncertainties make the interpretation of these results particularly challenging, leaving under debate whether the measured properties are consistent with the SM or signal new physics. But the experimental findings are a textbook milestone in the worldwide flavour-physics programme.
LHCb has produced hundreds more measurements of heavy-hadron properties and flavour-mixing parameters. Examples include the most precise measurement of the CKM angle γ = (74.0 +5.0 −5.8)° and, with ATLAS and CMS, the first measurement of φs, the tiny CP-violating phase of Bs → J/ψφ, whose precisely predicted SM value is very sensitive to new physics. With a few notable exceptions, all results confirm the CKM picture of flavour phenomena. Those exceptions, however, underscore the power of LHC data to expose new, unexpected phenomena: B → D(*)ℓν (ℓ = μ, τ) and B → K(*)ℓ⁺ℓ⁻ (ℓ = e, μ) decays hint at possible deviations from the expected lepton-flavour universality. The community is eagerly awaiting further developments.
Beyond the Standard Model
Years of model building, stimulated before and after the LHC start-up by the conceptual and experimental shortcomings of the SM (e.g. the hierarchy problem and the existence of DM), have generated scores of BSM scenarios to be tested by the LHC. Evidence has so far escaped hundreds of dedicated searches, setting limits on new particles up to several TeV (figure 6). Nevertheless, much was learned. While none of the proposed BSM scenarios can be conclusively ruled out, for many of them survival is only guaranteed at the cost of greater fine-tuning of the parameters, reducing their appeal. In turn, this led to rethinking the principles that implicitly guided model building. Simplicity, or the ability to explain several open problems at once, has lost some drive. The simplest realisations of BSM models relying on supersymmetry, for example, were candidates to solve the hierarchy problem, provide DM candidates and set the stage for the grand unification of all forces, all at once. Had nature chosen such simple realisations, the LHC should have piled up evidence by now. Supersymmetry remains a preferred candidate to achieve all of that, but at the price of more Byzantine constructions. Solving the hierarchy problem remains the outstanding theoretical challenge. New ideas have come to the forefront, ranging from the Higgs potential being determined by the early-universe evolution of an axion field, to dark sectors connected to the SM via a Higgs portal. These latter scenarios could also provide DM candidates alternative to the weakly interacting massive particles, which have so far eluded searches at the LHC and elsewhere.
With such rapid evolution of theoretical ideas taking place as the LHC data runs progressed, the experimental analyses underwent a major shift, relying on “simplified models”: a novel model-independent way to represent the results of searches, allowing published results to be later reinterpreted in view of new BSM models. This amplified the impact of experimental searches, with a surge of phenomenological activity and the proliferation of new ideas. The cooperation and synergy between experiments and theorists have never been so intense.
Having explored the more obvious search channels, the LHC experiments have refocused on more elusive signatures. Great efforts are now invested in searching corners of parameter space and in extracting possible subtle signals from large backgrounds, thanks to data-driven techniques and to the more reliable theoretical modelling that has emerged from new calculations and many SM measurements. The possible existence of new long-lived particles has opened a new frontier of search techniques and of BSM models, triggering proposals for new dedicated detectors (Mathusla, CODEX-b and FASER, the last of which was recently approved for construction and operation in Run 3). Exotic BSM states, like the milli-charged particles present in some theories of dark sectors, could be revealed by MilliQan, a recently proposed detector. Highly ionising particles, like the esoteric magnetic monopoles, have been searched for by the MoEDAL detector, which cleverly places plastic tracking films in the LHCb detector hall.
While new physics is still eluding the LHC, the immense progress of the past 10 years has changed forever our perspective on searches and on BSM model building.
Final considerations
Most of the results cited only in passing, like the precision on the mass of the top quark, and others not quoted at all, are the outcome of hundreds of person-years of work, and would certainly have deserved more attention here. Their intrinsic value goes well beyond what has been outlined, and they will remain long-lasting textbook material, until future work at the LHC and beyond improves on them.
Theoretical progress has played a key role in the LHC’s progress, enhancing the scope and reliability of the data interpretation. Further to the developments already mentioned, a deeper understanding of jet structure has spawned techniques to tag high-pT gauge and Higgs bosons, or top quarks, now indispensable in many BSM searches. Innovative machine-learning ideas have become powerful and ubiquitous. This article has concentrated only on what has already been achieved, but the LHC and its experiments have a long journey of exploration ahead.
The terms precision and discovery, applied to concrete results rather than projections, well characterise the LHC 10-year legacy. Precision is the keystone to consolidate our description of nature, increase the sensitivity to SM deviations, give credibility to discovery claims, and to constrain models when evaluating different microscopic origins of possible anomalies. The LHC has already fully succeeded in these goals. The LHC has also proven to be a discovery machine, and in a context broader than just Higgs and BSM phenomena. Altogether, it delivered results that could not have been obtained otherwise, immensely enriching our understanding of nature.
Two detectors, both alike in dignity, sit 100 m underground and 8 km apart on opposite sides of the border between Switzerland and France. Different and complementary in their designs, they stand ready for anything nature might throw at them, and over the past 10 years physicists in the ATLAS and CMS collaborations have matched each other paper for paper, blazing a path into the unknown. And this is only half of the story. A few kilometres around the ring either way sit the LHCb and ALICE experiments, continually breaking new ground in the physics of flavour and colour.
Plans hatched when the ATLAS and CMS collaborations formed in the spring of 1992 began to come to fruition in the mid-2000s. While liquid-argon and tile calorimeters lit up in ATLAS's cavern, cosmic rays careened through partially assembled segments of each layer of the CMS detector, which was beginning to be integrated at the surface. "It was terrific, we were taking cosmics and everybody else was still in pieces!" says Austin Ball, who has been technical coordinator of CMS for the entire 10-year running period of the LHC so far. "The early cosmic run with magnetic field was a byproduct of our design, which stakes everything on a single extraordinary solenoid," he explains, describing how the uniquely compact and modular detector was later lowered into its cavern in enormous chunks. At the same time, the colossal ATLAS experiment was growing deep underground, soon to be enveloped by the magnetic field generated by its ambitious system of eight air-core superconducting barrel coils, two end-cap toroids and an inner solenoid. A thrilling moment for both experiments came on 10 September 2008, when protons first splashed off beam stoppers and across the detectors in a flurry of tracks. Ludovico Pontecorvo, ATLAS's technical coordinator since 2015, remembers "first beam day" as a new beginning. "It was absolutely stunning," he says. "There were hundreds of people in the control room. It was the birth of the detector." But the mood was fleeting. On 19 September a faulty electrical connection in the LHC caused a hundred or so magnets to quench, and six tonnes of liquid helium to escape into the tunnel, knocking the LHC out for more than a year.
You have this monster and suddenly it turns into this?
Werner Riegler
The experimentalists didn’t waste a moment. “We would have had a whole series of problems if we hadn’t had that extra time,” says Ball. The collaborations fixed niggling issues, installed missing detector parts and automated operations to ease pressure on the experts. “Those were great days,” agrees Richard Jacobsson, commissioning and run coordinator of the LHCb experiment from 2008 to 2015. “We ate pizza, stayed up nights and slept in the car. In the end I installed a control monitor at home, visible from the kitchen, the living room and the dining room, with four screens – a convenient way to avoid going to the pit every time there was a problem!” The hard work paid off as the detectors came to life once again. For ALICE, the iconic moment was the first low-energy collisions in December 2009. “We were installing the detector for 10 years, and then suddenly you see these tracks on the event display…” reminisces Werner Riegler, longtime technical coordinator for the collaboration. “I bet then-spokesperson Jürgen Schukraft three bottles of Talisker whisky that they couldn’t possibly be real. You have this monster and suddenly it turns into this? Everybody was cheering. I lost the bet.”
The first high-energy collisions took place on 30 March 2010, at a centre-of-mass energy of 7 TeV, three-and-a-half times higher than the Tevatron, and a leap into terra incognita, in the words of ATLAS’s Pontecorvo. The next signal moment came on 8 November with the first heavy-ion collisions, and almost immediate insights into the quark–gluon plasma.
ALICE in wonderland
For a few weeks each year, the LHC ditches its signature proton collisions at the energy frontier to collide heavy ions such as lead nuclei, creating globules of quark–gluon plasma in the heart of the detectors. For the past 10 years, ALICE has been the best-equipped detector in the world to record the myriad tracks that spring from these hot and dense collisions of up to 416 nucleons at a time.
Like LHCb, ALICE is installed in a cavern that previously housed a LEP detector – in ALICE's case the L3 experiment. Its tracking and particle-identification subdetectors are mostly housed within that detector's magnet, fixed in place and still going strong since 1989, the only worry being a milliamp-level leakage current, present since L3 days, which shifters monitor watchfully. Its relatively low field is not a limitation, as ALICE's specialist subject is low-momentum tracks – a specialty made possible by displacing the beams at the interaction point to suppress the luminosity. "The fact that we have a much lower radiation load than ATLAS, CMS and LHCb allows us to use technologies that are very good for low-momentum measurements, which the other experiments cannot use because their radiation-hardness requirements are much higher," says Riegler, noting that the design of ALICE requires less power, less cooling and a lower material budget. "This also presents an additional challenge in data processing and analysis in terms of reconstructing all these low-momentum particles, whereas for the other experiments, this is background that you can cut away." The star performer in ALICE has been the time-projection chamber (TPC), he says, describing a detector capable of reconstructing the 8000 tracks per unit of rapidity that were forecast when the detector was designed.
But nature had a surprise in store when the LHC began running with heavy ions. The number of tracks produced was a factor of three lower than expected, allowing ALICE to push the TPC to higher rates and collect more data. By the end of Run 2, a detector designed to collect "minimum-bias" events at 50 Hz was able to operate at 1 kHz – a factor of 20 above the initial design.
The discovery of jet quenching came simply by looking at event displays in the control room
Ludovico Pontecorvo
The lower-than-expected track multiplicities also had a wider effect among the LHC experiments, making ATLAS, CMS and LHCb highly competitive for certain heavy-ion measurements, and creating a dynamic atmosphere in which insights into the quark–gluon plasma came thick and fast. Even independently of the less-taxing-than-expected tracking requirements, top-notch calorimetry allowed immediate insights. “The discovery of jet quenching came simply by looking at event displays in the control room,” confirms Pontecorvo of ATLAS. “You would see a big jet that wasn’t counterbalanced on the other side of the detector. This excitement was transmitted across the world.”
Keeping cool
Despite the exceptional and expectation-busting performance of the experiments, the first few years were testing times for the physicists and engineers tasked with keeping the detectors in rude health. "Every year we had some crisis in cooling the calorimeters," recalls Pontecorvo. Fortunately, he says, ATLAS opted for "under-pressure" cooling, which prevents water from spilling in the event of a leak, though a leak still requires a big chunk of the calorimeter to be switched off. The collaboration had to carry out spectacular interventions, putting people in places that no one would have guessed possible, he says. "I remember crawling five metres on top of the end-cap calorimeter to arrive at the barrel calorimeter to search for a leak, and using 24 clamps to find which one of 12 cooling loops had the problem – a very awkward situation!" Ball recalls experiencing similar difficulties with CMS. There are 11,000 joints in the copper circuits of the CMS cooling system, and a leak in any one is enough to cause a serious problem. "The first we encountered leaked into the high-voltage system of the muon chambers, down into the vacuum tank containing the solenoid, right through the detector, which like the LHC itself is on a slope, and out the end as a small waterfall," says Ball.
The arresting modularity of CMS, and the relative ease of opening the detector – admittedly an odd way to describe sliding a 1500-tonne object along the axis of a 0.8 mm thick beam pipe – proved to be the solution to many problems. “We have exploited it relentlessly from day one,” says Ball. “The ability to access the pixel tracker, which is really the heart of CMS, with the highest density of sensitive channels, was absolutely vital – crucial for repairing faults as well as radiation damage. Over the course of five or six years we became very efficient at accessing it. The performance of the whole silicon tracking system has been outstanding.”
The early days were also challenging for LHCb, which is set up to reconstruct the decays of beauty hadrons in detail. The dawning realisation that the LHC would run optimally with fewer but brighter proton bunches than originally envisaged set stern tests from the start. From LHCb’s conception to first running, all of the collaboration’s discussions were based on the assumption that the detector would veto any crossing of protons where there would be more than one interaction. In the end, faced with a typical “pile-up” of three, the collaboration had to reschedule its physics priorities and make pragmatic decisions about the division of bandwidth in the high-level trigger. “We were faced with enormous problems: synchronisation crashes, event processing that was taking seconds and getting stuck…,” recalls Jacobsson. “Some run numbers, such as 1179, still send shivers down the back of my spine.” By September, however, they had demonstrated that LHCb was capable of running with much higher pile-up than anybody had thought possible.
No machine has ever been so stable in its operational mode
Rolf Lindner
Necessity was the mother of invention. In 2011 and 2012 LHCb introduced a feedback system that maintains a manageable luminosity during each fill by increasing the overlap between the colliding beams as protons “burn out” in collisions, and the brightness of the bunches decreases. When Jacobsson and his colleagues mentioned it to the CERN management in September 2010, the then director of accelerators, Steve Myers, read the riot act, warning of risks to beam stability, recalls Jacobsson. “But since I had a few good friends at the controls of the LHC, we could carefully and quietly test this, and show that it produced stable beams. This changed life on LHCb completely. The effect was that we would have one stable condition throughout every fill for the whole year – perfect for precision physics.”
Initially, LHCb had planned to write events at 200 Hz, recalls Rolf Lindner, the experiment's longtime technical coordinator, but by the end of Run 1 it was collecting data at up to 10 kHz, turning offline storage, processing and "physics stripping" into an endless firefight. Squeezing every ounce of performance out of the LHC generated greater data volumes than any of the experiments had anticipated, and even prompted stories (probably apocryphal) of shifters running down to local electronics stores to buy data discs because they were running out of storage. "The LHC would run for several months with stable beams for 60% of every 24 hours in a day," says Lindner. "No machine has ever been so stable in its operational mode."
Engineering all-stars
The eyes of the world turned to ATLAS and CMS on 4 July 2012 as the collaborations announced the discovery of a new boson – an iconic moment validating countless hours of painstaking work by innumerable physicists, engineers and computer scientists, and one that is nevertheless representative of just one of a multitude of physics insights made possible by the LHC experiments (see LHC at 10: the physics legacy). The period running up to the euphoric Higgs discovery had been smooth for all except LHCb, which had to scramble to disprove unfounded suggestions that its dipole magnet, occasionally reversed in field to reduce systematic uncertainties, was causing beam instabilities. But new challenges would shortly follow. Chief among several hair-raising moments in CMS was the pollution of the magnet cryogenic system in 2015 and 2016, which caused instability in the detector's cold box and threatened the reliable operation of the superconducting solenoid surrounding the tracker and calorimeters. The culprit turned out to be superfluous lubricant – a mere half a litre of oil, now in a bottle in Ball's office – which clogged filters and the tiny orifices crucial to the expansion cycle used to cool the helium. "By the time we caught on to it, we hadn't just polluted the cold box, we had polluted the whole of the distribution from upstairs to downstairs," he recalls, launching into a vivid account of seat-of-the-pants interventions, and also noting that the team turned their predicament into an opportunity. "With characteristic physics ingenuity, and faced with spoof versions of the CMS logo with straightened tracks, we exploited data with the magnet off to calibrate the calorimeters and understand a puzzling 750 GeV excess in the diphoton invariant mass distribution," he says.
Now I look back on the cryogenic crisis as the best project I ever worked on at CERN
With resolute support from CERN, bold steps were taken to fix the problem. It transpired that slightly undersized replaceable filter cartridges were failing to remove the oil after it was mixed with the helium to lubricate the screw-turbine compressors in the surface installation. "Now I look back on the cryogenic crisis as the best project I ever worked on at CERN, because we were allowed to assemble this cross-departmental superstar engineering team," says Ball. "You could ask for anyone and get them. Cryogenics experts, chemists and mechanical engineers… even Rolf Heuer, then the Director-General, showed up frequently. The best welders basically lived in our underground area – you could normally only see their feet sticking out from massive pipework. If you looked carefully you might spot a boot. It's a complete labyrinth. That one will stick with me for a long time. A crisis can be memorable and satisfying if you solve it."
Heroic efforts
During the long shutdown that followed, the main task for LHCb was to exchange a section of beryllium beam pipe in which holes had been discovered and hastily varnished over before being used in Run 1. At the same time, right at the end of an ambitious and successful consolidation and improvement programme, CMS suffered the perils of extraordinarily dense circuit design when humid air condensed onto cold silicon sensor modules that had temporarily been moved to a surface clean room. Some 10% of the pixels short-circuited when the detector was powered up again, and heroic efforts were needed to re-manufacture replacements and install them in time for the returning LHC beams. Meanwhile, wary of deteriorating optical readout, ATLAS refurbished its pixel-detector cabling, taking electronics out of the detector to make it serviceable, and inserted a further inner pixel layer just 33 mm from the beam pipe to up its b-tagging game. The bigger problem was mechanical shearing of the bellows that connect the cryostat of one of the end-cap toroids to the vacuum system – the only problem experienced so far with ATLAS's ambitious magnet system. "At the beginning people speculated that with eight superconducting coils, each independent from the others, we would experience one quench after another, but they have been perfect really," confirms Pontecorvo. Combined with the 50-micron alignment of the 45 m-long muon detector, ATLAS has exceeded its design specifications for resolving the momentum of high-momentum muons – just one example of a pattern repeated across all the LHC detectors.
As the decade wore on, the experiments streamlined operations to reach unparalleled performance levels, and took full advantage of technical and end-of-year stops to keep their detectors healthy. Despite their very high-luminosity environments, ATLAS and CMS pushed already world-beating initial data-taking efficiencies of around 90% beyond the 95% mark. “ATLAS and CMS were designed to run with an average pile-up of 20, but are now running with a pile-up of 60. This is remarkable,” states Pontecorvo.
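Pontecorvo’s pile-up figure can be checked with a standard back-of-the-envelope relation; the inelastic cross-section (about 80 mb at 13 TeV) and the number of colliding bunch pairs (about 2500) used below are typical illustrative values rather than numbers quoted here:

\mu \;=\; \frac{\mathcal{L}\,\sigma_{\rm inel}}{n_b\,f_{\rm rev}} \;\approx\; \frac{(2\times10^{34}\ \mathrm{cm^{-2}s^{-1}})\,(8\times10^{-26}\ \mathrm{cm^{2}})}{2500\times 11\,245\ \mathrm{Hz}} \;\approx\; 57,

i.e. roughly 60 inelastic proton–proton interactions per bunch crossing at a peak luminosity of 2 × 10³⁴ cm⁻²s⁻¹ – consistent with the pile-up the experiments now routinely handle.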
Accelerator rising
At 10, with thousands of physics papers behind them and many more stories to tell, the LHC experiments are as busy as ever, using the second long shutdown, which is currently underway, to install upgrades, many of which are geared to the high-luminosity LHC (HL-LHC) due to operate later this decade. Many parts are being recycled, for example with ALICE’s top-performing TPC chambers donated to Fermilab for the near detector of the DUNE long-baseline neutrino-oscillation experiment. And major engineering challenges remain. A vivid example is that the LHC tunnel, carved out of water-laden rock 30 years ago, is rising up, while the experiments – particularly the very compact CMS, which has a density almost the same as rock – remain fixed in place, counterbalancing upthrust due to the removed rock with their weight. CMS faces the greatest challenge due to the geology of the region, explains Ball. “The LHC can use a corrector magnet to adjust the level of the beam, but there is a risk of running out of magnetic power if the shifts are big. Just a few weeks ago they connected a parallel underground structure for HL-LHC equipment, and the whole tunnel went up 3 mm almost overnight. We haven’t solved that one yet.”
Most of all, it is important to acknowledge the dedication of the people who run the experiments
Ludovico Pontecorvo
Everyone I interviewed agrees wholeheartedly on one crucial point. “Most of all, it is important to acknowledge the dedication of the people who run the experiments,” explains Pontecorvo of ATLAS, expressing a sentiment emphasised by his peers on all the experiments. “These people are absolutely stunning. They devote their life to this work. This is something that we have to keep and which it is not easy to keep. Unfortunately, many feel that this work is undervalued by selection committees for academic positions. This is something that must change, or our work will finish – as simple as that.”
Pontecorvo hurries out of the door at the end of our early-morning interview, hastily squeezed into a punishing schedule. None of the physicists I interviewed show even a smidgen of complacency. Ten years in, the engineering and technological marvels that are the four biggest LHC experiments are just getting started.
The start-up of the LHC was an exciting time and the culmination of years of work, made manifest in the process of establishing circulating beams, ramping, squeezing and producing the first collisions. The two major events of the commissioning era were first circulating beams on 10 September 2008 and first high-energy collisions on 30 March 2010. For both of these events the CERN press office saw fit to invite the world’s media, set up satellite links, arrange numerous interviews and such. Combined with the background attention engendered by the LHC’s potential to produce miniature black holes and the LHC’s supporting role in the 2009 film Angels and Demons, the LHC enjoyed a huge amount of coverage, and in some sense became a global brand in the process (CERN Courier September 2018 p44).
The LHC is one of the biggest, most complex and powerful instruments ever built. The large-scale deployment of the main two-in-one dipoles and quadrupoles cooled to 1.9 K by superfluid helium is unprecedented even in particle physics. Many unforeseen issues had to be dealt with in the period before start-up. A well-known example was that of the “collapsing fingers”. In the summer of 2007, experts realised that the metallic modules responsible for the electrical continuity between different vacuum pipe sections in the magnet interconnects could occasionally become distorted as the machine was warmed up. This distortion led to a physical obstruction of the beam pipe. The solution was surprisingly low-tech: to blow a ping-pong-ball-sized sphere fitted with a 40 MHz transmitter through the pipes and find out where it got stuck.
The commissioning effort was clearly punctuated by the electrical incident that occurred during high-current tests on 19 September 2008, just nine days after the success of “first beam day”. Although the incident was a severe blow to CERN and the LHC community, it did provide a hiatus of which full use was made (see A labour of love). The LHC and experiments returned at “an unprecedented state of readiness” and beam was circulated again on 20 November 2009. Rapid progress followed. Collisions with stable beam conditions were quickly established at 450 GeV, and a ramp to the maximum beam energy at the time (1.18 TeV, compared to the Tevatron’s 0.98 TeV) was successfully achieved on 30 November. All beam-based systems were at least partially commissioned and LHC operators managed to start to master the control of a hugely complex machine.
After the 2009 Christmas technical stop, which saw continued deployment of the upgraded quench-protection system that had been put in place following the 2008 incident, commissioning started again in the new year. Progress was rapid, with first colliding beams at 3.5 TeV being established on 30 March 2010. It was a tense day in the control room with the scheduled collisions delayed by two unsuccessful ramps and all under the watchful eye of the media. In the following days, squeeze-commissioning successfully reduced the β* parameter (which is related to the transverse size of the beam at the interaction points) to 2.0 m in ATLAS and CMS. Stable beams were declared, and the high-energy exploitation of the four main LHC experiments could begin in earnest.
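For readers unfamiliar with the jargon, the squeeze works because the transverse beam size at the interaction point follows the standard optics relation (a textbook expression, not one given in this account)

\sigma^* \;=\; \sqrt{\varepsilon\,\beta^*} \;=\; \sqrt{\frac{\varepsilon_N\,\beta^*}{\gamma_r}},

where ε is the geometric transverse emittance, ε_N the normalised emittance delivered by the injectors and γ_r the relativistic factor of the protons. Reducing β* therefore shrinks the beams where they collide and, all else being equal, raises the luminosity in inverse proportion – which is why the successive reductions of β* reported in the following paragraphs track the LHC’s luminosity gains.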
Tales from Run 1
Essentially, 2010 was devoted to commissioning and then to establishing confidence in operational procedures and the machine protection system, before starting the process of ramping up the number of bunches in the beam.
In June the decision was taken to go for bunches with nominal population (1.15 × 10¹¹ protons), which involved another extended commissioning period. Up to this point, only around one fifth of the nominal bunch population had been used. To further increase the number of bunches, the move to bunch trains separated by 150 ns was made and the crossing angles spanning the experiments’ insertion regions were brought in. This necessitated changes to the tertiary collimators and a number of ramps and squeezes. We then performed a carefully phased increase in total intensity. The proton run finished with beams of 368 bunches of around 1.2 × 10¹¹ protons per bunch, and a peak luminosity of 2.1 × 10³² cm⁻²s⁻¹, followed by a successful four-week-long lead–lead ion run.
The initial 50 and 25 ns intensity ramp-up phase was tough going
In 2011 it was decided to keep the LHC beam energy at 3.5 TeV, and to operate with 50 ns bunch spacing – opening the way to significantly more bunches per beam. Following several weeks of commissioning, a staged ramp-up in the number of bunches took us to a maximum of 1380 bunches. Reducing the transverse size of the beams delivered by the injectors and gently increasing the bunch population resulted in a peak luminosity of 2.4 × 10³³ cm⁻²s⁻¹ and some healthy luminosity-delivery rates. Following a reduction in β* in ATLAS and CMS from 1.5 m to 1.0 m, and further gradual increases in bunch population, the LHC achieved a peak luminosity of 3.8 × 10³³ cm⁻²s⁻¹ – well beyond expectations at the start of the year – and delivered a total of around 5.6 fb⁻¹ to both ATLAS and CMS.
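The peak luminosities quoted throughout this account follow from a textbook expression for two identical, round Gaussian beams (an approximation given here for orientation, not a formula from the original text):

\mathcal{L} \;=\; \frac{n_b\,N^2\,f_{\rm rev}\,\gamma_r}{4\pi\,\varepsilon_N\,\beta^*}\,F,

where n_b is the number of colliding bunch pairs, N the number of protons per bunch, f_rev ≈ 11 245 Hz the LHC revolution frequency and F ≤ 1 a geometric reduction factor due to the crossing angle. It makes explicit where the year-on-year gains described here came from: more bunches, brighter bunches (higher N and lower ε_N) and a smaller β*.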
2012 was a production year at an increased beam energy of 4 TeV, with 50 ns bunch spacing and 1380 bunches. A decision to operate with tighter collimator settings allowed a more aggressive squeeze to a β* of 0.6 m, and the peak luminosity was quickly close to its maximum for the year, followed by determined and long-running attempts to improve peak performance. Beam instabilities, although never debilitating, were a recurring problem and there were phases when they cut into operational efficiency. By the middle of the year another 6 fb⁻¹ had been delivered to both ATLAS and CMS. Combined with the 2011 dataset, this paved the way for the announcement of the Higgs discovery on 4 July 2012. It was a very long operational year and included the extension of the proton–proton run until December, resulting in the shift of a four-week-long proton–lead run to 2013. Integrated-luminosity rates were healthy at around 1 fb⁻¹ per week, and this allowed a total for the year of about 23 fb⁻¹ to be delivered to both ATLAS and CMS.
Five phrases LHC operators learned to love
Single-event upsets
Caused by beam-induced radiation affecting electronics in the tunnel, these were a serious cause of inefficiency in the LHC’s early days. However, the problem had been foreseen and its impact was considerably reduced following a sustained programme of mitigation measures – including shielding campaigns prior to the 2011 run.
Unidentified falling objects
Microscopic particles of the order of 10 microns across, which fall from the top of the vacuum chamber or beam screen, become ionised by collisions with circulating protons and are then repelled by the positively charged beam. While interacting with the circulating protons they generate localised beam loss, which may be sufficient to dump the beam or, in the limit, cause a quench. During the first half of 2015 they were a serious issue, but happily they have subsequently conditioned down in frequency.
Beam-induced heating
This is where regions of the LHC near the beam become too warm, and has been a long-running issue. Essentially, all cases have been local and, in some way, due to non-conformities either in design or installation. Design problems have affected the injection protection devices and the mirror assemblies of the synchrotron radiation telescopes, while installation problems have occurred in a low number of vacuum assemblies. These issues have all been addressed and are not expected to be a problem in the long term.
Beam instabilities
This was an interesting problem that occasionally dogged operations. Operation with 25 ns bunch spacing and a lower bunch population meant that instabilities should intrinsically have been less of an issue. However, a high electron cloud (see “Electron cloud effects”) also proved to be a driver, and defence mechanisms were deployed in the form of high chromaticity, high octupole field strength and the all-important transverse damper system.
Electron cloud effects
These result from an avalanche-like process in which electrons from gas ionisation or photo-emission are accelerated in the electromagnetic field of the beam and hit the beam-chamber walls with energies of a few hundred eV, producing more electrons. This can lead to beam oscillations and blow-up of the proton bunches. “Scrubbing”, the deliberate invocation of high electron cloud with beam, provides a way to reduce or suppress subsequent electron cloud build-up. Extensive scrubbing was needed for 25 ns running. Conditioning thereafter has been slow and the heat load from the electron cloud on the cryogenics system remained a limitation in 2018.
To Run 2 and beyond
In early 2015 the LHC emerged from “long-shutdown one”. The aims were to re-commission the machine without beam following major consolidation and upgrades, and from a beam perspective to safely establish operations at 6.5 TeV with 25 ns bunch spacing and around 2800 bunches. This was anticipated to be more of a challenge than previous operations at 4 TeV with 50 ns beams. Increased energy implies lower quench margins and thus lower tolerance to beam loss, with hardware pushed closer to its maximum and potential knock-on effects for availability. A 25 ns beam was anticipated to suffer significantly stronger electron-cloud effects (see “Five phrases LHC operators learned to love” box) than those experienced with 50 ns beams; in addition, there was a higher total beam current and a higher intensity per injection. All of these factors came into play to make 2015 a challenging year.
The initial 50 and 25 ns intensity ramp-up phase was tough going and had to contend with a number of issues, including earth faults, unidentified falling objects, an unidentified aperture restriction in a main dipole, and radiation affecting specific electronic components in the tunnel. Nonetheless, the LHC was able to operate with up to 460 bunches and deliver some luminosity to the experiments, albeit with poor efficiency. The second phase of the ramp-up, following a technical stop at the start of September, was dominated by the electron–cloud-generated heat load and the subsequent challenge for the cryogenics, which had to wrestle with transients and operation close to their cooling power limits. The ramp-up in number of bunches was consequently slow but steady, culminating in the final figure for the year of 2244 bunches per beam. Importantly, the electron cloud generated during physics operations at 6.5 TeV served to slowly condition the surface of the beam screens in the cold sectors and so reduce the heat load at a given intensity. As time passed, this effect opened a margin for the use of more bunches.
The overall machine availability remained respectable with around 32% of the scheduled time spent in “stable beams” mode during the final period of proton–proton physics from September to November. By the end of the 2015 proton run, 2244 bunches per beam were giving peak luminosities of 5.5 × 10³³ cm⁻²s⁻¹ in the high-luminosity experiments, with a total integrated luminosity of around 4 fb⁻¹ delivered to both ATLAS and CMS. Levelled luminosities of 3 × 10³² cm⁻²s⁻¹ in LHCb and 5 × 10³⁰ cm⁻²s⁻¹ in ALICE were provided throughout the run.
A luminous future
Following an interesting year, 2016 was the first full year of exploitation at 6.5 TeV. The beam size at the interaction point was further reduced (β* = 0.4 m) and the LHC design luminosity of 10³⁴ cm⁻²s⁻¹ was achieved. Reasonable machine availability allowed a total of 40 fb⁻¹ to be delivered to both ATLAS and CMS. 2017 saw a further reduction in beam size at the interaction point (β* = 0.3 m), which, together with small beams from the injectors, gave a peak luminosity of 2.2 × 10³⁴ cm⁻²s⁻¹. Despite the effects of an accidental ingress of air into the beam vacuum during the winter technical stop, around 50 fb⁻¹ was delivered to ATLAS and CMS.
Not only can a 27 km superconducting collider work, it can work well!
2018 essentially followed the set-up of 2017 with a squeeze to β* = 0.3 m in ATLAS and CMS. The effects of the air ingress lingered on, limiting the maximum bunch intensity to approximately 1.2 × 10¹¹ protons per bunch. Despite this, the peak luminosity was systematically close to 2 × 10³⁴ cm⁻²s⁻¹ and around 63 fb⁻¹ was delivered to ATLAS and CMS. Somewhat more integrated luminosity was possible thanks to the novel luminosity-levelling strategy pursued. This involved continuous adjustment of the crossing angle in stable beams, and for the first time the LHC dynamically changed the optics in stable-beams mode, with β* reduced from 0.30 to 0.27 to 0.25 m while colliding. The year finished with a very successful lead–ion run, helped by the impressive ion delivery from the injectors. In December 2018 the machine entered long-shutdown two, recovery from which is scheduled for 2021.
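The crossing-angle and β* levelling described above works on the geometric factor in the luminosity formula, which for round beams is commonly written as (again a standard approximation, not taken from this account)

F \;=\; \left[\,1+\left(\frac{\theta_c\,\sigma_z}{2\,\sigma^*}\right)^{2}\right]^{-1/2},

with θ_c the full crossing angle, σ_z the bunch length and σ* the transverse beam size at the interaction point. As the bunch intensity decays during a fill, the long-range beam–beam effects that set the minimum crossing angle weaken, so θ_c (and later β*) can be reduced, pushing F back towards unity and partially compensating the falling intensity.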
It is nearly 12 years since first beam, and 10 since first high-energy operations at the LHC. The experience has shown that, remarkably, not only can a 27 km superconducting collider work, it can work well! This has come on the back of some excellent hardware-system performance, impressive availability, high beam quality from the injectors and some fundamental operational characteristics of the LHC. Thanks to the work of many, many people over the years, the LHC is now well understood and continues to push our understanding of how to operate high-energy hadron colliders and to surpass expectations. Today, as plans for Run 3 take shape and work advances on the challenging magnets needed for the high-luminosity LHC upgrade, things promise to remain interesting.
Alvin Tollestrup, who passed away on 9 February at the age of 95, was a visionary. When I joined his group at Caltech in the summer of 1960, experiments in particle physics at universities were performed at accelerators located on campus. Alvin had helped build Caltech’s electron synchrotron, the highest energy photon-producing accelerator at the time. But he thought more exciting physics could be performed elsewhere, and managed to get approval to run an experiment at Berkeley Lab’s Bevatron to measure a rare decay mode of the K+ meson. This was the first time an outsider was allowed to access Berkeley’s machine, much to the consternation of Luis Alvarez and other university faculty.
When I joined Alvin’s group he asked a postdoc, Ricardo Gomez, and me to design, build and test a new type of particle detector called a spark chamber. He gave us a paper by two Japanese authors on “A new type of particle detector: the discharge chamber”, not what he wanted, but a place to start. In retrospect it was remarkable that Alvin was willing to risk the success of his experiment on the creation of new technology. Alvin also asked me to design a transport system of magnetic lenses that would capture as many K mesons as possible at the “thin window” of the accelerator and guide them to our “hut” on the accelerator floor where K decays would be observed. I did my calculations on an IBM 709 at UCLA — Alvin checked them by tracing rays at his drafting table. When the beam design was completed and the chain of magnets was in place on the accelerator floor, Alvin threaded a single wire through them from the thin window to our hut.
I had no idea what he was doing, or why. Around Alvin the Zen master, I didn’t say much or ask many questions. When he turned the magnets on and ran current through the wire, it snapped to attention, tracing the path a K would follow from where it left the accelerator to where its decays would be observed. The wire floated through the magnet centres far from their walls, tracing an unobstructed path. Calculation – working out how much current was required in the wire – followed by testing was Alvin’s modus operandi.
A couple of months later in 1962, run-time arrived. All the equipment for the experiment was built and tested over a two-year period at Caltech, shipped in a moving van to the Bevatron, and assembled in our hut. We had 21 half-days to make our measurements. The proton beam inside the accelerator was steered into a tungsten target behind the thin window through which the Ks would pass. Inside the hut we waited for the scintillation counters to start clicking wildly, but there was hardly a click. In complete silence, Alvin set out to find what happened to the beam, slowly moving a scintillation counter from one magnet to the next until he reached the thin window. Finding that hardly any Ks were coming through it, Alvin asked the operator in the control room to shut the machine down and remove the thin window to expose the target — an unprecedented request that meant losing the vacuum the proton beam required. There was a long silence while the operator mentally processed the request. Several phone calls later the operator complied. With a pair of long tongs Alvin pressed a small square of dental film against the radioactive target. When developed it showed a faintly illuminated edge at the top of the target. The Bevatron surveyors had placed the target one inch below its proper position, a big mistake. But there was no panic or finger pointing, just measurement and appropriate action. That was Alvin’s style, always diplomatic with management, never asking for something without sufficient reason, and persistent. Unfortunately, we were unfairly charged a full day of running time, which Alvin chose not to contest. Not everyone at UC Berkeley was happy with outside users coming in to use “their machine,” and Alvin did not want to antagonize them.
Without his influence, I never would have discovered quarks (aces), whose existence was later definitively confirmed in deep inelastic scattering experiments.
Alvin was my first thesis advisor. When he taught me how to think about my measurements, he also taught me how to analyze and judge the measurements of others. This was essential in understanding which of the many “discoveries” of hadrons in the early 1960s were believable. Without his influence, I never would have discovered quarks (aces), whose existence was later definitively confirmed in deep inelastic scattering experiments.
Fermilab years
More than a dozen years later, true to his belief that users of accelerators should improve them, Alvin left Caltech for Fermilab, where he would create the first large-scale application of superconductivity. Physics at Fermilab at that time was limited by the energy of the protons it produced: 200 GeV, which was the design energy of the laboratory’s 6.3 km-circumference Main Ring. If superconducting magnets could be built, the Main Ring’s copper magnets could be replaced, energy costs could be significantly reduced, and the energy of protons could be doubled. Furthermore, protons and antiprotons could eventually be accelerated in the same ring, traveling in opposite directions, colliding at nodes around the ring where experiments could be performed. All this without digging a new tunnel.
I went to visit Alvin shortly after he arrived at Fermilab and found him at a drafting table once more tracing rays, this time through superconducting magnets. Looking up he told me of the magnetostrictive forces trying to tear each magnet apart, and the enormous energy stored within each one (as much energy as a one-tonne vehicle traveling at more than 100 km h⁻¹) all within a bath of liquid helium bombarded by stray high-energy protons. If a superconducting magnet “quenched” and returned to its normal state, this energy would suddenly be released and serious damage would occur. There was also the possibility of a domino effect, one magnet quenching after another.
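The comparison is easy to verify with a back-of-the-envelope kinetic-energy estimate (the numbers below are illustrative, not taken from the text):

\tfrac{1}{2} m v^2 \;=\; \tfrac{1}{2}\times 1000\ \mathrm{kg}\times\left(\frac{100}{3.6}\ \mathrm{m\,s^{-1}}\right)^{2} \;\approx\; 0.4\ \mathrm{MJ},

so each magnet stores of order half a megajoule. Released suddenly into a bath of liquid helium during a quench, that energy is more than enough to damage a coil, which is why quenches must be detected quickly and the stored energy extracted or spread safely.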
With a number of ingenious inventions, always experimenting but only making one change at a time, and combining the understanding that comes from physics with the practicalities necessary for engineering, Alvin made essential contributions to the design, testing and commissioning of the superconducting magnets. When the “energy doubler”, henceforth the Tevatron, was completed in 1983, Alvin worked on converting it to a proton-antiproton collider. The collider began operation in 1987, and Alvin was the primary spokesperson for the CDF experimental collaboration from 1980 to 1992. The Tevatron was the world’s most powerful particle collider for 25 years until the LHC came along. The top quark and the tau neutrino were both discovered there. Alvin’s critical contributions to the design, construction and initial operation of the Tevatron were recognised in 1989 with a US National Medal of Technology and Innovation.
Deserved recognition
Designing robust superconducting magnets that could be mass produced was extremely difficult. Physicists at Brookhaven working on their next-generation accelerator – Isabelle – failed, even though they received substantially more government support and funding. And, ten days after the LHC was first switched on in 2008, an electrical fault in a connection between adjacent magnets caused a massive magnet quench and significant damage, which closed the accelerator for more than a year.
The virtuosity required to create new accelerators sometimes exceeds what is necessary to run the resulting prizewinning experiments.
Alvin once told me that the Bevatron’s director, Ed Lofgren, never got the recognition he deserved. The Bevatron was designed and built to find the antiproton, and sure enough Segre and Chamberlain found it as soon as the Bevatron was turned on. They were recognised for their discovery with a Nobel Prize, but the work Lofgren did to create the machine for them was of a higher order than that required to run their experiment. Alvin also didn’t get the recognition he deserved. His modesty only exacerbated the problem. The virtuosity required to create new accelerators sometimes exceeds what is necessary to run the resulting prizewinning experiments.
Alvin remained a visionary all his life. For many years Richard Feynman kept a question carefully written in the upper left-hand corner of his blackboard: “Why does the muon weigh?” To help answer this question, and create a new frontier in high-energy physics, Alvin began work on a muon collider in the early 1990s, and interest in the collider has increased ever since.
There were things that I was never able to learn from Alvin. His intuition for electronics was beyond my grasp, a gift from the gods. That intuition helped him make one of the most important measurements of the 1950s. Parity violation had been discovered, but how was it violated? There were competing theories, championed by giants. The V−A theory predicted the existence of the decay π⁻ → e⁻ν̄, but this decay was not seen in two independent experiments, by Jack Steinberger in 1955 and Herb Anderson in 1957. As testimony to the difficulty of this measurement, both Steinberger and Anderson were outstanding experimentalists, students of Fermi. Steinberger later shared the Nobel Prize for demonstrating that the electron and muon each have their own neutrinos. Alvin, with his knowledge of how photomultipliers worked, discovered a flaw in one of the experiments, and with collaborators at CERN went on to find the decay at the predicted rate, validating the V−A theory of the weak interactions.
Alvin did not suffer fools gladly, but outside of work he created a community of collaborators, an extended family. He fed and entertained us. His pitchers of martinis and platters of whole hams are memorable. When I was a child, my parents took me to a traveling circus where we saw a tight-rope performer, Karl Wallenda, who had an incredible high-wire act. Wallenda is quoted as saying, “Life is on the wire. The rest is waiting.” Alvin showed us how to have fun while waiting, and shared a long and phenomenal life with us, both off, and especially on, the high wire.
Particle physicists have long coveted the advantages of a muon collider, which could offer the precision of a LEP-style electron–positron collider without the energy limitations imposed by synchrotron-radiation losses. The clean neutrino beams that could be produced by bright and well-controlled muon beams could also drive a neutrino factory. In a step towards demonstrating the technical feasibility of such machines, the Muon Ionisation Cooling Experiment (MICE) collaboration has published results showing that muon beams can be “cooled” in phase space.
“Muon colliders can in principle reach very high centre-of-mass energies and luminosities, allowing unprecedented direct searches of new heavy particles and high-precision tests of standard phenomena,” says accelerator physicist Lenny Rivkin of the Paul Scherrer Institute in Switzerland, who was not involved in the work. “Production of bright beams of muons is crucial for the feasibility of these colliders and MICE has delivered a detailed characterisation of the ionisation-cooling process – one of the proposed methods to achieve such muon beams. Additional R&D is required to demonstrate the feasibility of such colliders.”
MICE has delivered a detailed characterisation of the ionisation-cooling process
Lenny Rivkin
The potential benefits of a muon collider come at a price, as muons are unstable and much harder to produce than electrons. This imposes major technical challenges and, not least, a 2.2 µs stopwatch on accelerator physicists, who must accelerate the muons to relativistic energies, where time dilation extends their laboratory-frame lifetime, before they decay. MICE has demonstrated the essence of a technique called ionisation cooling, which squeezes the watermelon-sized muon bunches created by smashing protons into targets into a form that can be fed into the accelerating structures of a neutrino factory, or into the more advanced subsequent cooling stages required for a muon collider – all on a time frame short compared to the muon lifetime.
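The 2.2 µs figure is the muon lifetime at rest; in the laboratory it is stretched by the Lorentz factor. For an illustrative beam momentum of 200 MeV/c, in the range typically quoted for MICE beams but not stated here,

\tau_{\rm lab} \;=\; \gamma\,\tau_\mu \;=\; \frac{\sqrt{p^2c^2+m_\mu^2c^4}}{m_\mu c^2}\;\tau_\mu \;\approx\; 2.1\times 2.2\ \mu\mathrm{s} \;\approx\; 4.7\ \mu\mathrm{s},

so production, cooling and capture into an accelerating structure all have to happen on a timescale of microseconds.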
An alternative path to a muon collider or neutrino factory is the recently proposed Low Emittance Muon Accelerator (LEMMA) scheme, whereby a naturally cool muon beam would be obtained by capturing muon–antimuon pairs created in electron–positron annihilations.
Playing it cool
Based at Rutherford Appleton Laboratory (RAL) in the UK, and two decades in the making, MICE set out to reduce the spatial extent, or more precisely the otherwise approximately conserved phase-space volume, of a muon beam by passing it through a low-Z material while tightly focused, and then restoring the lost longitudinal momentum in such a way that the beam remains bunched and matched. This is only possible in low-Z materials where multiple scattering is small compared to energy loss via ionisation. The few-metre long MICE facility, which precisely measured the phase-space coordinates of individual muons upstream and downstream of the absorber (see figure), received muons generated by intercepting the proton beam from the ISIS facility with a cylindrical titanium target. The absorber was either liquid hydrogen in a tank with thin windows or solid lithium hydride, in both cases surrounded by coils to achieve the necessary tight focus, and maximise transverse cooling.
A full muon-ionisation cooling channel would work by progressively damping the transverse momentum of muons over multiple cooling cells while restoring lost longitudinal momentum in radio-frequency cavities. However, due to issues with the spectrometer solenoids and the challenges of integrating the four-cavity linac module with the coupling coil, explains spokesperson Ken Long of Imperial College London, MICE adopted a simplified design without cavities. “MICE has demonstrated ionisation cooling,” says Long. The next issues to be addressed, he says, are to demonstrate the engineering integration of a demonstrator in a ring, cooling down to the lower emittances needed at a muon collider, and investigations into the effect of bulk ionisation on absorber materials. “The execution of a 6D cooling experiment is feasible – and is being discussed in the context of the Muon Collider Working Group.”
Twists and turns
The MICE experiment took data during 2017 and the collaboration confirmed muon cooling by observing an increased number of “low-amplitude” muons after the passage of the muon beam through an absorber. In this context, the amplitude is an additive contribution to the overall emittance of the beam, with a lower emittance corresponding to a higher density of muons in transverse phase space. The feat presented some extraordinary challenges, says MICE physics coordinator Chris Rogers of RAL. “We constructed a densely packed 12-coil and three-cryostat magnet assembly, with up to 5 MJ of stored energy, which was capable of withstanding 2 MN inter-coil forces,” he says. “The muons were cooled in a removable 22-litre vessel of potentially explosive liquid hydrogen contained by extremely thin aluminium windows.” The instrumentation developed to measure the correlations between the phase-space coordinates introduced by the solenoidal field is another successful outcome of the MICE programme, says Rogers, making a single-particle analysis possible for the first time in an accelerator-physics experiment.
“We started MICE in 2000 with great enthusiasm and a strong team from all continents,” says MICE founding spokesperson Alain Blondel of the University of Geneva. “It has been a long and difficult road, with many practical novelties to solve, however the collaboration has held together with exceptional resilience and the host institution never failed us. It is a great pride to see the demonstration achieved, just at a time when it becomes evident to many new people that we must include muon machines in the future of particle physics.”
Fast radio bursts (FRBs) are a relatively new mystery within astrophysics. Around 100 of these intense few-millisecond bursts of radio waves have been spotted since the first detection in 2007, and hardly anything is known about their origin. Thanks to close collaboration between different radio facilities and lessons learned from the study of previous astrophysical mysteries such as quasars, our understanding of these phenomena is evolving rapidly. During the past year or so, several FRBs have been localised in different galaxies, strongly suggesting that they are extra-galactic. A newly published FRB measurement, however, casts doubt on their underlying origin.
As recently as one year ago, only a few tens of FRBs had been measured. One of these was of particular interest because, unlike the single-event nature of all other known FRBs, it produced several radio signals within a short time scale – earning it the nickname “the repeater”. This could imply that, while all other FRBs were the result of some type of cataclysmic event, the repeater was an altogether different kind of source that just happened to produce a similar signal. Adding to the intrigue, measurements also showed it to lie in a rather peculiar low-metallicity dwarf galaxy, close to a supermassive black hole within that host galaxy.
Much has happened in the field of FRBs since then, mainly thanks to data from new facilities such as ASKAP in Australia, CHIME in Canada (pictured above), and FAST in China. A number of new FRBs have been detected, including nine more repeaters. Additionally, the new range of facilities has allowed for more detailed localisation measurements, including some for non-repeating FRBs, which are more challenging because of their unpredictable occurrence. Since non-repeating bursts were found to lie in more conventional galaxies than that of the repeater, a fully different origin for the two types of FRBs seemed the more likely explanation.
The latest localisation of an FRB, using data from CHIME and subsequent triangulation via eight radio telescopes of the European VLBI network, throws this theory into question. Writing in Nature, the international team found that another repeater was not only the closest FRB found to date (at a distance of 500 million light-years), but was also located in a star-forming region of a galaxy not that different from the Milky Way – and therefore very different from the host of the other localised repeating FRB. This precise localisation, which allowed astronomers to pinpoint the source to within an area just seven light-years across, indicates that extreme environments are not required for repeating FRBs. Additionally, some of the repeated signals from this source were so weak that they would not have been detectable had they originated at the larger distances of the known non-repeating FRBs. The latter finding casts doubt on the idea of two distinct classes of FRBs: the non-repeaters may simply be too far away for their fainter bursts to reach us.
Although these latest findings give new insights into the quickly evolving field of FRBs, it is clear that more measurements are required. The new radio facilities will soon make population studies possible. Such population studies have previously answered many questions in the fields of gamma-ray bursts and quasars, which in their early stages showed large similarities with the state FRB studies are in now. They could show whether the vastly different environments in which the two localised repeaters are found are simply peculiarities, or whether FRBs can be produced in a wide range of environments. Additionally, studies of burst intensities and of the distances to their sources will be able to show whether repeaters and non-repeaters differ only because of their distance.
The International Linear Collider (ILC), currently being considered to be hosted in the Tohoku region of Japan, has not been selected as a high-priority project in the country’s 2020 “master plan” for large research projects. The master plan, which is compiled every three years, was announced on 30 January by the Science Council of Japan (SCJ). Among the 31 projects that did make it onto the high-priority list were the Super-B factory at KEK, the KAGRA gravitational-wave laboratory and an upgrade of the J-PARC facility.
“Even though the ILC did not go into the final shortlist, it was selected as one of the projects that went to the hearing stage indicating that the scientific merit of the ILC was recognized by the committee,” said ILC director Shin Michizono. “This allows the ILC project to move to the next phase.”
In 2012, physicists in Japan submitted a petition to the Japanese government to host the ILC, an electron–positron collider serving as a Higgs factory. A technical design report was published the following year and, in 2017, the original ILC design was revised to reduce its centre-of-mass energy by half (to 250 GeV), shortening the machine by around a third. In 2018, the International Committee for Future Accelerators (ICFA) issued a statement of support for the project, but in March last year Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) announced that it had “not yet reached declaration” for hosting the ILC and that the project “requires further discussion in formal academic decision-making processes such as the SCJ master plan”.
The important thing is that discussions on how to share the burden start soon.
Lyn Evans
At a press conference held on 31 January, the state minister for MEXT, Koichi Hagiuda, responded positively to the contents of the SCJ document. “This has been put together from the viewpoint of people representing the academic community, and we believe that it will serve as a reference for future discussions within the government. Being an international project, the ILC project requires broad support from both inside and outside the country. In light of the outcome of the Master Plan 2020, and observing the progress of other discussions such as the European Strategy for Particle Physics, we would like to carefully carry forward the discussions.”
Naokazu Takemoto, a member of the Japanese government’s cabinet office and minister of state for science and technology policy, said: “To put it simply, the project made it through the first round of evaluations, and there were about 60 such projects. In the second round, 31 projects were selected, and the ILC was not among them. However, this is a viewpoint of the Science Council. When considering the possibilities going forward, MEXT will look at high-priority research topics, and I hear that the ILC will be included in the list of these topics.” Responding to a question about the cost of the ILC, Takemoto continued: “The cost is to be shared among many countries, but some say that Japan needs to shoulder most of it. Even if these are the presumptions, I personally think we should strongly ask for realizing the project. It will effectively contribute to regional revitalisation. It will give back hope to people who have suffered greatly from the [damage caused by a tsunami in 2011]. Furthermore, it will give Japan’s technology an advantage to have an important share in the area of the world’s scientific research.”
MEXT representatives are expected to update the community on 20 February during the 85th meeting of ICFA at SLAC National Laboratory in the US.
“It is no surprise that the ILC is not on the SCJ list,” says Lyn Evans, director of the Linear Collider Collaboration. “It is of a different order of magnitude to any other project the committee considered. It also requires broad international collaboration. The important thing is that discussions on how to share the burden start soon.”
Following a week of discussions, the European Strategy Group has released a statement reporting convergence on recommendations to guide the future of high-energy physics in Europe. The 60-or-so delegates, among them scientific representatives from each of CERN’s member and associate-member states, directors and representatives of major European laboratories and organisations, and invitees from outside Europe, now return home. Their recommendations will be presented to the CERN Council in March and made public at an event in Budapest, Hungary, on 25 May.
Statement from the European Strategy Group after the Bad Honnef drafting meeting, 25 January
The drafting session of the European Strategy Group preparing the next European Particle Physics Strategy Update took place in Bad Honnef (Germany) from 21 to 25 January 2020. After a week of fruitful discussions involving senior figures of European and international particle physics, convergence was achieved on recommendations that will guide the future of the field.
The drafting session marks a key stage of the strategy update process. The attendees of the Bad Honnef drafting session successfully carried out their ambitious task of identifying a set of priorities and recommendations. They built on the impressive progress made since the last update of the European Strategy for Particle Physics, in 2013, and the rich input received from the entire particle physics community in the current update process.
The next step in this process will be to submit the document outlining the recommendations to the CERN Council. It will be discussed by the Council in March and submitted for final approval at an extraordinary Council Session on 25 May, in Budapest, Hungary. Once approved, it can be made public.