
Success in scientific management

Barry Barish

Your co-Nobelists in the discovery of gravitational waves, Kip Thorne and Rainer Weiss, have both recognised your special skills in the management of the LIGO collaboration. When you landed in LIGO in 1994, what was the first thing you changed?

When I arrived in LIGO, there was a lot of dysfunction and people were going after each other. So, the first difficult problem was to make LIGO smaller, not bigger, by moving out people who weren’t going to be able to contribute constructively in the longer term. Then I started to address what I felt were the technical and management weaknesses. Together with my colleague Gary Sanders, who had worked with me on one of the would-be detectors for the Superconducting Super Collider (SSC) before the project was cancelled, I started looking for the kind of people who were missing in technical areas.

For example, LIGO relies on very advanced lasers, but I was convinced that the laser being planned for, a gas laser, was not the best choice: lasers were, and still are, a very fast-moving technology, and solid-state lasers were more forward-looking. Coming from particle physics, I’m used to not seeing a beam with my own eyes, so I wasn’t disturbed that the most promising lasers at the time emitted light in the infrared instead of green, and that the technology had advanced to the point where they could be built in industry. People who worked with interferometers were used to “little optics” on lab benches, where the lasers were all green and the alignment of mirrors and so on was straightforward. I asked three of the most advanced groups in the world working on lasers of the type we needed (Hannover in Germany, Adelaide in Australia and Stanford in California) if they’d like to work together with us, and we brought these experts into LIGO to form the core of what we still have today as our laser group.

Project management for forefront science experiments is very different, and it is hard for people to do it well

This story is mirrored in many of the different technical areas in LIGO. Physics expertise and expertise in the use of interferometer techniques were in good supply in LIGO, so the main challenge was to find expertise to develop the difficult forefront technologies that we were going to depend on to reach our ambitious sensitivity goals. We also needed to strengthen the engineering and project-management areas, but that just required recruiting very good people. Later, the collaboration grew a lot, but mostly on the data-analysis side, which today makes up much of our collaboration.

According to Gary Sanders of SLAC, “efficient management of large science facilities requires experience and skills not usually found in the repertoire of research scientists”. Are you a rare exception?

Gary Sanders was a student of Sam Ting; then he went to Los Alamos, where he got a lot of good experience doing project work. For myself, I learned what was needed rather organically as my own research grew into larger and larger projects. Maybe my personality matched the problem, but I also studied the subject. I know how engineers go about building a bridge, for example, and I could pass an exam in project management. But project management for forefront science experiments is very different, and it is hard for people to do it well. If you build a bridge, you have a boss, who has three or four people doing tasks under their supervision, so a large engineering project is generally structured as a big hierarchical organisation. Doing a physics research project is almost the opposite. Once you’ve built a bridge, it’s a bridge, and you don’t change it. When you build a physics experiment, it usually doesn’t do what you want it to do: you begin with one plan and then decide to change to another, or even while you’re building it you develop better approaches and technologies that will improve the instruments. Experience tells us that research in physics needs a flat, rather than vertical, organisational style. So you can’t build a complicated, expensive, ever-evolving research project using just what’s taught in the project-management books, and you can’t meet goals for cost, schedule and performance in the style of a typical physics-department research group. You have to employ some sort of hybrid. Whether it’s LIGO or an LHC experiment, you need enough discipline to make sure things are done on time, yet you also need the flexibility, and the encouragement, to change things for the better. In LIGO, we judiciously adapted various project-management formalities, applying them while interfering no more than necessary with what we do in a research environment. Then the only problem – but admittedly a big one – is to get the researchers, who don’t like any structure, to buy into this approach.

How did your SSC experience help?

It helped with the political part, not the technical part, because I came to realise how difficult the politics and the things outside a project can be. Almost everything I had worked on before was very hard, because of what it was or because of some politics in doing it, but I had never faced enormous problems that were totally outside my control, as we had in the SSC.

How did you convince the US government to keep funding LIGO, which has been described as the most costly project in the history of the NSF?

It’s a miracle, because not only was LIGO costly, but we didn’t have much to show in terms of science for more than 20 years: we were funded in 1994, and we made the first detection more than 20 years later. I think the miracle wasn’t me; rather, we were in a unique situation in the US. Our funding agency, the NSF, has a different mission from any other agency I know of. In the US, the physical sciences are funded by three big agencies. One is the DOE, which has a division doing research in various areas, with national labs that have their own structures and missions. The other big agency doing physical science is NASA, which has the challenge of safety in space. The NSF gets less money than the other two agencies, but it has a mission I would characterise in one word: science. LIGO has so far seen five different NSF directors, but all of them were prominent scientists. Having the director of the funding agency be someone who understood the potential importance of gravitational waves, maybe not in detail, helped the NSF decide both to take such a big risk on LIGO and then to continue supporting it until it succeeded. The NSF leadership understands that risk-taking is integral to making big advances in science.

What was your role in LIGO apart from management?

I concentrated more on the technical side in LIGO than on data analysis. In LIGO, the analysis challenges are more theoretical than they are in particle physics. What we have to do is compare general relativity with what happens in a real physical phenomenon that produces gravitational waves. That involves more of a mixed problem of developing numerical relativity, as well as sophisticated data-analysis pipelines. Another challenge is the huge amount of data because, unlike at CERN, there are no triggers. We just take data all the time, so sorting through it is the analysis problem. Nevertheless, I’ve always felt and still feel that the real challenge for LIGO is that we are limited by how sensitive we can make the detector, not by how well we can do the data analysis.

What are you doing now in LIGO?

Now that I can do anything I want, I am focusing on something I am interested in and that we don’t employ very much: artificial intelligence and machine learning (ML). In LIGO there are several problems that, with recent advances, could lend themselves very well to ML. So we built a small group of people, mostly much younger than me, to do ML in LIGO. I recently started teaching at the University of California Riverside, and have begun working with young faculty in the university’s computer-science department on adapting ML techniques to problems in physics. In LIGO, we have a problem in the data that we call “glitches”, which arise when something that happens in the apparatus or in the outside world shows up in the data. We need to get rid of glitches, and we currently use a lot of human effort to make the data clean. This is a problem that should lend itself very well to an ML analysis.

Now that gravitational waves have joined the era of multi-messenger astronomy, what’s the most exciting thing that can happen next?

For gravitational waves, knowing what discovery you are going to make is almost impossible, because they are a totally new probe of the universe. Nevertheless, there are some known sources that we should be able to see soon, maybe even in the present run. So far we’ve seen two sources of gravitational waves: the collision of two black holes and the collision of two neutron stars, but we haven’t yet seen a black hole with a neutron star orbiting it. Such systems are particularly interesting scientifically because they contain information about the nuclear physics of very compact objects, and because the two objects are very different in mass, which is very difficult to handle in numerical relativity. So it’s not just ticking off another source, but opening new areas of gravitational-wave science. Another attractive possibility is to detect a spinning neutron star, a pulsar: a continuous signal that is another interesting source we hope to detect before long. Actually, I’m most interested in seeing unanticipated sources, where we have no idea what we’re going to see – perhaps phenomena that happen uniquely in gravity alone.

The NSF leadership understands that risk-taking is integral to making big advancements

Will we ever see gravitons?

That’s a really good question because gravitons don’t exist in Einstein’s equations. But that’s not necessarily nature, that’s Einstein’s equations! The biggest problem we have in physics is that we have two fantastic theories. One describes almost anything you can imagine on a large scale, and that’s Einstein’s equations, and the other, which describes almost too well everything you find here at CERN, is the Standard Model, which is based on quantum field theory. Maybe black holes have the feature that they satisfy Einstein’s equations and at the same time conserve quantum numbers and all the things that happen in quantum physics. What we are missing is the experimental clue, whether it’s gravitons or something else that needs to be explained by both these theories. Because theory alone has not been able to bring them together, I think we need experimental information.

Do particle accelerators still have a role in this?

We never know because we don’t know the future, but our best way of understanding what limits our present understanding has been traditional particle accelerators because we have the most control over the particles we’re studying. The unique feature of particle accelerators is that of being able to measure all the parameters of particles that we want. We’ve found the Higgs boson and that’s wonderful, but now we know that the neutrinos also have mass and the Higgs boson possibly doesn’t describe that. We have three families of particles, and a whole set of other very fundamental questions that we have no handle on at all, despite the fact that we have this nice “standard” model. So is it a good reason to go to higher energy or a different kind of accelerator? Absolutely, though it’s a practical question whether it’s doable and affordable.

What’s the current status of gravitational-wave observatories?

We will continue to improve the sensitivity of LIGO and Virgo in incremental steps over the next few years, and LIGO will add a detector in India to give better global coverage. KAGRA in Japan is also expected to come online. But we can already see that next-generation interferometers will be needed to pursue the science in the future. A good design study, called the Einstein Telescope, has been developed in Europe. In the US we are also looking at next-generation detectors and have different ideas, which is healthy at this point. We are not limited by nature, but by our ability to develop the technologies to make more sensitive interferometers. The next generation of detectors will enable us to reach large redshifts and study gravitational-wave cosmology. We all look forward to exploiting this new area of physics, and I am sure important discoveries will emerge.

David Mark Ritson 1924–2019

David Ritson with Bjørn Wiik

David Mark Ritson, professor emeritus of physics at Stanford University, died peacefully at home on 4 November 2019, just shy of his 95th birthday. He was the last of the leaders of the original seven physics groups formed at SLAC: four of the other leaders were awarded Nobel prizes in physics.

Dave Ritson was born in London and grew up in Hampstead. His ancestors emigrated from Australia, Germany and Lithuania, and his father, a Cambridge alumnus, wrote Helpful Information and Guidance for Every Refugee, distributed in the 1930s and 1940s. Dave won scholarships to Merchant Taylors’ School and to Christ Church, Oxford. His 1948 PhD work included deploying the first high-sensitivity emulsion at the Jungfraujoch research station, and then developing it. Within the data were two particle-physics icons: the whole π → μ → e sequence, and τ-meson decay.

Dave moved to the Dublin IAS, to Rochester and to MIT, doing experiments which helped prove that the s-quark exists. His results were among many that underpinned the “τ–θ puzzle”, solved by the discovery of parity violation in beta and muon decay. Dave also assisted accelerator physicist Ken Robinson with the proof that stable storage of an electron beam in a synchrotron was possible. In 1961 he and Ferdinando Amman published the equation for the disruption caused by colliding e+e– beams. “Low-beta” collider interaction regions are based on the Amman–Ritson equation.

Dave edited the book Techniques of High Energy Physics, published in 1961, and then took a faculty position in the Stanford physics department – bringing British acuity and economy to the ambitious SLAC team. Between 1964 and 1969, he and Burt Richter submitted four proposals to the US Atomic Energy Commission (AEC) for an e+e– collider, all of which were rejected. Dave designed the 1.6 GeV spectrometer in End Station A to detect proton recoils, which were used to reconstruct “missing mass” and to measure the photoproduction of hard-to-detect bosons.

After 1969 Dave founded Fermilab E-96, the Single Arm Spectrometer Facility, and obtained contributions from many institutions, including Argonne, CERN, Cornell, INFN Bari, MIT and SLAC. It was unusual for accelerator labs to support the fabrication of experiments at other labs’ facilities. Meanwhile, SLAC found internal funding for the SPEAR e+e– collider, a stripped-down version of the last proposal rejected by the AEC and led by Richter, driving the epic 1974 c-quark discovery.

Dave returned to SLAC and in 1976 led the formation of the MAC collaboration for SLAC’s new PEP e+e– collider. The MAC design of near-hermetic calorimetry with central and toroidal outer spectrometers is now classic. Bill Ford from Colorado used MAC to make the first observation of the long b-quark lifetime. In 1983 Dave led the close-in tracker (vertex detector) project, with the first layer only 4.6 cm from the e+e– beams, and verified the long b-quark lifetime with reduced errors.

He formally retired in 1987 but was active until 2003 in accelerator design at SLAC, CERN, Fermilab and for the SSC. He helped guide the SLC beams through their non-planar path into collision, and wrote several articles for Nature. He also contributed to the United Nations’ Intergovernmental Panel on Climate Change.

Dave was intensely devoted to his wife Edda, from Marsala, Sicily, who died in 2004, and is survived by their five children.

Vladislav Šimák 1934–2019

Vladislav Šimák

Vladislav Šimák, experimental particle physicist and founder of antiproton physics in Czechoslovakia (later the Czech Republic), passed away on 26 June 2019. From the early 1960s his vision and organisational skills helped shape experimental particle physics, not only in Prague but in the whole country.

After graduating from Charles University in Prague, he joined the group at the Institute of Physics of the Czechoslovak Academy of Sciences studying cosmic rays using emulsion techniques, earning a PhD in 1963. Though it was difficult to travel abroad at that time, Vlada got a scholarship and went to CERN, where he joined the group led by Bernard French investigating collisions of antiprotons using bubble chambers. It was there and then that his lifelong love affair with antiprotons began. He brought back to Prague film material showing the results of collisions of 5.7 GeV antiprotons and protons from a hydrogen bubble chamber, and formed a group of physicists and technicians, involving many diploma and PhD students who processed them. Vlada also fell in love with the idea of quarks as proposed by Gell-Mann and Zweig, and was the first Czech or Slovak physicist to apply a quark model to pion production in proton–antiproton collisions.

In the early 1970s, when contacts with the West were severely limited, Vlada exploited the experiences he accumulated at CERN and put together a group of Czech and Slovak physicists involved in the processing and analysis of data from proton–antiproton collisions, using the then-highest-energy beam of antiprotons (22.4 GeV) and a hydrogen bubble chamber at the Serpukhov accelerator in Russia. This experiment, which in the later stage provided collisions of antideuterons with protons and deuterons, gave many young physicists the chance to work on unique data for their PhDs and earned Vlada respect in the international community.

After the Velvet Revolution he played a pivotal role in accession to CERN membership

In the late 1980s, when the political atmosphere in Czechoslovakia eased, Vlada together with his PhD student joined the UA2 experiment at CERN’s proton–antiproton collider, where he devoted his attention to jet production. After the Velvet Revolution in November 1989 he played a pivotal role in the decision of the Czech and Slovak particle-physics community to focus on accession to CERN membership.

In 1992 Vlada took Czechoslovak particle physicists into the newly formed ATLAS collaboration, and in 1997 he joined the D0 experiment at Fermilab. He was active in ATLAS until very recently, and in 2014, in acknowledgment of his services to physics, the Czech Academy of Sciences awarded Vlada the Ernst Mach Medal for his contributions to the development of physics.

Throughout his life he combined his passion for physics with a love for music, for many years playing the violin in the Academy Chamber Orchestra. For many of us Vlada was a mentor, colleague and friend. We all admired his vitality and enthusiasm for physics, which was contagious. Vlada clearly enjoyed life and we very much enjoyed his company.

He will be sorely missed.

A recipe for sustainable particle physics

The SESAME light source

There has been a marked increase in awareness of climate change in society. Whether due to the recent school strikes initiated by Greta Thunberg or the destructive bushfires gripping Australia, the climate emergency has moved up the public’s list of concerns. Governments around the world have put in place targets to reduce greenhouse-gas emissions as part of the 2015 Paris Agreement, informed by the assessments of the Intergovernmental Panel on Climate Change (IPCC). The scientific community, like others, will increasingly be expected to put in place measures to reduce its greenhouse-gas emissions. It is therefore timely to create structures that will minimise the carbon footprint of current and future experiments, and of their researchers.

The LHC uses 1.25 TWh of electricity annually, the equivalent of powering around 300,000 homes, or roughly 2% of the annual consumption of Switzerland. Fortunately, the electricity supply of the LHC comes from France, where only about 10% of electricity is produced by fossil fuels. CERN is adopting several green initiatives. For example, it recently released plans to use hot water from a cooling plant at Point 8 of the LHC (where the LHCb detector is situated) to heat 8000 homes in the nearby town of Ferney-Voltaire. In 2015, CERN introduced an energy-management panel and the laboratory is about to publish a wide-ranging environmental report. CERN is also involved in the biennial workshop series Energy for Sustainable Science at Research Infrastructures, which started in 2011 and is where useful ideas are shared among research infrastructures. Whether it be related to high-performance computing or the LHC’s cryogenic systems, increased energy efficiency both reduces CERN’s carbon footprint and provides financial savings.
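The figures above can be cross-checked with simple arithmetic. The sketch below assumes round numbers not given in the article: an average household consumption of about 4,000 kWh per year and a Swiss national consumption of roughly 58 TWh per year.

```python
# Rough sanity check of the LHC electricity figures quoted in the text.
# Assumed round numbers (not from the article): an average household uses
# ~4,000 kWh/year; Switzerland consumes ~58 TWh/year.
lhc_twh = 1.25                 # LHC annual consumption, from the text
household_kwh = 4_000          # assumed average annual household use
swiss_twh = 58                 # assumed Swiss annual consumption

homes = lhc_twh * 1e9 / household_kwh       # 1 TWh = 1e9 kWh
share_of_switzerland = lhc_twh / swiss_twh

print(f"{homes:,.0f} homes")                        # ~312,500: close to "around 300,000"
print(f"{share_of_switzerland:.1%} of Swiss use")   # ~2.2%: close to "roughly 2%"
```

Both results land close to the article's quoted comparisons, which gives some confidence in the 1.25 TWh figure.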

It is a moral imperative for the community to look at ways to reduce its carbon footprint

In addition to colliders, particle physics also involves detectors, some of which need particular gases for their operation or cooling. Unfortunately, some of these gases have a very high global-warming potential. For example, sulphur hexafluoride, which is commonly used in high-voltage supplies and in certain detectors such as the resistive plate chambers in the ATLAS muon spectrometer, causes 16,000 times more warming than CO2 over a 20-year period. Though mostly used in closed circuits, some of these gases are occasionally vented to the atmosphere or leak from detectors. Although the quantities involved are small, it is likely that some of the gases used by current detectors will soon be banned by many countries, making them very hard to procure and their prices volatile. A lot is already being done to combat this issue: at CERN, for instance, huge efforts have gone into replacing detector cooling fluids and investigating new gas mixtures.
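To see why even small leaks matter, the global-warming-potential factor quoted above converts a leaked mass directly into a CO2 equivalent. The 1 kg leak below is a purely illustrative assumption:

```python
# CO2-equivalent of a small SF6 leak over a 20-year horizon, using the
# factor of ~16,000 quoted in the text. The leak size is hypothetical.
GWP20_SF6 = 16_000     # 20-year global-warming potential relative to CO2
leak_kg = 1.0          # assumed leak of 1 kg of SF6

co2e_tonnes = leak_kg * GWP20_SF6 / 1_000   # kg -> tonnes
print(f"{co2e_tonnes:.0f} tonnes CO2-equivalent")   # 16 tonnes from a single kg
```

A single kilogram of vented SF6 is thus comparable to several times a typical person's annual carbon footprint, which is why closed-circuit operation and recapture matter.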

Strategic approach

The European particle-physics community is currently completing the update of its strategy for the next five years or so, which will guide not only CERN’s activities but also those in all European countries. It is of the utmost importance that sustainability goals be included in this strategy. To this end, my colleagues Cham Ghag and David Waters (University College London) and Francesco Spano (Royal Holloway) and I arrived at three main recommendations on sustainability as input to the strategy process.

Véronique Boisvert

First, as part of their grant-giving process, European laboratories and funding agencies should include criteria evaluating the energy efficiency and carbon footprint of particle-physics proposals, and should expect to see evidence that energy consumption has been properly estimated and minimised. Second, any design for a major experiment should include plans for reducing energy consumption, increasing energy efficiency, recovering energy and offsetting carbon. (Similarly, any design for new buildings should meet the highest energy-efficiency standards.) Third, European laboratories should invest in next-generation digital meeting spaces, including virtual-reality tools, to minimise the need for frequent travel. Many environmental groups are calling for a frequent-flyer levy, since roughly 15% of the population take about 70% of all flights. This could have a massive effect on the travel budgets of particle physicists, but it is a moral imperative for the community to look at ways to reduce this carbon footprint. Another area that the IPCC has identified as needing massive change is food: particle physicists could send a very powerful message by making their work-related catering mostly vegetarian.

Particle physics is flush with ideas for future accelerators and technologies to probe deeper into the structure of matter. CERN and particle physicists are important role models for all the world’s scientific community. Channelling some of our scientific creativity into addressing the sustainability of our own field, or even finding solutions for climate change, will produce ripples across all of society.

Anomalies persist in flavour-changing B decays

The distribution of the angular variable P5′ as a function of the mass squared of the muon pair, q². The LHCb Run 1 results (red), those from the additional 2016 dataset only (blue), and those from both datasets (black) are shown along with the SM predictions (orange). Credit: LHCb

The LHCb collaboration has confirmed previous hints of odd behaviour in the way B mesons decay into a K* and a pair of muons, bringing fresh intrigue to the pattern of flavour anomalies that has emerged during the past few years. At a seminar at CERN on 10 March, Eluned Smith of RWTH Aachen University presented an updated analysis of the angular distributions of B0→K*0μ+μ– decays, based on around twice as many events as were used for the collaboration’s previous measurement, reported in 2015. The result reveals a mild increase in the overall tension with the Standard Model (SM) prediction, though, at 3.3σ, more data are needed to determine the source of the effect.

The B0→K*0μ+μ– decay is a promising system with which to explore physics beyond the SM. A flavour-changing neutral-current process, it involves a quark transition (b→s) that is forbidden at the lowest perturbative order in the SM, and therefore occurs only around once in every million B decays. The decay proceeds instead via higher-order penguin and box processes, which are sensitive to the presence of new, heavy particles. Such particles would enter in competing processes and could significantly change the B0→K*0μ+μ– decay rate and the angular distribution of its final-state particles. Measuring angular distributions as a function of the invariant mass squared (q²) of the muon pair is of particular interest because it is possible to construct variables that depend less on hadronic modelling uncertainties.

Potentially anomalous behaviour in an angular variable called P5′ came to light in 2013, when LHCb reported a 3.7σ local deviation with respect to the SM in one q² bin, based on 1 fb–1 of data. In 2015, a global fit to different angular distributions of B0→K*0μ+μ– decays using the total Run 1 data sample of 3 fb–1 reaffirmed the puzzle, showing discrepancies of 3.4σ (later reduced to 3.0σ when using new theory calculations with an updated description of potentially large hadronic effects). In 2016, the Belle experiment at KEK in Japan performed its own angular analysis of B0→K*0μ+μ– using data from electron–positron collisions and found a 2.1σ deviation in the same direction and in the same q² region as the LHCb anomaly.

We as a community have been eagerly waiting for this measurement and LHCb has not disappointed

Jure Zupan

The latest LHCb result includes additional Run 2 data collected during 2016, corresponding to a total integrated luminosity of 4.7 fb–1. It shows that the local tension in P5′ in two q² bins between 4 and 8 GeV²/c⁴ reduces from 2.8σ and 3.0σ, as observed in the previous analysis, to 2.5σ and 2.9σ. However, a global fit to several angular observables shows that the overall tension with the SM increases from 3.0σ to 3.3σ. The results of the fit also show better overall agreement with the predictions of new-physics models that contain additional vector or axial-vector contributions. However, the collaboration also makes it clear that the discrepancy could be explained by an unexpectedly large hadronic effect that is not accounted for in the SM predictions.

“We as a community have been eagerly waiting for this measurement and LHCb has not disappointed,” says theorist Jure Zupan of the University of Cincinnati. “The new measurements have moved closer to the SM predictions in the angular observables so that the combined significance of the excess remained essentially the same. It is thus becoming even more important to understand well and scrutinise the SM predictions and the claimed theory errors.”

Flavour puzzle
The latest result makes LHCb’s continued measurements of lepton-flavour universality even more important, he says. In recent years, LHCb has also found that the ratio of the rates of muonic and electronic B decays departs from the SM prediction, suggesting a violation of the key SM principle of lepton-flavour universality. Though not individually statistically significant, the measurements are theoretically very clean, and the most striking departure – in the variable known as RK – concerns B decays that proceed via the same b→s transition as B0→K*0μ+μ–. This has led physicists to speculate that the two effects could be caused by the same new physics, with models involving leptoquarks or new gauge bosons in principle able to accommodate both sets of anomalies.

An update on RK based on additional Run 2 data is hotly anticipated, and the collaboration is also planning to add data from 2017–18 to the B0→K*0μ+μ– angular analysis, as well as working on further analyses of b-quark transitions in mesons. LHCb also recently brought the decays of beauty baryons, which also depend on b→s transitions, to bear on the subject. Departures from the norm have also been spotted in B decays to D mesons, which involve tree-level b→c quark transitions. Such decays probe lepton-flavour universality via comparisons between tau leptons and muons or electrons but, as with RK, the individual measurements are not highly significant.

“We have not seen evidence of new physics, but neither were the B physics anomalies ruled out,” says Zupan of the LHCb result. “The wait for the clear evidence of new physics continues.”

Cosmology and the quantum vacuum

The sixth Cosmology and the Quantum Vacuum conference attracted about 60 theoreticians to the Institute of Space Sciences in Barcelona from 5 to 7 March. This year the conference marked Spanish theorist Emilio Elizalde’s 70th birthday. He is a well-known specialist in mathematical physics, field theory and gravity, with over 300 publications and three monographs on the Casimir effect and zeta regularisation. He has co-authored remarkable works on viable theories of modified gravity that unify inflation with dark energy.

These meetings bring together researchers who study theoretical cosmology and various aspects of the quantum vacuum, such as the Casimir effect. This quantum effect manifests itself as an attractive force between plates brought extremely close together. As it is related to the quantum vacuum, it is expected to be important in cosmology as well, giving a kind of effective induced cosmological constant. Manuel Asorey (Zaragoza), Mike Bordag (Leipzig) and Aram Saharian (Erevan) discussed various aspects of the Casimir effect for scalars and for gauge theories. Joseph Buchbinder gave a review of the one-loop effective action in supersymmetric gauge theories. Conformal quantum gravity and quantum electrodynamics in de Sitter space were presented by Enrique Alvarez (Madrid) and Drazen Glavan (Brussels), respectively.

Enrique Gaztanaga argued for two early inflationary periods

Even more attention was paid to theoretical cosmology. The evolution of the early and/or late universe in different theories of modified gravity was discussed by several delegates, with Enrique Gaztanaga (Barcelona) expressing an interesting point of view on the inflationary universe, arguing for two early inflationary periods.

Martiros Khurshyadyan and I discussed modified-gravity cosmology with the unification of inflation and dark energy, and wormholes, building on work with Emilio Elizalde. Wormholes are usually associated with exotic matter; in alternative theories of gravity, however, they may instead arise from modifications to the gravitational equations of motion. Iver Brevik (Trondheim) gave an excellent introduction to viscosity in cosmology. Rather exotic wormholes were presented by Sergey Sushkov (Kazan), while black holes in modified gravity were discussed by Gamal Nashed (Cairo). A fluid approach to the dark-energy epoch and the addition of four-forms (antisymmetric tensor fields with four indices) to late-universe evolution were presented by Diego Saez (Valladolid) and Mariam Bouhmadi-Lopez (Bilbao), respectively. Novel aspects of non-standard quintessential inflation were presented by Jaime Haro (Barcelona).

Many interesting talks were given by young participants at this meeting. The exchange of ideas between cosmologists on the one side and quantum-field-theory specialists on the other will surely help in the further development of rigorous approaches to the construction of quantum gravity. It also opens the window onto a much better account of quantum effects in the history of the universe.

Scoping out the Einstein Telescope

The layout of the ETpathfinder facility

In a former newspaper printing plant in the southern Dutch town of Maastricht, the future of gravitational-wave detection is taking shape. In a huge hall, known to locals as the “big black box”, construction of a facility called ETpathfinder has just got under way, with the first experiments due to start as soon as next year. ETpathfinder will be a testing ground for the new technologies needed to detect gravitational waves in frequency ranges that the present generation of detectors cannot cover. At the same time, plans are being developed for a full-scale gravitational-wave detector, the Einstein Telescope (ET), in the Dutch–Belgian–German border region. Related activities are taking place 1500 km south in the heart of Sardinia, Italy. In 2023, one of these two sites (shortlisted from a total of six possible European locations) will be selected as the location of the proposed ET.

In 2015, the Laser Interferometer Gravitational-Wave Observatory (LIGO), which is based at two sites in the US, made the first direct detection of a gravitational wave. The Virgo observatory near Pisa in Italy came online soon afterwards, and the KAGRA observatory in Japan is about to become the third major gravitational-wave observatory in operation. All are L-shaped laser interferometers that detect relative differences in light paths between mirrors spaced far apart (4 km in LIGO; 3 km in Virgo and KAGRA) at the ends of two perpendicular vacuum tubes. A passing gravitational wave changes the relative path lengths by as little as one part in 10²¹, which is detectable via the interference between the two light paths. Since 2015, dozens of gravitational waves have been detected from various sources, providing a new window onto the universe. One event has already been linked to astronomical observations in other channels, marking a major step forward in multi-messenger astronomy (CERN Courier December 2017 p17).
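To get a feel for the numbers quoted above, here is a back-of-envelope check of the arm-length change that a strain of one part in 10²¹ implies for a 4 km LIGO arm; the proton-radius comparison is an illustrative addition, not a figure from the text.

```python
# Back-of-envelope: arm-length change produced by a strain of 1e-21
# in a 4 km LIGO arm (numbers taken from the article).
strain = 1e-21          # relative length change, one part in 10^21
arm_length_m = 4_000.0  # LIGO arm length in metres

delta_L = strain * arm_length_m
print(f"Arm-length change: {delta_L:.1e} m")  # ~4e-18 m

# For scale: a few thousandths of a proton radius (~0.8e-15 m).
proton_radius_m = 0.8e-15
print(f"Fraction of a proton radius: {delta_L / proton_radius_m:.1e}")
```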

Back in time

The ET would be at least 10 times more sensitive than Advanced LIGO and Advanced Virgo, extending its scope for detections and enabling physicists to look back much further in cosmological time. For this reason, the interferometer has to be built at least 200 m underground in a geologically stable area, its mirrors have to operate in cryogenic conditions to reduce thermal disturbance, and they have to be larger and heavier to allow for a larger and more powerful laser beam. The ET would be a triangular laser interferometer with sides of 10 km and four ultra-high vacuum tubes per tunnel. The triangle configuration is equivalent to three overlapping interferometers with two arms each, allowing sources in the sky to be pinpointed via triangulation from just one location instead of several as needed by existing observatories. First proposed more than a decade ago and estimated to cost close to €2 billion, the ET, if approved, is expected to start looking at the sky sometime in the 2030s.

The surface above the Sar-Grav laboratory in Sardinia

“In the next decade we will implement new technologies in Advanced Virgo and Advanced LIGO, which will enable about a factor-two increase in sensitivity, gaining in detection volume too, but we are reaching the limits of the infrastructure hosting the detectors, and it is clear that at a certain point these will strongly limit the progress you can make by installing new technologies,” explains Michele Punturo of INFN Perugia, who is co-chair of the international committee preparing the ET proposal. “The ET idea and its starting point is to have a new infrastructure capable of hosting further and further evolutions of the detectors for decades.”

Belgian, Dutch and German universities are investing heavily in the ETpathfinder project, which is also funded by European Union budgets for interregional development, and are considering a bid for the ET in the flowing green hills of the border region around Vaals between Maastricht (Netherlands) and Liège (Belgium). A geological study in September 2019 concluded that the area has a soft-soil top layer that provides very good environmental noise isolation for a detector built in granite-like layers 200 m below. Economic studies also show a net benefit, both regional and national, from the high-tech infrastructure the ET would need. But even if ET is not built there, ETpathfinder will still be essential to future gravitational-wave detection, stresses project leader Stefan Hild of Maastricht University. “This will become the testing ground for the disruptive technologies we will need in this field anyway,” he says.

ET in search of home

ETpathfinder is a research infrastructure, not a scale model of the future ET. Given its short arms, it is not intended to detect gravitational waves itself. The L-shaped apparatus (“Triangulating for the future” image) has two arms about 20 m long, with two large steel suspension towers, each containing large mirrors. The arms meet in a central fifth steel optical tower, and one of the tubes extends behind the central tower, ending in a sixth tower. The whole facility will be housed in a new climate-controlled clean room inside the hall, on a new low-vibration concrete floor. ETpathfinder is not a single interferometer but two separate research facilities joined at one point to share instrumentation and support systems. The two arms can be used to test different mirrors, suspensions, temperatures or laser frequencies independently. These are the parameters Hild and his team are focusing on to further reduce noise in the interferometers and enhance their sensitivity.

Deep-cooling the mirrors is one way to beat noise, says Hild, but it brings huge new challenges. One is that the thermal conductivity of silica glass is poor at deep cryogenic temperatures, leading to deformations from local laser heating. Pure silicon must be used instead, but silicon is not transparent at the conventional 1064 nm laser wavelength used to detect gravitational waves and to align the optical systems in the detector. A whole new laser technology at 1550 nm will therefore have to be developed and tested, including fibre-laser sources, beam control and manipulation, and specialised low-noise sensors. “All these key technologies and more need testing before they can be scaled up to the 10 km scales of the future ET,” says Hild. Metre-sized mirrors of pure silicon have never been built, he points out, nor have silicon wire suspensions for extremely cold payloads of more than half a tonne. Optoelectronics and sensors operating at 1550 nm at the noise level required for gravitational-wave detectors are also non-standard.

On paper, the new super-low noise detection technologies to be investigated by ETpathfinder will provide stunning new ways of looking at the universe with the ET. The sensitivity at low frequencies will enable researchers to actually hear the rumblings of space–time hours before spiralling black holes or neutron stars coalesce and merge. Instead of astronomers struggling to point their telescopes at the point in the sky indicated by millisecond chirps in LIGO and Virgo, they will be poised to catch the light from cosmic collisions many billions of light years away.

Archimedes weighs in on the quantum vacuum

Archimedes’ beam-balance apparatus

The Archimedes experiment, which will be situated under 200 m of rock at the Sar-Grav laboratory in the Sos Enattos mine in Sardinia, was conceived in 2002 to investigate the interaction between the gravitational field and vacuum fluctuations. Supported by a group of about 25 physicists from Italian institutes and the European Gravitational Observatory, it is also intended as a “bridge” between present- and next-generation interferometers. A separate project in the Netherlands, ETpathfinder, is performing a similar function (see main text).

Quantum mechanics predicts that the vacuum is a sea of virtual particles that contribute an energy density – although one that is tens of orders of magnitude larger than what is observed. Archimedes will attempt to shed light on the puzzle by clarifying whether virtual photons gravitate or not, essentially testing the equivalent of Archimedes’ principle in vacuum. “If the virtual photons do gravitate then they must follow the gravitational field around the Earth,” explains principal investigator Enrico Calloni of the University of Naples Federico II. “If we imagine removing part of them from a certain volume, creating a bubble, there will be a lack of weight (and pressure differences) in that volume, and the bubble will sense a force directed upwards, similar to the Archimedes force in a fluid. Otherwise, if they do not gravitate, the bubble will not experience any variation in the force, even though it is immersed in the gravitational field.”

The experiment (pictured) will use a Casimir cavity comprising two metallic plates placed a short distance apart, so that virtual photons with too large a wavelength cannot survive and are expelled, enabling Archimedes to measure a variation in the “weight” of the quantum vacuum. Since the force is so tiny, the measurement must be modulated and performed at a frequency where noise is low, says Calloni. This will be achieved by modulating the vacuum energy contained in the cavity using plates made from a high-temperature superconductor, which transitions from a semiconducting to a superconducting state, altering the reflectivity of the plates. The first prototype is ready and in March the experiment is scheduled to begin six years of data-taking. “Archimedes is a sort of spin-off of Virgo, in the sense that it uses many of the technologies learned with Virgo: low frequency, sensors. And it has a lot of requirements in common with third-generation interferometers like ET: cryogenics and low seismic noise, first and foremost,” explains Calloni. “Being able to rely on an existing lab with the right infrastructure is a very strong asset for the choice of a site for ET.”
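For a sense of the scales involved, the ideal Casimir pressure between perfectly conducting parallel plates is P = π²ħc/(240 d⁴). The 100 nm separation in this sketch is an illustrative assumption, not a parameter of the Archimedes apparatus.

```python
import math

# Ideal Casimir pressure between perfectly conducting parallel plates,
# P = (pi^2 * hbar * c) / (240 * d^4), for an assumed 100 nm gap.
hbar = 1.054571817e-34  # reduced Planck constant, J s
c = 2.99792458e8        # speed of light, m/s
d = 100e-9              # plate separation in metres (illustrative)

pressure = math.pi**2 * hbar * c / (240 * d**4)
print(f"Casimir pressure at {d*1e9:.0f} nm: {pressure:.1f} Pa")  # ~13 Pa
```

The steep 1/d⁴ dependence is why the plates must be "extremely close" for the effect to be measurable at all.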

Sardinian adventure

The Sos Enattos mine is situated in the wild and mountainous heart of Sardinia, an hour’s drive from the Mediterranean coast. More than 2000 years ago, the Romans (who, having had a hard time conquering the land, christened the region “Barbaria”) excavated around 50 km of underground tunnels to extract lead for their aqueduct pipes. Until it ceased activity in 1996, the mine was for decades the only alternative to livestock-rearing in this area. Today, the locals are hoping that Sos Enattos will be chosen as the site to host the ET. Since 2010, several underground measurement campaigns have been carried out to characterise the site in terms of environmental noise. The regional government of Sardinia is supporting the development of the “Sar-Grav” underground laboratory and its infrastructure with approximately €3.5 million, while the Italian government is supporting the upgrade of Advanced Virgo and the characterisation of the Sos Enattos site with about €17 million, as part of a strategy to make Sardinia a possible site for the ET.

Sar-Grav’s control room was completed late last year, and its first experiment – Archimedes – will soon begin (see “Archimedes weighs in on the quantum vacuum” panel), with others expected to follow. Archimedes will measure the effect of quantum interactions with gravity via the Casimir effect and, at the same time, provide a testbed to verify the technologies needed by a third-generation gravitational-wave interferometer such as the ET. “Archimedes has the same requirements as an underground interferometer: extreme silence, extreme cooling with liquid nitrogen, and the ensuing safety requirements,” explains Domenico D’Urso, a physicist from the University of Sassari and INFN.

Follow the noise

Sardinia is the oldest land in Italy and the only part of the country without significant seismic risk. The island also has a very low population density and thus low human activity. The Sos Enattos mine has very low seismic noise and highly resistant granitic rock, which was used until the 1980s to build the skyscrapers of Manhattan. Walking along the mine’s underground tunnels – past the Archimedes cavern, amidst veins of schist, quartz, gypsum and granite, ancient mining machines and giant portraits of miners bearing witness to a glorious past – an array of instruments can be seen measuring seismic noise, some of them so sensitive that they can record the sound of waves washing against the shores of the Tyrrhenian Sea. “We are talking about really small sensitivities,” continues D’Urso. “An interferometer needs to be able to perform measurements at the level of 10⁻²¹, otherwise you cannot detect a gravitational wave. You have to know exactly what your system is doing, follow the noise and learn how to remove it.”

With the Einstein Telescope, we have 50 years of history ahead

The open European ET collaboration will spend the next two years characterising both the Sardinian and Dutch sites, and then choose which best matches the required parameters. In the current schedule, a technical design report for the ET would be completed in 2025 and, if approved, construction would take place from 2026 with first data-taking during the 2030s. “As of then, wherever it is built, ET will be our facility for decades, because its noise will be so low that any new technology that at present we cannot even imagine could be implemented and not be limited,” says Punturo, emphasising the scientific step-change. Current detectors can see black-hole mergers occurring at a redshift of around one, when the universe was six billion years old, Punturo explains, while the same detectors at their final sensitivity will achieve a redshift of around two, corresponding to three billion years after the Big Bang. “But we want to observe the universe in its dark age, before stars existed. To do so, we need to increase the sensitivity in redshift tenfold and more,” he says. “With ET, we have 50 years of history ahead. It will study events from the entire universe. Gravitational waves will become a common tool, just like conventional astronomy has been for the past four centuries.”

LHC at 10: the physics legacy

Ten years have passed since the first high-energy proton–proton collisions took place at the Large Hadron Collider (LHC). Almost 20 more are foreseen for the completion of the full LHC programme. The data collected so far, from approximately 150 fb⁻¹ of integrated luminosity over two runs (Run 1 at centre-of-mass energies of 7 and 8 TeV, and Run 2 at 13 TeV), represent a mere 5% of the anticipated 3000 fb⁻¹ that will eventually be recorded. But already their impact has been monumental.
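As a rough illustration of what such a dataset contains, the expected event count for a process is N = σ × ∫L dt. The ~55 pb total Higgs production cross-section at 13 TeV used below is an assumed round number for the sketch, not a figure from the article.

```python
# Order-of-magnitude event count: N = sigma * integrated luminosity.
sigma_higgs_pb = 55.0        # assumed total Higgs cross-section at 13 TeV, pb
lumi_fb = 150.0              # integrated luminosity from the text, fb^-1
lumi_pb = lumi_fb * 1_000.0  # 1 fb^-1 = 1000 pb^-1

n_higgs = sigma_higgs_pb * lumi_pb
print(f"Higgs bosons produced: ~{n_higgs:.1e}")  # ~8e6
```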

In search of the Higgs boson

Three major conclusions can be drawn from these first 10 years. First and foremost, Run 1 has shown that the Higgs boson – the previously missing, last ingredient of the Standard Model (SM) – exists. Secondly, the exploration of energy scales as high as several TeV has further consolidated the robustness of the SM, providing no compelling evidence for phenomena beyond the SM (BSM). Nevertheless, several discoveries of new phenomena within the SM have emerged, underscoring the power of the LHC to extend and deepen our understanding of SM dynamics, and showing the unparalleled diversity of phenomena that the LHC can probe with unprecedented precision.

Exceeding expectations

Last but not least, 10 years of LHC operations, data-taking and data interpretation have overwhelmingly surpassed our most optimistic expectations. The accelerator has delivered a larger-than-expected luminosity, and the experiments have operated at the top of their ideal performance and efficiency. Computing, in particular via the Worldwide LHC Computing Grid, has been another crucial driver of the LHC’s success. Key ingredients of precision measurements, such as the determination of the LHC luminosity, and of detection efficiencies and backgrounds using data-driven techniques, have been obtained to a precision beyond anyone’s expectations, thanks to novel and powerful techniques. The LHC has also successfully provided a variety of beam and optics configurations, matching the needs of different experiments and supporting a broad research programme. In addition to the core high-energy goals of the ATLAS and CMS experiments, this has enabled new studies of flavour physics and hadron spectroscopy, of forward-particle production and of total hadronic cross sections. Operations with beams of heavy nuclei have reached a degree of virtuosity that made it possible to collide not only the anticipated lead beams, but also beams of xenon, as well as combined proton–lead, photon–lead and photon–photon collisions, opening the way to a new generation of studies of matter at high density.

Figure 1

Theoretical calculations have evolved in parallel to the experimental progress. Calculations that were deemed of impossible complexity before the start of the LHC have matured and become reality. Next-to-leading-order (NLO) theoretical predictions are routinely used by the experiments, thanks to a new generation of automatic tools. The next frontier, next-to-next-to-leading order (NNLO), has been attained for many important processes, reaching, in a few cases, the next-to-next-to-next-to-leading order (N3LO), and more is coming.

Aside from having made these first 10 years an unconditional success, all these ingredients are the premise for confident extrapolations of the physics reach of the LHC programme to come.

To date, more than 2700 peer-reviewed physics papers have been published by the seven running LHC experiments (ALICE, ATLAS, CMS, LHCb, LHCf, MoEDAL and TOTEM). Approximately 10% of these are related to the Higgs boson, and 30% to searches for BSM phenomena. The remaining 1600 or so report measurements of SM particles and interactions, enriching our knowledge of the proton structure and of the dynamics of strong interactions, of electroweak (EW) interactions, of flavour properties, and more. In most cases, the variety, depth and precision of these measurements surpass those obtained by previous experiments using dedicated facilities. The multi-purpose nature of the LHC complex is unique, and encompasses scores of independent research directions. Here it is only possible to highlight a fraction of the milestone results from the LHC’s expedition so far.

Entering the Higgs world

The discovery by ATLAS and CMS of a new scalar boson in July 2012, just two years into LHC physics operations, was a crowning early success. Not only did it mark the end of a decades-long search, but it opened a new vista of exploration. At the time of the discovery, very little was known about the properties and interactions of the new boson. Eight years on, the picture has come into much sharper focus.

The structure of the Higgs-boson interactions revealed by the LHC experiments is still incomplete. Its couplings to the gauge bosons (W, Z, photon and gluons) and to the heavy third-generation fermions (bottom and top quarks, and tau leptons) have been detected, and the precision of these measurements is at best in the range of 5–10%. But the LHC findings so far have been key to establishing that this new particle correctly embodies the main observational properties of the Higgs boson, as specified by the Brout–Englert–Guralnik–Hagen–Higgs–Kibble EW-symmetry-breaking mechanism, referred to hereafter as “BEH”, a cornerstone of the SM. To start with, the measured couplings to the W and Z bosons reflect the Higgs’ EW charges and are proportional to the W and Z masses, consistent with the properties of a scalar field breaking the SM EW symmetry. The mass dependence of the Higgs interactions with the SM fermions is confirmed by the recent ATLAS and CMS observations of the H → bb and H → ττ decays, and of the associated production of a Higgs boson together with a tt quark pair (see figure 1).
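The mass proportionality described above can be sketched numerically: in the SM the Yukawa coupling of a fermion is y_f = √2 m_f/v, with vacuum expectation value v ≈ 246 GeV. This is the linear trend the ATLAS and CMS coupling fits test; the approximate fermion masses below are added for illustration, not taken from the text.

```python
import math

# SM Yukawa couplings, y_f = sqrt(2) * m_f / v, with v ~ 246 GeV.
# Masses are approximate values in GeV (illustrative).
v = 246.0
masses_gev = {"top": 172.5, "bottom": 4.18, "tau": 1.777, "muon": 0.106}

couplings = {f: math.sqrt(2) * m / v for f, m in masses_gev.items()}
for fermion, y in couplings.items():
    print(f"{fermion:>6}: y = {y:.2e}")
```

The top coupling comes out close to one, while the muon's is smaller by more than three orders of magnitude, which is why the light-fermion couplings remain untested.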

Figure 2

These measurements, which during Run 2 of the LHC have surpassed the five-sigma confidence level, provide the second critical confirmation that the Higgs fulfils the role envisaged by the BEH mechanism. The Higgs couplings to the photon and the gluon (g), which the LHC experiments have probed via the H → γγ decay and the gg → H production, provide a third, subtler test. These couplings arise from a combination of loop-level interactions with several SM particles, whose interplay could be modified by the presence of BSM particles or interactions. The current agreement with data provides a strong validation of the SM scenario, while leaving open the possibility that small deviations could emerge from future higher statistics.

The process of firmly establishing the identification of the particle discovered in 2012 with the Higgs boson goes hand-in-hand with two research directions pioneered by the LHC: seeking the deep origin of the Higgs field and using the Higgs boson as a probe of BSM phenomena.

The breaking of the EW symmetry is a fact of nature, requiring the existence of a mechanism like BEH. But, if we aim beyond a merely anthropic justification for this mechanism (i.e. that, without it, physicists wouldn’t be here to ask why), there is no reason to assume that nature chose its minimal implementation, namely the SM Higgs field. In other words: where does the Higgs boson detected at the LHC come from? This summarises many questions raised by the possibility that the Higgs boson is not just “put in by hand” in the SM, but emerges from a larger sector of new particles, whose dynamics induces the breaking of the EW symmetry. Is the Higgs elementary, or a composite state resulting from new confining forces? What generates its mass and self-interaction? More generally, is the existence of the Higgs boson related to other mysteries, such as the origin of dark matter (DM), of neutrino masses or of flavour phenomena?

The Higgs boson is becoming an increasingly powerful exploratory tool to probe the origin of the Higgs itself

Ever since the Higgs-boson discovery, the LHC experiments have been searching for clues to address these questions, exploring a large number of observables. All of the dominant production channels (gg fusion, associated production with vector bosons and with top quarks, and vector-boson fusion) have been discovered, and decay rates to WW, ZZ, γγ, bb and ττ were measured. A theoretical framework (effective field theory, EFT) has been developed to interpret in a global fashion all these measurements, setting strong constraints on possible deviations from the SM. With the larger data set accumulated during Run 2, the production properties of the Higgs have been studied with greater detail, simultaneously testing the accuracy of theoretical calculations, and the resilience of SM predictions.

Figure 3

To explore the nature of the Higgs boson, what has not been seen can be as important as what has. For example, the lack of evidence for Higgs decays to the fermions of the first and second generation is consistent with the SM prediction that these should be very rare. The H → μμ decay rate is expected to be about 3 × 10⁻³ times that of H → ττ; the current sensitivity is a factor of two short, and ATLAS and CMS hope to observe this decay during the forthcoming Run 3, testing for the first time the couplings of the Higgs boson to second-generation fermions. The SM Higgs boson is expected to conserve flavour, making decays such as H → μτ, H → eτ or t → Hc too small to be seen. Their observation would be a major revolution in physics, but no evidence has shown up in the data so far. Decays of the Higgs to invisible particles could be a signal of DM candidates, and the constraints set by the LHC experiments are complementary to those from standard DM searches. Several BSM theories predict the existence of heavy particles decaying to a Higgs boson. For example, heavy top partners, T, could decay as T → Ht, and heavy bosons X as X → HV (V = W, Z). Heavy scalar partners of the Higgs, such as charged Higgs states, are expected in theories such as supersymmetry. Extensive and thorough searches for all these phenomena have been carried out, setting strong constraints on SM extensions.
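That 3 × 10⁻³ suppression follows directly from the mass-squared scaling of the Higgs couplings; a quick check under that assumption (approximate lepton masses, phase-space corrections ignored):

```python
# Since Higgs couplings scale with mass, the H -> mumu partial width is
# suppressed relative to H -> tautau by roughly (m_mu / m_tau)^2.
m_mu_gev = 0.10566   # approximate muon mass, GeV
m_tau_gev = 1.77686  # approximate tau mass, GeV

suppression = (m_mu_gev / m_tau_gev) ** 2
print(f"Gamma(H->mumu)/Gamma(H->tautau) ~ {suppression:.1e}")  # ~3.5e-3
```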

As the programme of characterising the Higgs properties continues, with new challenging goals such as the measurement of the Higgs self-coupling through the observation of Higgs pair production, the Higgs boson is becoming an increasingly powerful exploratory tool to probe the origin of the Higgs itself, as well as a variety of solutions to other mysteries of particle physics.

Interactions weak and strong

The vast majority of LHC processes are controlled by strong interactions, described by the quantum-chromodynamics (QCD) sector of the SM. The predictions of production rates for particles like the Higgs or gauge bosons, top quarks or BSM states rely on our understanding of the proton structure, in particular of the energy distribution of its quark and gluon components (the parton distribution functions, PDFs). The evolution of the final states, the internal structure of the jets emerging from quarks and gluons, and the kinematical correlations between different objects are all governed by QCD. LHC measurements have been critical not only to consolidate our understanding of QCD in all its dynamical domains, but also to improve the precision of the theoretical interpretation of data, and to increase the sensitivity to new phenomena and to the production of BSM particles.

Collisions galore

Approximately 10⁹ proton–proton (pp) collisions take place each second inside the LHC detectors. Most of them bear no obvious direct interest for the search for BSM phenomena, but even simple elastic collisions, pp → pp, which account for about 30% of this rate, are still not fully understood from first-principles QCD calculations. The ATLAS ALFA spectrometer and the TOTEM detector have studied these high-rate processes, measuring the total and elastic pp cross sections at the various beam energies provided by the LHC. The energy dependence of the relation between the real and imaginary parts of the pp forward-scattering amplitude has revealed new features, possibly described by the exchange of the so-called odderon, a coherent state of three gluons predicted in the 1970s.

Figure 4

The structure of the final states in generic pp collisions, aside from defining the large background of particles that are superimposed on the rarer LHC processes, is of potential interest to understand cosmic-ray (CR) interactions in the atmosphere. The LHCf detector measured the forward production of the most energetic particles from the collision, those driving the development of the CR air showers. These data are a unique benchmark to tune the CR event generators, reducing the systematics in the determination of the nature of the highest-energy CR constituents (protons or heavy nuclei?), a step towards solving the puzzle of their origin.

On the opposite end of the spectrum, rare events with dijet pairs of mass up to 9 TeV have been observed by ATLAS and CMS. The study of their angular distribution, a Rutherford-like scattering experiment, has confirmed the point-like nature of quarks down to 10⁻¹⁸ cm. The overall set of production studies, including gauge bosons, jets and top quarks, underpins countless analyses. Huge samples of top-quark pairs, produced at 15 Hz, enable the surgical scrutiny of this mysteriously heavy quark through its production and decays. New reactions, unobservable before the LHC, were detected for the first time. Gauge-boson scattering (e.g. W⁺W⁺ → W⁺W⁺), a key probe of electroweak symmetry breaking proposed in the 1970s, is just one example. By and large, all data show an extraordinary agreement with theoretical predictions resulting from decades of innovative work (figure 2). Global fits to these data refine the proton PDFs, improving the predictions for the production of Higgs bosons or BSM particles.

The cross sections σ of W and Z bosons provide the most precise QCD measurements, reaching a 2% systematic uncertainty, dominated by the luminosity uncertainty. Ratios such as σ(W⁺)/σ(W⁻) or σ(W)/σ(Z), and the shapes of differential distributions, are known to a few parts in 1000. These data challenge the accuracy of theoretical calculations, and caution is required to assess whether small discrepancies are due to PDF effects, new physics or still-imprecise QCD calculations.

Precision is the keystone to consolidate our description of nature

As already mentioned, the success of the LHC owes a lot to its variety of beam and experimental conditions. In this context, the data at the different centre-of-mass energies provided in the two runs are a huge bonus, since the theoretical prediction for the energy-dependence of rates can be used to improve the PDF extraction, or to assess possible BSM interpretations. The LHCb data, furthermore, cover a forward kinematical region complementary to that of ATLAS and CMS, adding precious information.

The precise determination of the W and Z production and decay kinematics has also allowed new measurements of fundamental parameters of the weak interaction: the W mass (mW) and the weak mixing angle (sinθW). The measurement of sinθW is now approaching the precision inherited from the LEP experiments and SLD, and will soon improve to shed light on the outstanding discrepancy between those two measurements. The mW precision obtained by the ATLAS experiment, ΔmW = 19 MeV, is the best worldwide, and further improvements are certain. The combination with the ATLAS and CMS measurements of the Higgs boson mass (ΔmH ≅ 200 MeV) and of the top quark mass (Δmtop ≲ 500 MeV), provides a strong validation of the SM predictions (see figure 3). For both mW and sinθW the limiting source of systematic uncertainty is the knowledge of the PDFs, which future data will improve, underscoring the profound interplay among the different components of the LHC programme.

QCD matters

The understanding of the forms and phases that QCD matter can acquire is a fascinating, broad and theoretically challenging research topic, which has witnessed great progress in recent years. Exotic multi-quark bound states, beyond the usual mesons (qq̄) and baryons (qqq), were initially discovered at e⁺e⁻ colliders. The LHCb experiment, with its large rates of identified charm and bottom final states, is at the forefront of these studies, notably with the first discovery of heavy pentaquarks (qqqcc̄) and with discoveries of tetraquark candidates in the charm sector (qcc̄q̄), accompanied by determinations of their quantum numbers and properties. These findings have opened a new playground for theoretical research, stimulating work in lattice QCD and forcing a rethinking of established lore.

Figure 5

The study of QCD matter at high density is the core task of the heavy-ion programme. While initially tailored to the ALICE experiment, all active LHC experiments have since joined the effort. The creation of a quark–gluon plasma (QGP) led to astonishing visual evidence for jet quenching, with 1 TeV jets shattered into fragments as they struggle their way out of the dense QGP volume. The thermodynamics and fluctuations of the QGP have been probed in multiple ways, indicating that the QGP behaves as an almost perfect fluid, the least viscous fluid known in nature. The ability to explore the plasma interactions of charm and bottom quarks is a unique asset of the LHC, thanks to the large production rates, which unveiled new phenomena such as the recombination of charm quarks and the sequential melting of bb̄ bound states.

While several of the qualitative features of high-density QCD were anticipated, the quantitative accuracy, multitude and range of the LHC measurements are unmatched. Examples include ALICE’s precise determination of dynamical parameters such as the QGP shear-viscosity-to-entropy-density ratio, and of the higher harmonics of particles’ azimuthal correlations. A revolution ensued in the sophistication of the required theoretical modelling. Surprises were also discovered, particularly in the comparison of high-density states in PbPb collisions with those occasionally generated by smaller systems such as pp and pPb. The presence in the latter of long-range correlations, various collective phenomena and an increased strange-baryon abundance (figure 4) resembles behaviour typical of the QGP. Its deep origin is a mysterious property of QCD, still lacking an explanation. The number of new challenging questions raised by the LHC data is almost as large as the number of new answers obtained!

Flavour physics

Understanding the structure and origin of flavour phenomena in the quark sector is one of the big open challenges of particle physics. The search for new sources of CP violation, beyond those present in the CKM mixing matrix, underlies the efforts to explain the baryon asymmetry of the universe. In addition to flavour studies with Higgs bosons and top quarks, more than 10¹⁴ charm and bottom quarks have been produced so far by the LHC, and the recorded subset has led to landmark discoveries and measurements. The rare Bs→ μμ decay, with a minuscule branching ratio of approximately 3 × 10⁻⁹, has been discovered by the LHCb, CMS and ATLAS experiments. The rarer Bd→ μμ decay is still unobserved, but its expected ~10⁻¹⁰ rate is within reach. These two results alone have strongly constrained the parameter space of several BSM theories, notably supersymmetry, and their precision and BSM sensitivity will continue to improve. LHCb has discovered D⁰–D̄⁰ mixing and the long-elusive CP violation in D-meson decays, a first for up-type quarks (figure 5). Large non-perturbative hadronic uncertainties make the interpretation of these results particularly challenging, leaving under debate whether the measured properties are consistent with the SM or signal new physics. Either way, the experimental findings are a textbook milestone in the worldwide flavour-physics programme.

Figure 6

LHCb has produced hundreds more measurements of heavy-hadron properties and flavour-mixing parameters. Examples include the most precise measurement of the CKM angle γ = (74.0 +5.0 −5.8)° and, with ATLAS and CMS, the first measurement of φs, the tiny CP-violating phase of Bs → J/ψφ, whose precisely predicted SM value is very sensitive to new physics. With a few notable exceptions, all results confirm the CKM picture of flavour phenomena. Those exceptions, however, underscore the power of LHC data to expose unexpected phenomena: B → D(*)ℓν (ℓ = μ, τ) and B → K(*)ℓ+ℓ− (ℓ = e, μ) decays hint at possible deviations from the expected lepton-flavour universality. The community eagerly awaits further developments.

Beyond the Standard Model

Years of model building, stimulated before and after the LHC start-up by the conceptual and experimental shortcomings of the SM (e.g. the hierarchy problem and the existence of DM), have generated scores of BSM scenarios to be tested at the LHC. Evidence has so far escaped hundreds of dedicated searches, which set limits on new particles up to several TeV (figure 6). Nevertheless, much has been learned. While none of the proposed BSM scenarios can be conclusively ruled out, many of them survive only at the cost of greater fine-tuning of their parameters, reducing their appeal. This, in turn, has led to a rethinking of the principles that implicitly guided model building: simplicity, and the ability to explain several open problems at once, have lost some of their drive. The simplest supersymmetric BSM models, for example, were candidates to solve the hierarchy problem, provide DM candidates and set the stage for the grand unification of all forces, all at once; were they realised in nature, the LHC should have piled up evidence by now. Supersymmetry remains a preferred candidate to achieve all this, but at the price of more Byzantine constructions. Solving the hierarchy problem remains the outstanding theoretical challenge. New ideas have come to the forefront, ranging from the Higgs potential being determined by the early-universe evolution of an axion field, to dark sectors connected to the SM via a Higgs portal. These latter scenarios could also provide DM candidates alternative to the weakly interacting massive particles that have so far eluded searches at the LHC and elsewhere.

With such rapid evolution of theoretical ideas taking place as the LHC data runs progressed, the experimental analyses underwent a major shift, relying on “simplified models”: a novel model-independent way to represent the results of searches, allowing published results to be later reinterpreted in view of new BSM models. This amplified the impact of experimental searches, with a surge of phenomenological activity and the proliferation of new ideas. The cooperation and synergy between experiments and theorists have never been so intense.

Having explored the more obvious search channels, the LHC experiments have refocused on more elusive signatures. Great efforts are now invested in searching in corners of parameter space and in extracting possible subtle signals from large backgrounds, thanks to data-driven techniques and to the more reliable theoretical modelling that has emerged from new calculations and many SM measurements. The possible existence of new long-lived particles has opened a new frontier of search techniques and of BSM models, triggering proposals for new dedicated detectors (MATHUSLA, CODEX-b and FASER, the last of which was recently approved for construction and operation in Run 3). Exotic BSM states, such as the millicharged particles present in some theories of dark sectors, could be revealed by milliQan, a recently proposed detector. Highly ionising particles, like the esoteric magnetic monopoles, have been searched for by the MoEDAL detector, which cleverly places plastic tracking films in the LHCb detector hall.

While new physics is still eluding the LHC, the immense progress of the past 10 years has changed forever our perspective on searches and on BSM model building.

Final considerations

Most of the results cited here only in passing, like the precision on the top-quark mass, and others not quoted at all, are the outcome of hundreds of person-years of work, and would certainly have deserved more attention. Their intrinsic value goes well beyond what was outlined, and they will remain long-lasting textbook material until future work at the LHC and beyond improves on them.

Theoretical progress has played a key role in the LHC’s progress, enhancing the scope and reliability of the data interpretation. Further to the developments already mentioned, a deeper understanding of jet structure has spawned techniques to tag high-pT gauge and Higgs bosons, or top quarks, now indispensable in many BSM searches. Innovative machine-learning ideas have become powerful and ubiquitous. This article has concentrated only on what has already been achieved, but the LHC and its experiments have a long journey of exploration ahead.

The terms precision and discovery, applied to concrete results rather than projections, aptly characterise the LHC’s 10-year legacy. Precision is the keystone to consolidating our description of nature, increasing sensitivity to SM deviations, lending credibility to discovery claims, and constraining models when evaluating different microscopic origins of possible anomalies. The LHC has already fully succeeded in these goals. It has also proven to be a discovery machine, in a context broader than just Higgs and BSM phenomena. Altogether, it has delivered results that could not have been obtained otherwise, immensely enriching our understanding of nature.

A labour of love

The CMS detector

Two detectors, both alike in dignity, sit 100 m underground and 8 km apart on opposite sides of the border between Switzerland and France. Different and complementary in their designs, they stand ready for anything nature might throw at them, and over the past 10 years physicists in the ATLAS and CMS collaborations have matched each other paper for paper, blazing a path into the unknown. And this is only half of the story. A few kilometres around the ring either way sit the LHCb and ALICE experiments, continually breaking new ground in the physics of flavour and colour.

Plans hatched when the ATLAS and CMS collaborations formed in the spring of 1992 began to come to fruition in the mid-2000s. While liquid-argon and tile calorimeters lit up in ATLAS’s cavern, cosmic rays careened through partially assembled segments of each layer of the CMS detector, which was beginning to be integrated at the surface. “It was terrific, we were taking cosmics and everybody else was still in pieces!” says Austin Ball, who has been technical coordinator of CMS for the entire 10-year running period of the LHC so far. “The early cosmic run with magnetic field was a byproduct of our design, which stakes everything on a single extraordinary solenoid,” he explains, describing how the uniquely compact and modular detector was later lowered into its cavern in enormous chunks. At the same time, the colossal ATLAS experiment was growing deep underground, soon to be enveloped by the magnetic field generated by its ambitious system of eight air-core superconducting barrel loops, two end-caps and an inner solenoid. A thrilling moment for both experiments came on 10 September 2008, when protons first splashed off beam stoppers and across the detectors in a flurry of tracks. Ludovico Pontecorvo, ATLAS’s technical coordinator since 2015, remembers “first beam day” as a new beginning. “It was absolutely stunning,” he says. “There were hundreds of people in the control room. It was the birth of the detector.” But the mood was fleeting. On 19 September a faulty electrical connection in the LHC caused a hundred or so magnets to quench, and six tonnes of liquid helium to escape into the tunnel, knocking the LHC out for more than a year.

You have this monster and suddenly it turns into this?

Werner Riegler

The experimentalists didn’t waste a moment. “We would have had a whole series of problems if we hadn’t had that extra time,” says Ball. The collaborations fixed niggling issues, installed missing detector parts and automated operations to ease pressure on the experts. “Those were great days,” agrees Richard Jacobsson, commissioning and run coordinator of the LHCb experiment from 2008 to 2015. “We ate pizza, stayed up nights and slept in the car. In the end I installed a control monitor at home, visible from the kitchen, the living room and the dining room, with four screens – a convenient way to avoid going to the pit every time there was a problem!” The hard work paid off as the detectors came to life once again. For ALICE, the iconic moment was the first low-energy collisions in December 2009. “We were installing the detector for 10 years, and then suddenly you see these tracks on the event display…” reminisces Werner Riegler, longtime technical coordinator for the collaboration. “I bet then-spokesperson Jürgen Schukraft three bottles of Talisker whisky that they couldn’t possibly be real. You have this monster and suddenly it turns into this? Everybody was cheering. I lost the bet.”

The first high-energy collisions took place on 30 March 2010, at a centre-of-mass energy of 7 TeV, three-and-a-half times higher than the Tevatron, and a leap into terra incognita, in the words of ATLAS’s Pontecorvo. The next signal moment came on 8 November with the first heavy-ion collisions, and almost immediate insights into the quark–gluon plasma.

ALICE in wonderland

For a few weeks each year, the LHC ditches its signature proton collisions at the energy frontier to collide heavy ions such as lead nuclei, creating globules of quark–gluon plasma in the heart of the detectors. For the past 10 years, ALICE has been the best-equipped detector in the world to record the myriad tracks that spring from these hot and dense collisions of up to 416 nucleons at a time.

ALICE’s magnet

Like LHCb, ALICE is installed in a cavern that previously housed a LEP detector – in ALICE’s case the L3 experiment. Its tracking and particle-identification subdetectors are mostly housed within that detector’s magnet, fixed in place and still going strong since 1989, the only worry a milliamp leakage current, present since L3 days, which shifters monitor watchfully. Its relatively low field is not a limitation, as ALICE’s specialist subject is low-momentum tracks – a specialty made possible by displacing the beams at the interaction point to suppress the luminosity. “The fact that we have a much lower radiation load than ATLAS, CMS and LHCb allows us to use technologies that are very good for low-momentum measurements, which the other experiments cannot use because their radiation-hardness requirements are much higher,” says Riegler, noting that the design of ALICE requires less power, less cooling and a lower material budget. “This also presents an additional challenge in data processing and analysis in terms of reconstructing all these low-momentum particles, whereas for the other experiments, this is background that you can cut away.” The star performer in ALICE has been the time-projection chamber (TPC), he says, describing a detector capable of reconstructing the 8000 tracks per unit of rapidity that were forecast when the detector was designed.

But nature had a surprise in store when the LHC began running with heavy ions. The number of tracks produced was a factor of three lower than expected, allowing ALICE to push the TPC to higher rates and collect more data. By the end of Run 2, a detector designed to collect “minimum-bias” events at 50 Hz was able to operate at 1 kHz – a factor of 20 beyond the initial design.

The discovery of jet quenching came simply by looking at event displays in the control room

Ludovico Pontecorvo

The lower-than-expected track multiplicities also had a wider effect among the LHC experiments, making ATLAS, CMS and LHCb highly competitive for certain heavy-ion measurements, and creating a dynamic atmosphere in which insights into the quark–gluon plasma came thick and fast. Even independently of the less-taxing-than-expected tracking requirements, top-notch calorimetry allowed immediate insights. “The discovery of jet quenching came simply by looking at event displays in the control room,” confirms Pontecorvo of ATLAS. “You would see a big jet that wasn’t counterbalanced on the other side of the detector. This excitement was transmitted across the world.”

Keeping cool

Despite the exceptional and expectation-busting performance of the experiments, the first few years were testing times for the physicists and engineers tasked with keeping the detectors in rude health. “Every year we had some crisis in cooling the calorimeters,” recalls Pontecorvo. Fortunately, he says, ATLAS opted for “under-pressure” cooling, which prevents water spilling in the event of a leak, but still requires a big chunk of the calorimeter to be switched off. The collaboration had to carry out spectacular interventions, and put people in places that no one would have guessed would be possible, he says. “I remember crawling five metres on top of the end-cap calorimeter to arrive at the barrel calorimeter to search for a leak, and using 24 clamps to find which one of 12 cooling loops had the problem – a very awkward situation!” Ball recalls experiencing similar difficulties with CMS. There are 11,000 joints in the copper circuits of the CMS cooling system, and a leak in any one is enough to cause a serious problem. “The first we encountered leaked into the high-voltage system of the muon chambers, down into the vacuum tank containing the solenoid, right through the detector, which like the LHC itself is on a slope, and out the end as a small waterfall,” says Ball.

The ATLAS cavern

The arresting modularity of CMS, and the relative ease of opening the detector – admittedly an odd way to describe sliding a 1500-tonne object along the axis of a 0.8 mm thick beam pipe – proved to be the solution to many problems. “We have exploited it relentlessly from day one,” says Ball. “The ability to access the pixel tracker, which is really the heart of CMS, with the highest density of sensitive channels, was absolutely vital – crucial for repairing faults as well as radiation damage. Over the course of five or six years we became very efficient at accessing it. The performance of the whole silicon tracking system has been outstanding.”

The early days were also challenging for LHCb, which is set up to reconstruct the decays of beauty hadrons in detail. The dawning realisation that the LHC would run optimally with fewer but brighter proton bunches than originally envisaged set stern tests from the start. From LHCb’s conception to first running, all of the collaboration’s discussions were based on the assumption that the detector would veto any crossing of protons where there would be more than one interaction. In the end, faced with a typical “pile-up” of three, the collaboration had to reschedule its physics priorities and make pragmatic decisions about the division of bandwidth in the high-level trigger. “We were faced with enormous problems: synchronisation crashes, event processing that was taking seconds and getting stuck…,” recalls Jacobsson. “Some run numbers, such as 1179, still send shivers down the back of my spine.” By September, however, they had demonstrated that LHCb was capable of running with much higher pile-up than anybody had thought possible.

No machine has ever been so stable in its operational mode

Rolf Lindner

Necessity was the mother of invention. In 2011 and 2012 LHCb introduced a feedback system that maintains a manageable luminosity during each fill by increasing the overlap between the colliding beams as protons “burn out” in collisions, and the brightness of the bunches decreases. When Jacobsson and his colleagues mentioned it to the CERN management in September 2010, the then director of accelerators, Steve Myers, read the riot act, warning of risks to beam stability, recalls Jacobsson. “But since I had a few good friends at the controls of the LHC, we could carefully and quietly test this, and show that it produced stable beams. This changed life on LHCb completely. The effect was that we would have one stable condition throughout every fill for the whole year – perfect for precision physics.”
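The leveling trick described above has a simple geometry behind it: for two round Gaussian beams offset transversely by a distance d, the luminosity falls relative to head-on roughly as exp(−d²/4σ²), so a feedback loop can compute the offset that holds the luminosity at a target value and shrink it as the bunches burn off. A minimal sketch of that calculation, with an assumed illustrative beam size (not LHCb's actual operating parameters):

```python
import math

def separation_for_target(L_headon, L_target, sigma):
    """Transverse offset d (same units as sigma) that levels two round
    Gaussian beams to L_target, using L(d) = L0 * exp(-d^2 / (4 sigma^2))."""
    if L_target >= L_headon:
        return 0.0  # already at or below target: collide head-on
    return 2.0 * sigma * math.sqrt(math.log(L_headon / L_target))

# Illustrative numbers only: as protons burn off during a fill, the
# head-on luminosity L0 drops, the required offset shrinks, and the
# delivered luminosity stays constant until d reaches zero.
sigma = 20e-6  # assumed 20-micron beam size at the interaction point
for L0 in (10.0, 6.0, 4.0, 2.0):  # head-on luminosity, arbitrary units
    d = separation_for_target(L0, 4.0, sigma)
    print(f"L0 = {L0:4.1f}  ->  offset = {d * 1e6:5.1f} um")
```

The pay-off, as Jacobsson notes, is one stable condition throughout every fill.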

Initially, LHCb had planned to write events at 200 Hz, recalls Rolf Lindner, the experiment’s longtime technical coordinator, but by the end of Run 1 LHCb was collecting data at up to 10 kHz, turning offline storage, processing and “physics stripping” into an endless firefight. Squeezing every ounce of performance out of the LHC generated greater data volumes than any of the experiments had anticipated, and even stories (probably apocryphal) of shifters running down to local electronics stores to buy disks because they were running out of storage. “The LHC would run for several months with stable beams for 60% of every 24 hours in a day,” says Lindner. “No machine has ever been so stable in its operational mode.”

Engineering all-stars

The eyes of the world turned to ATLAS and CMS on 4 July 2012 as the collaborations announced the discovery of a new boson – an iconic moment validating countless hours of painstaking work by innumerable physicists, engineers and computer scientists, and yet representative of just one of a multitude of physics insights made possible by the LHC experiments (see LHC at 10: the physics legacy). The period running up to the euphoric Higgs discovery had been smooth for all except LHCb, which had to scramble to disprove unfounded suggestions that its dipole magnet, occasionally reversed in field to reduce systematic uncertainties, was causing beam instabilities. But new challenges would shortly follow. Chief among several hair-raising moments in CMS was the pollution of the magnet cryogenic system in 2015 and 2016, which caused instability in the detector’s cold box and threatened the reliable operation of the superconducting solenoid surrounding the tracker and calorimeters. The culprit turned out to be superfluous lubricant – a mere half a litre of oil, now in a bottle in Ball’s office – which clogged filters and tiny orifices crucial to the expansion cycle used to cool the helium. “By the time we caught on to it, we hadn’t just polluted the cold box, we had polluted the whole of the distribution from upstairs to downstairs,” he recalls, launching into a vivid account of seat-of-the-pants interventions, and also noting that the team turned their predicament into an opportunity. “With characteristic physics ingenuity, and faced with spoof versions of the CMS logo with straightened tracks, we exploited data with the magnet off to calibrate the calorimeters and understand a puzzling 750 GeV excess in the diphoton invariant mass distribution,” he says.

Now I look back on the cryogenic crisis as the best project I ever worked on at CERN

Austin Ball
LHCb’s dipole magnet

With resolute support from CERN, bold steps were taken to fix the problem. It transpired that slightly undersized replaceable filter cartridges were failing to remove the oil after it was mixed with the helium to lubricate screw-turbine compressors in the surface installation. “Now I look back on the cryogenic crisis as the best project I ever worked on at CERN, because we were allowed to assemble this cross-departmental superstar engineering team,” says Ball. “You could ask for anyone and get them. Cryogenics experts, chemists and mechanical engineers… even Rolf Heuer, then the Director-General, showed up frequently. The best welders basically lived in our underground area – you could normally only see their feet sticking out from massive pipework. If you looked carefully you might spot a boot. It’s a complete labyrinth. That one will stick with me for a long time. A crisis can be memorable and satisfying if you solve it.”

Heroic efforts

During the long shutdown that followed, the main task for LHCb was to exchange a section of beryllium beam pipe in which holes had been discovered and hastily varnished over before being used in Run 1. At the same time, right at the end of an ambitious and successful consolidation and improvement programme, CMS suffered the perils of extraordinarily dense circuit design when humid air condensed onto cold silicon sensor modules that had temporarily been moved to a surface clean room. Ten per cent of the pixels short-circuited when the modules were powered up again, and heroic efforts were needed to re-manufacture replacements and install them in time for the returning LHC beams. Meanwhile, wary of deteriorating optical readout, ATLAS refurbished their pixel-detector cabling, taking electronics out of the detector to make it serviceable and inserting a further inner pixel layer just 33 mm from the beam pipe to up their b-tagging game. The bigger problem was mechanical shearing of the bellows that connect the cryostat of one of the end-cap toroids to the vacuum system – the only problem experienced so far with ATLAS’s ambitious magnet system. “At the beginning people speculated that with eight superconducting coils, each independent from the others, we would experience one quench after another, but they have been perfect really,” confirms Pontecorvo. Combined with the 50-micron alignment of the 45 m-long muon detector, ATLAS has exceeded the design specifications for resolving the momentum of high-momentum muons – just one example of a pattern repeated across all the LHC detectors.

As the decade wore on, the experiments streamlined operations to reach unparalleled performance levels, and took full advantage of technical and end-of-year stops to keep their detectors healthy. Despite their very high-luminosity environments, ATLAS and CMS pushed already world-beating initial data-taking efficiencies of around 90% beyond the 95% mark. “ATLAS and CMS were designed to run with an average pile-up of 20, but are now running with a pile-up of 60. This is remarkable,” states Pontecorvo.
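The pile-up figures Pontecorvo quotes follow from simple arithmetic: the mean number of interactions per bunch crossing is the interaction rate (luminosity times inelastic cross section) divided by the bunch-crossing rate. A back-of-envelope check, using representative Run-2 numbers that are assumptions here rather than official machine settings:

```python
# Back-of-envelope pile-up estimate: mu = L * sigma_inel / (n_b * f_rev).
# All values below are assumed, representative Run-2 numbers.
L = 2.0e34           # instantaneous luminosity, cm^-2 s^-1
sigma_inel = 80e-27  # inelastic pp cross section at 13 TeV, cm^2 (~80 mb)
n_b = 2556           # colliding bunch pairs
f_rev = 11245.0      # LHC revolution frequency, Hz

mu = L * sigma_inel / (n_b * f_rev)
print(f"mean pile-up ~ {mu:.0f}")  # of the order of the quoted pile-up of 60
```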

Accelerator rising

At 10, with thousands of physics papers behind them and many more stories to tell, the LHC experiments are as busy as ever, using the second long shutdown, which is currently underway, to install upgrades, many of which are geared to the high-luminosity LHC (HL-LHC) due to operate later this decade. Many parts are being recycled, for example with ALICE’s top-performing TPC chambers donated to Fermilab for the near detector of the DUNE long-baseline neutrino-oscillation experiment. And major engineering challenges remain. A vivid example is that the LHC tunnel, carved out of water-laden rock 30 years ago, is rising up, while the experiments – particularly the very compact CMS, which has a density almost the same as rock – remain fixed in place, counterbalancing upthrust due to the removed rock with their weight. CMS faces the greatest challenge due to the geology of the region, explains Ball. “The LHC can use a corrector magnet to adjust the level of the beam, but there is a risk of running out of magnetic power if the shifts are big. Just a few weeks ago they connected a parallel underground structure for HL-LHC equipment, and the whole tunnel went up 3 mm almost overnight. We haven’t solved that one yet.”

Most of all, it is important to acknowledge the dedication of the people who run the experiments

Ludovico Pontecorvo

Everyone I interviewed agrees wholeheartedly on one crucial point. “Most of all, it is important to acknowledge the dedication of the people who run the experiments,” explains Pontecorvo of ATLAS, expressing a sentiment emphasised by his peers on all the experiments. “These people are absolutely stunning. They devote their lives to this work. This is something that we have to keep, and it is not easy to keep. Unfortunately, many feel that this work is undervalued by selection committees for academic positions. This is something that must change, or our work will finish – as simple as that.”

Pontecorvo hurries out of the door at the end of our early-morning interview, hastily squeezed into a punishing schedule. None of the physicists I interviewed show even a smidgen of complacency. Ten years in, the engineering and technological marvels that are the four biggest LHC experiments are just getting started.

Bang, beam, bump, boson

CERN Control Centre on 30 March 2010

The start-up of the LHC was an exciting time and the culmination of years of work, made manifest in the process of establishing circulating beams, ramping, squeezing and producing the first collisions. The two major events of the commissioning era were first circulating beams on 10 September 2008 and first high-energy collisions on 30 March 2010. For both of these events the CERN press office saw fit to invite the world’s media, set up satellite links, arrange numerous interviews and such. Combined with the background attention engendered by the LHC’s potential to produce miniature black holes and the LHC’s supporting role in the 2009 film Angels and Demons, the LHC enjoyed a huge amount of coverage, and in some sense became a global brand in the process (CERN Courier September 2018 p44).

The LHC is one of the biggest, most complex and most powerful instruments ever built. The large-scale deployment of the main two-in-one dipoles and quadrupoles cooled to 1.9 K by superfluid helium is unprecedented even in particle physics. Many unforeseen issues had to be dealt with in the period before start-up. A well-known example was that of the “collapsing fingers”. In the summer of 2007, experts realised that the metallic modules responsible for the electrical continuity between different vacuum pipe sections in the magnet interconnects could occasionally become distorted as the machine was warmed up. This distortion led to a physical obstruction of the beam pipe. The solution was surprisingly low-tech: to blow a ping-pong-sized ball fitted with a 40 MHz transmitter through the pipes and find out where it got stuck.

First turns

The commissioning effort was punctuated by the electrical incident that occurred during high-current tests on 19 September 2008, just nine days after the success of “first beam day”. Although the incident was a severe blow to CERN and the LHC community, it did provide a hiatus of which full use was made (see A labour of love). The LHC and experiments returned at “an unprecedented state of readiness” and beam was circulated again on 20 November 2009. Rapid progress followed. Collisions with stable beam conditions were quickly established at 450 GeV, and a ramp to the maximum beam energy at the time (1.18 TeV per beam, compared to the Tevatron’s 0.98 TeV) was successfully achieved on 30 November. All beam-based systems were at least partially commissioned, and LHC operators started to master the control of a hugely complex machine.

After the 2009 Christmas technical stop, which saw continued deployment of the upgraded quench-protection system that had been put in place following the 2008 incident, commissioning started again in the new year. Progress was rapid, with first colliding beams at 3.5 TeV being established on 30 March 2010. It was a tense day in the control room with the scheduled collisions delayed by two unsuccessful ramps and all under the watchful eye of the media. In the following days, squeeze-commissioning successfully reduced the β* parameter (which is related to the transverse size of the beam at the interaction points) to 2.0 m in ATLAS and CMS. Stable beams were declared, and the high-energy exploitation of the four main LHC experiments could begin in earnest.
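The squeeze is worth a number: the RMS beam size at the interaction point is σ* = √(ε β*), where the geometric emittance ε is the normalised emittance divided by the relativistic γ. A small sketch with assumed nominal-ish parameters (3.75 μm normalised emittance, β* = 2.0 m, 3.5 TeV per beam), giving a feel for the micron-scale spot sizes behind the quoted β* values:

```python
import math

def beam_size_at_ip(eps_n_m, beta_star_m, energy_gev, m_p_gev=0.938):
    """RMS transverse beam size at the interaction point:
    sigma* = sqrt(eps_geo * beta*), with geometric emittance
    eps_geo = eps_n / gamma (ultrarelativistic approximation)."""
    gamma = energy_gev / m_p_gev
    return math.sqrt(eps_n_m / gamma * beta_star_m)

# Assumed parameters for the 2010 squeeze described above -
# illustrative values, not the actual machine settings of the day.
sigma = beam_size_at_ip(eps_n_m=3.75e-6, beta_star_m=2.0, energy_gev=3500.0)
print(f"sigma* ~ {sigma * 1e6:.0f} um")  # a few tens of microns
```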

Tales from Run 1

Essentially, 2010 was devoted to commissioning, and then to establishing confidence in operational procedures and the machine-protection system, before starting the process of ramping up the number of bunches in the beam.

In June the decision was taken to go for bunches with the nominal population (1.15 × 10¹¹ protons), which involved another extended commissioning period. Up to this point, only around one fifth of the nominal bunch population had been used. To further increase the number of bunches, the move to bunch trains separated by 150 ns was made and crossing angles were brought in across the experiments’ insertion regions. This necessitated changes to the tertiary collimators and a number of ramps and squeezes. We then performed a carefully phased increase in total intensity. The proton run finished with beams of 368 bunches of around 1.2 × 10¹¹ protons per bunch and a peak luminosity of 2.1 × 10³² cm⁻²s⁻¹, followed by a successful four-week-long lead–lead ion run.

The initial 50 and 25 ns intensity ramp-up phase was tough going

In 2011 it was decided to keep the LHC beam energy at 3.5 TeV, and to operate with 50 ns bunch spacing – opening the way to significantly more bunches per beam. Following several weeks of commissioning, a staged ramp-up in the number of bunches took us to a maximum of 1380. Reducing the transverse size of the beams delivered by the injectors and gently increasing the bunch population resulted in a peak luminosity of 2.4 × 10³³ cm⁻²s⁻¹ and some healthy luminosity-delivery rates. Following a reduction in β* in ATLAS and CMS from 1.5 m to 1.0 m, and further gradual increases in bunch population, the LHC achieved a peak luminosity of 3.8 × 10³³ cm⁻²s⁻¹ – well beyond expectations at the start of the year – and delivered a total of around 5.6 fb⁻¹ to both ATLAS and CMS.
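The interplay between bunch number, bunch population, emittance and β* is captured by the standard round-beam luminosity formula, L = n_b N² f_rev γ F / (4π ε_n β*), with F a geometric reduction factor for the crossing angle. Plugging in parameter values assumed here to mimic the 2011 running (they are illustrative, not official machine settings) reproduces the order of the quoted peak luminosity:

```python
import math

def luminosity(n_b, N, f_rev, gamma, eps_n, beta_star, F=1.0):
    """Round-beam peak luminosity L = n_b N^2 f_rev gamma F / (4 pi eps_n beta*).
    SI inputs (eps_n and beta* in metres); returns L in cm^-2 s^-1."""
    L_m2 = n_b * N**2 * f_rev * gamma / (4.0 * math.pi * eps_n * beta_star) * F
    return L_m2 * 1e-4  # m^-2 s^-1 -> cm^-2 s^-1

gamma_2011 = 3500.0 / 0.938  # 3.5 TeV protons
# Assumed 2011-like parameters: 1380 bunches, 1.45e11 protons per bunch,
# 2.4 um normalised emittance, beta* = 1.0 m, crossing-angle factor ~0.95.
L = luminosity(n_b=1380, N=1.45e11, f_rev=11245.0,
               gamma=gamma_2011, eps_n=2.4e-6, beta_star=1.0, F=0.95)
print(f"L ~ {L:.1e} cm^-2 s^-1")  # lands near the quoted 3.8e33
```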

2012 was a production year at an increased beam energy of 4 TeV, with 50 ns bunch spacing and 1380 bunches. A decision to operate with tighter collimator settings allowed a more aggressive squeeze to a β* of 0.6 m, and the peak luminosity was quickly close to its maximum for the year, followed by determined and long-running attempts to improve peak performance. Beam instabilities, although never debilitating, were a recurring problem and there were phases when they cut into operational efficiency. By the middle of the year another 6 fb⁻¹ had been delivered to both ATLAS and CMS. Combined with the 2011 dataset, this paved the way for the announcement of the Higgs discovery on 4 July 2012. It was a very long operational year and included the extension of the proton–proton run until December, resulting in the shift of a four-week-long proton–lead run to 2013. Integrated-luminosity rates were healthy, at around the 1 fb⁻¹-per-week level, which allowed a total for the year of about 23 fb⁻¹ to be delivered to both ATLAS and CMS.

Five phrases LHC operators learned to love

A treated image of the LHC beam pipes

Single-event effects

Caused by beam-induced radiation striking electronics in the tunnel, these were a serious cause of inefficiency in the LHC’s early days. However, the problem had been foreseen and its impact was considerably reduced by a sustained programme of mitigation measures – including shielding campaigns prior to the 2011 run.

Unidentified falling objects

Microscopic particles of the order of 10 microns across, which fall from the top of the vacuum chamber or beam screen, become ionised by collisions with circulating protons and are then repelled by the positively charged beam. While interacting with the circulating protons they generate localised beam loss, which may be sufficient to dump the beam or, in the limit, cause a quench. During the first half of 2015 they were a serious issue, but happily they have subsequently conditioned down in frequency.

Beam-induced heating

A long-running issue in which regions of the LHC near the beam become too warm. Essentially, all cases have been local and, in some way, due to non-conformities in either design or installation. Design problems have affected the injection protection devices and the mirror assemblies of the synchrotron-radiation telescopes, while installation problems have occurred in a small number of vacuum assemblies. These issues have all been addressed and are not expected to be a problem in the long term.

Beam instabilities

An interesting problem that occasionally dogged operations. Operation with 25 ns bunch spacing and lower bunch population should, intrinsically, have made instabilities less of an issue. However, a high electron cloud (see “Electron cloud effects”) also proved to be a driver, and defence mechanisms were deployed in the form of high chromaticity, high octupole field strength and the all-important transverse damper system.

Electron cloud effects

These result from an avalanche-like process in which electrons from gas ionisation or photo-emission are accelerated in the electromagnetic field of the beam and hit the beam-chamber walls with energies of a few hundred eV, producing more electrons. This can lead to beam oscillations and blow-up of the proton bunches. “Scrubbing”, the deliberate invocation of a high electron cloud with beam, provides a way to reduce or suppress subsequent electron-cloud build-up. Extensive scrubbing was needed for 25 ns running. Conditioning thereafter has been slow, and the heat load from the electron cloud on the cryogenics system remained a limitation in 2018.
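The avalanche character of the build-up, and why scrubbing suppresses it, can be captured in a toy multiplication model. Treating each wall impact as multiplying the electron population by the surface’s secondary-emission yield (SEY) is a deliberate simplification, and the SEY values below are assumed for illustration:

```python
# Toy model of electron-cloud build-up: each wall impact multiplies the
# electron population by the secondary-emission yield (SEY). Real build-up
# also depends on bunch spacing and chamber geometry; SEY values here are
# illustrative assumptions. Scrubbing drives the SEY towards (or below) 1.
def cloud_growth(n0, sey, n_impacts):
    return n0 * sey ** n_impacts

before = cloud_growth(1.0, 1.4, 20)   # unscrubbed surface: exponential growth
after = cloud_growth(1.0, 0.9, 20)    # scrubbed surface: the cloud dies away
```

The sign of the exponent flips once the SEY drops below unity, which is the essence of why conditioning the beam-screen surface reduces the heat load at a given beam intensity.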

To Run 2 and beyond

In early 2015 the LHC emerged from “long-shutdown one”. The aims were first to re-commission the machine without beam following major consolidation and upgrades, and then to safely establish operations at 6.5 TeV with 25 ns bunch spacing and around 2800 bunches. This was anticipated to be more of a challenge than previous operations at 4 TeV with 50 ns beams. Increased energy implies lower quench margins and thus lower tolerance to beam loss, with hardware pushed closer to its maximum and potential knock-on effects for availability. A 25 ns beam was anticipated to have significantly higher electron-cloud effects (see “Five phrases LHC operators learned to love” box) than those experienced with 50 ns; in addition, there was a higher total beam current and a higher intensity per injection. All of these factors came into play to make 2015 a challenging year.

Delivered integrated luminosity

The initial 50 and 25 ns intensity ramp-up phase was tough going and had to contend with a number of issues, including earth faults, unidentified falling objects, an unidentified aperture restriction in a main dipole, and radiation affecting specific electronic components in the tunnel. Nonetheless, the LHC was able to operate with up to 460 bunches and deliver some luminosity to the experiments, albeit with poor efficiency. The second phase of the ramp-up, following a technical stop at the start of September, was dominated by the electron-cloud-generated heat load and the subsequent challenge for the cryogenics, which had to wrestle with transients and operation close to their cooling power limits. The ramp-up in number of bunches was consequently slow but steady, culminating in the final figure for the year of 2244 bunches per beam. Importantly, the electron cloud generated during physics operations at 6.5 TeV served to slowly condition the surface of the beam screens in the cold sectors and so reduce the heat load at a given intensity. As time passed, this effect opened a margin for the use of more bunches.

The overall machine availability remained respectable, with around 32% of the scheduled time spent in “stable beams” mode during the final period of proton–proton physics from September to November. By the end of the 2015 proton run, 2244 bunches per beam were giving peak luminosities of 5.5 × 10³³ cm⁻²s⁻¹ in the high-luminosity experiments, with a total integrated luminosity of around 4 fb⁻¹ delivered to both ATLAS and CMS. Levelled luminosities of 3 × 10³² cm⁻²s⁻¹ in LHCb and 5 × 10³⁰ cm⁻²s⁻¹ in ALICE were provided throughout the run.

A luminous future

Following an interesting year, 2016 was the first full year of exploitation at 6.5 TeV. The beam size at the interaction point was further reduced (β* = 0.4 m) and the LHC design luminosity of 10³⁴ cm⁻²s⁻¹ was achieved. Reasonable machine availability allowed a total of 40 fb⁻¹ to be delivered to both ATLAS and CMS. 2017 saw a further reduction in beam size at the interaction point (β* = 0.3 m), which, together with small beams from the injectors, gave a peak luminosity of 2.2 × 10³⁴ cm⁻²s⁻¹. Despite the effects of an accidental ingress of air into the beam vacuum during the winter technical stop, around 50 fb⁻¹ was delivered to ATLAS and CMS.

Not only can a 27 km superconducting collider work, it can work well!

2018 essentially followed the set-up of 2017, with a squeeze to β* = 0.3 m in ATLAS and CMS. The effects of the air ingress lingered on, limiting the maximum bunch population to approximately 1.2 × 10¹¹ protons. Despite this, the peak luminosity was systematically close to 2 × 10³⁴ cm⁻²s⁻¹ and around 63 fb⁻¹ was delivered to ATLAS and CMS. Somewhat more integrated luminosity was made possible by the novel luminosity-levelling strategy pursued. This involved continuous adjustment of the crossing angle in stable beams and, for the first time, dynamic changes of the optics in stable-beams mode, with β* reduced from 0.30 to 0.27 to 0.25 m while colliding. The year finished with a very successful lead-ion run, helped by the impressive ion delivery from the injectors. In December 2018 the machine entered long-shutdown two, recovery from which is scheduled for 2021.

It is nearly 12 years since first beam, and 10 since first high-energy operations at the LHC. The experience has shown that, remarkably, not only can a 27 km superconducting collider work, it can work well! This comes on the back of excellent hardware-system performance, impressive availability, high beam quality from the injectors and some fundamental operational characteristics of the LHC. Thanks to the work of many, many people over the years, the LHC is now well understood, and it continues to push our understanding of how to operate high-energy hadron colliders and to surpass expectations. Today, as plans for Run 3 take shape and work advances on the challenging magnets needed for the high-luminosity LHC upgrade, things promise to remain interesting.
