Topics

The Higgs enigma: celebrating 10 years of discovery

Artistic impression of the Brout–Englert–Higgs field

Ten years ago, a few small bumps in ATLAS and CMS data confirmed a 48-year-old theoretical prediction, and particle physics hasn’t been the same since. Behind those sigmas was the hard work, dedication, competence and team spirit of thousands of experimentalists and accelerator physicists worldwide. Naturally it was a triumph for theory, too. Peter Higgs, François Englert, Carl Hagen and Gerald Guralnik received a standing ovation in the CERN auditorium on 4 July 2012, although Higgs insisted it was a day to celebrate experiment, not theory. The Nobel prize for Englert and Higgs came a year later. Straying from tradition for elementary-particle discoveries, the citation explicitly acknowledged the experimental effort of ATLAS and CMS, the LHC and CERN.

The implications of the Higgs-boson discovery are still being understood. Ten years of precision measurements have shown the particle to be consistent with the minimal version required by the Standard Model. Combined with the no-show of the non-Standard-Model particles that were expected to accompany the Higgs, these results have left theorists scratching their heads. As we celebrate the collective effort of high-energy physicists in discovering the Higgs boson and determining its properties, another intriguing journey has opened up.

Marvelously mysterious 

As “a fragment of vacuum” with the barest of quantum numbers, the Higgs boson is potentially connected to many open questions in fundamental physics. The field from which it hails governs the nature of the electroweak phase transition in the early universe, which might be connected with the observed matter–antimatter asymmetry; as the only known elementary scalar particle, it could serve as a portal to other, hidden sectors relevant to dark matter; its couplings to matter particles — representing a new interaction in nature — may hold clues to the puzzling hierarchy of fermion masses; and its interactions with itself have implications for the ultimate stability of the universe. 

Nobody knows what the Higgs boson has in store

With the LHC and its high-luminosity upgrade, physicists have 20 years of Higgs exploration to look forward to. But to fully understand the shape of the Brout–Englert–Higgs potential, the couplings of the Higgs boson to Standard Model particles and its possible connections to new physics, a successor collider will be needed. It is fascinating to picture future generations of particle physicists working as one with astroparticle physicists, cosmologists, quantum technologists and others to fill out the details of this potential new vista, with colliders driving progress alongside astrophysical, cosmological and gravitational-wave observatories. Future colliders aren’t just about generating knowledge, argues Anna Panagopoulou of the European Commission, but are “moonshots” delivering a competitive edge in technology, innovation, education and training — opening adventures that inspire young people to enter science in the first place.

Nobody knows what the Higgs boson has in store. Perhaps further studies will confirm the scenario of a Standard-Model Higgs and nothing else. Theorists, though, think that the sheer number and profundity of known unknowns in the universe suggests otherwise. The good news is that, in the Higgs boson, physicists have clear measurement targets – and in principle the necessary theoretical and experimental machinery – to explore such mysteries, building upon the events of 4 July 2012 to reach the next level of understanding in fundamental physics.

Engines of knowledge and innovation

One of a kind

The search for the Higgs boson is the kind of adventure that draws many young people to science, even if they go on to work in more applied areas. I first set out to become a nuclear physicist, and even applied for a position at CERN, before deciding to specialise in electrical engineering and then moving into science policy. Today, my job at the European Commission (EC) is to co-create policies with member states and stakeholders to shape a globally competitive European research and innovation system. 

Large research infrastructures (RIs) such as CERN have a key role to play here. Having visited CERN for the first time last year, I was impressed not just by the basic research but also by the services that CERN provides the collaborations, its relationships with industry, and its work in training and educating young people. It is truly an example of what it means to collaborate on an international level, and it helped me understand better the role of RIs in research and innovation. 

Innovation is one of three pillars of the EC’s €95.5 billion Horizon Europe programme for the period 2021–2027. The first pillar is basic science, and the second concerns applied research and knowledge diffusion. Much of the programme’s focus is “missions” geared to societal challenges such as soil, climate and cancer, driven by the UN’s 2030 Sustainable Development Goals. So where does a laboratory like CERN fit in? Pillar one is the natural home of particle physics, where there is well-established support via European Research Council grants, Marie Skłodowska-Curie fellowships and RI funding. On the other hand, the success of the Horizon Europe missions relies on the knowledge and new technologies generated by the RIs.

Anna Panagopoulou

We view the role of RIs as driving knowledge and technology, and ensuring they are transferred within Europe – acting as engines in a local ecosystem involving other laboratories and institutes, hospitals and schools, attracting the best people and generating new labour forces. COVID-19 was a huge societal challenge that we managed to address using basic research, RIs and open access to data – a clear socioeconomic impact of current research, and also of data collected in the past.

Open science is a backbone of Horizon Europe, and an area where particle physics and CERN in particular are well advanced. I chair the governance board of the European Open Science Cloud, a multi-disciplinary environment where researchers can publish, find and re-use data, tools and services, in which CERN has a long-standing involvement.

Indeed, the EC has established a very strong collaboration with CERN across several areas. Recently we have been meeting to discuss the proposed Future Circular Collider (FCC). The FCC merits not just discussion but support, and we are already providing it via significant projects. We are now discussing possibilities in Horizon Europe to support more technological aspects, but clearly EU money is not enough. We need commitment from member states, so there needs to be a political decision. And to achieve that we need a very good business plan that turns the long-term FCC vision into clearly defined short-term goals and demonstrates its stability and sustainability.

Societal impact

Long-term projects are not new to the EC: we have ITER, for example, while even the neutrality targets for the green-deal and climate missions are for 2050. The key is to demonstrate their relevance. There is sometimes a perception that people doing basic research are closed in their bubble and don’t realise what’s going on in the “real” world. The space programme has managed to demonstrate over the years that there are sufficient applications providing value beyond its core purpose. Nowadays, with issues of defence, security and connectivity rising up political agendas, researchers can always bring to the table the ways in which their work helps society address its needs. For big RIs such as the FCC we need to demonstrate first: what is the added value, even if it’s not available today? Why is it important for Europe? And what is the business plan? The FCC is not a typical project. To attract and convince politicians and finance ministers of its merits, it has to be presented in terms of its uniqueness.

The FCC brings to mind the Moon landings

The FCC brings to mind the Moon landings. Contrary to popular depictions, this was a long-term project that built on decades of competitive research from different countries. Yes, it took place during the Cold War, but it was also the basis for fruitful collaboration. If we don’t dare to spend money on projects that bring us to the future then we lose, as Europe, a competitive advantage.

The Higgs after LHC

Many of the most arbitrary aspects of the Standard Model of particle physics (SM) are intimately connected to the scalar sector of the theory. The SM comprises just one scalar particle, the Higgs boson, and assumes a specific scalar potential (the famous “Mexican hat”) to define the dynamics of electroweak (EW) interactions. But the fact that the Higgs boson acquires a non-zero vacuum expectation value that defines the mass scale of EW interactions (around 100–200 GeV) is assumed, not explained, by the SM. Indeed, why the Higgs-boson mass is constrained to be at the EW scale, while quantum corrections should push it to much higher values (the so-called naturalness problem, see Naturalness after the Higgs), is not justified by any symmetry of the SM. At the same time, the SM assumes that fermion masses are generated via arbitrary Yukawa-type interactions with the scalar field but it does not explain the hierarchy of couplings or masses that we observe, nor the specific flavour structure that arises from the presence of just one scalar field. 
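The “Mexican hat” potential can be written down explicitly. As a textbook sketch (notation assumed here, not taken from the article itself), for the complex scalar doublet Φ:

```latex
V(\Phi) = -\mu^2\,\Phi^\dagger\Phi + \lambda\,\bigl(\Phi^\dagger\Phi\bigr)^2,
\qquad
\langle \Phi^\dagger\Phi \rangle = \frac{v^2}{2},
\quad
v = \frac{\mu}{\sqrt{\lambda}} \simeq 246~\mathrm{GeV}.
```

With μ² > 0 the symmetric point Φ = 0 is unstable, so the field settles in the rim of the hat at the non-zero value v – the vacuum expectation value that sets the EW mass scale but whose size the SM takes as an input rather than explains.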

Future colliders are vital to push the precision Higgs programme to the next level

The scalar sector of the SM may therefore be seen as a messenger of a more fundamental theory that replaces the SM at energies beyond the EW scale and turns apparent arbitrariness into logical consequences. After all, the mechanism of EW symmetry breaking as realised in the SM via the Brout–Englert–Higgs (BEH) field is just the simplest possible way to generate massive EW gauge bosons and fermions while preserving gauge symmetry. The scalar potential could be more complicated, for example involving multiple scalar fields, as is common in many beyond-the-SM (BSM) theories. This would result in a richer pattern of stable and metastable minima and influence the nature of the EW phase transition. A first-order phase transition, together with extra sources of CP violation beyond what is implied by the SM, could explain the origin of the matter–antimatter asymmetry of the universe via EW baryogenesis (see Electroweak baryogenesis). Understanding the origin of the EW scale is thus key to connecting very different realms of particle physics and cosmology, and is the question we face as we look to the future of collider physics.

Game changer

The discovery of the Higgs boson during Run 1 of the LHC has been a game changer in the exploration of new physics beyond the EW scale. The measurement of the Higgs-boson mass has added the last missing input parameter to precision global fits of the SM, which now provide a very powerful tool to constrain BSM scenarios. Thanks to an unprecedented level of precision reached in both theory and experiment, the measurement of Higgs-boson couplings to EW gauge bosons (W, Z) and to the heaviest quarks and leptons (t, b, τ, µ) from Run 2 data has already constrained their deviations from SM expectations to within 5–20%, with the best accuracy reached for the couplings to the gauge bosons. Based on these results, the High-Luminosity LHC (HL-LHC) is projected to constrain the effects of new physics on Higgs-boson couplings to EW gauge bosons to 1–2%, and to heavy quarks and leptons to 3–5%. If no anomalies are found, this level of accuracy will push the lower bound on the scale of new physics into the TeV ballpark. Conversely, the detection of possible anomalies may point to the presence of new physics at the TeV scale, possibly just around the corner.

An ATLAS di-Higgs event

On the other hand, testing the SM scalar potential will still be challenging even during the HL-LHC era. The shape of the BEH potential can be tested by measuring the Higgs-boson self-interactions corresponding to its cubic and quartic terms. In the SM, these interactions are strictly proportional to the Higgs-boson mass via the vacuum expectation value of the BEH field. Deviations from the SM are searched for via Higgs pair production and radiative corrections to single-Higgs measurements. Although the LHC and HL-LHC promise to provide evidence for di-Higgs production, the extraction of the Higgs self-coupling from such measurements will be statistically limited.
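The proportionality of the self-interactions to the Higgs-boson mass can be made explicit. Expanding the SM potential around its minimum, with h the physical Higgs field and v the vacuum expectation value (a standard textbook expansion, shown here for illustration):

```latex
V(h) = \frac{1}{2}\,m_h^2\,h^2 \;+\; \frac{m_h^2}{2v}\,h^3 \;+\; \frac{m_h^2}{8v^2}\,h^4,
\qquad
m_h^2 = 2\lambda v^2.
```

Once m_h and v are measured, the cubic and quartic self-couplings are thus fully predicted in the SM; any deviation extracted from di-Higgs production would directly signal a non-minimal scalar potential.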

Future colliders

Future colliders are vital to push the precision Higgs programme to the next level. While the type and concept of the next collider is yet to be decided, all proposed facilities would deliver a huge number of Higgs bosons over their lifetime, operating at different and well targeted centre-of-mass energies (see “At a glance” figure). They can complement one another and, staggered over a period of the next few decades, provide the missing elements of the EW puzzle.

Among future lepton colliders under study, circular e+e− colliders (CEPC, FCC-ee) are expected to operate at lower energies, between 90 and 350 GeV, with very high luminosities, while linear e+e− colliders (ILC, C3, CLIC) offer both low- and high-energy phases, generally with slightly lower luminosities. Combined with data from the HL-LHC, these “Higgs factories” would enable the SM, including most Higgs couplings, to be stress-tested below the per-cent level and in some cases at or below the per-mille level. In particular, FCC-ee operating at the s-channel Higgs resonance (125 GeV) has the capability to provide bounds on couplings as small as the electron Yukawa coupling, while linear e+e− colliders operating at 550–600 GeV and above could substantially improve on the top-quark Yukawa coupling with respect to the HL-LHC. A possible muon collider, operated either as a Higgs factory at 125 GeV or as a high-energy discovery machine at 3–10 TeV, is estimated to reach a precision on Higgs couplings comparable to that of e+e− machines.

Uncertainties on the Higgs self-coupling

Finally, high-energy lepton colliders (ILC 1000, CLIC 3000 and a 3–30 TeV muon collider) and very high-energy hadron colliders (FCC-hh at 100 TeV) would reach enough statistics and energy to measure the Higgs self-coupling and investigate the nature of the BEH potential, either via di-Higgs or single-Higgs production (see “Self-coupling” figure). With an aggressive Higgs physics programme they may also reach enough sensitivity to probe the cubic and quartic terms in the BEH potential separately. 

Almost half a century after it was predicted, the LHC delivered the Higgs boson in spectacular style on 4 July 2012. Over the next 15–20 years, the machine and its luminosity upgrade will continue to enable ATLAS and CMS to make great strides in understanding the Higgs boson’s properties. But to fully exploit the discovery of the Higgs boson and explore its mysterious relation to new physics beyond the EW scale, we will need a successor collider.

Through the Higgs portal

Referring to the field equation of general relativity Rμν – ½ Rgμν = κTμν , Einstein is reported to have said that the left-hand side, constructed from space–time curvature, is “a palace of gold”; while the right-hand side, which parameterises the energy and momentum of matter, is by comparison “a hovel of wood”. Present-day physics has arrived at much more concrete ideas about the right-hand side than were available to Einstein. It is fair to say that some of it has come to look quite palatial, and fully worthy to stand alongside the left-hand side. These are the terms that involve field kinetic energy and gauge bosons, as described by the Standard Model (SM). Their form follows logically, within the framework of relativistic quantum field theory, directly from the principles of local gauge symmetry and relativity. Mathematically, they also speak the same geometric language as the left-hand side. The gauge bosons are avatars of curvature in “internal spaces”, similar to how gravitons are the avatars of space–time curvature. Internal spaces parameterise ways in which fields can vary – and thus, in effect, move – independently of ordinary motion in space–time. In this picture, the strong, weak and electromagnetic interactions arise from the influence of internal space curvature on internal space motion, similar to how gravity arises from the influence of space–time curvature on space–time motion.

The Higgs particle is the only portal connecting normal matter to such phantom fields

The other contributions to Tμν, all of which involve the Higgs particle, do not yet reach that standard. We can aspire to do better! They are of three kinds. First, there are the many Yukawa-like terms from which quark and lepton masses and mixings arise. Then there is the Higgs self-coupling and finally a term representing its mass. These contributions to Tμν contain almost two dozen dimensionless coupling parameters that present-day theory does not enable us to calculate or even much constrain. It is therefore important to investigate experimentally, through quantitative studies of Higgs-particle properties and interactions, whether this ramshackle structure describes nature accurately. 

Higgs potential

The Higgs boson is special among the elementary particles. As the quantum of a condensate that fills all space, it is metaphorically “a fragment of vacuum”. Speaking more precisely, the Higgs particle has no spin, no electric or colour charge and, at the level of strong and electromagnetic interactions, normal charge conjugation and parity. Thus, it can be emitted singly and without angular momentum barriers, and it can decay directly into channels free of colour and electromagnetically charged particles, which might otherwise be difficult to access. For these and other, more technical, reasons, the Higgs particle has the potential to reveal new physical phenomena of several kinds. 

A unique aspect of the Higgs mass term is especially promising for revealing possible shortcomings in the SM. In quantum field theory, an important property of an interaction is the “mass dimension” of the operator that implements it – a number that in an important sense indicates its complexity. Scalar and gauge fields have mass dimension 1 as do space–time derivatives, whereas fermion fields have mass dimension 3/2. More complicated operators are built up by multiplying these, and the mass dimension of a product is the sum of the mass dimensions of its factors. Interactions associated with operators whose mass dimension is greater than 4 are problematic because they lead to violent quantum fluctuations and mathematical divergences. Whereas all the other terms in the SM Lagrangian arise from operators of mass dimension 4, the Higgs mass term has mass dimension 2. Thus it is uniquely open to augmentation by couplings to hypothetical new SU(3) × SU(2) × U(1) singlet scalar fields, because the mass dimension of the augmented interaction can be 3 or 4 – i.e. still “safe”. The Higgs particle is the only portal connecting normal matter to such phantom fields.
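The dimension counting above can be written out. With S a hypothetical SU(3) × SU(2) × U(1) singlet scalar, the generic portal terms take the form (an illustrative sketch, not a specific model from the text):

```latex
\mathcal{L}_{\text{portal}} \;\supset\; -\,a\,S\,\bigl(H^\dagger H\bigr) \;-\; \frac{b}{2}\,S^2\,\bigl(H^\dagger H\bigr).
```

Since H†H has mass dimension 2 and S has mass dimension 1, these operators have mass dimension 3 and 4 respectively – both “safe” – with a a coupling of mass dimension 1 and b dimensionless.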

Dark matter map

Why is this an interesting observation? There are three main reasons: two broadly theoretical, one pragmatic. First of all, the particles that are generally considered part of the SM carry a variety of charge assignments under the gauge groups SU(3) × SU(2) × U(1) that govern the strong and electroweak interactions. For example, the left-handed up quark is charged under all three groups, while the right-handed electron carries only U(1) hypercharge. Thus it is not only logically possible, but reasonably plausible, that there could be particles that are neutral under all three groups. Such phantom particles might easily escape detection, since they do not participate in the strong or electroweak interactions. Indeed, there are several examples of well-motivated candidate particles of that kind. Axions are one. Since they are automatically “dark” in the appropriate sense, phantom particles could contribute to the astronomical dark matter, and might even dominate it, as model-builders have not failed to notice. Also, many models of unification bring in scalar fields belonging to representations of a unifying gauge group that contains SU(3) × SU(2) × U(1) singlets, as do models with supersymmetry. Only phantom scalars are directly accessible through the Higgs portal, but phantoms of higher spin, including right-handed neutrinos, could cascade from real or virtual scalars.

Mysterious values

Second, the empirical value of the Higgs mass term is somewhat mysterious and even problematic, given that quantum corrections should push it to a value many orders of magnitude higher. This is the notorious “hierarchy problem” (see Naturalness after the Higgs). Given this situation, it seems appropriate to explore the possibility that part (or all) of the effective mass-term of the SM Higgs particle arises from more fundamental couplings upon condensation of SU(3) × SU(2) × U(1) singlet scalar fields, i.e. the emergence of a non-zero space-filling field, as occurs in the Brout–Englert–Higgs mechanism.

The portal idea leads to concrete proposals for directions of experimental exploration

Third, the portal idea leads to concrete proposals for directions of experimental exploration. These are of two basic kinds: one involves the observed strength of conventional Higgs couplings, the other the kinematics of Higgs production and decay. Couplings of the Higgs field to singlets that condense will lead to mixing, altering numerical relationships among Higgs-particle couplings and masses of gauge bosons, and of fermions from their minimal SM values. Also, the Higgs-field couplings to gauge bosons and fermions will be divided among two or more mass eigenstates. Since existing data indicates that deviations from the minimal model are small, the coupling of normal matter to the “mostly but not entirely” singlet pieces could be quite small, perhaps leading to very long lifetimes (as well as small production rates) for those particles. Whether or not the phantom particles contribute significantly to cosmological dark matter, they will appear as missing energy or momentum accompanying Higgs particle decay or, through Bremsstrahlung-like processes, when they are produced. 

We introduced the term “Higgs portal” to describe this circle of ideas in 2006, triggering a flurry of theoretical discussion. Now that the portal is open for business, and with larger data samples in store at the LHC, we can think more concretely about exploring it experimentally.

The thrill of the chase

Fabiola Gianotti and Joe Incandela during their 4 July 2012 presentations

At around 10:30 a.m. on 4 July 2012, two remarkable feats of theoretical and experimental physics reached an apex in the CERN auditorium. One was the work of a few individuals using the most rudimentary of materials, the other a global endeavour involving thousands of people and the world’s most powerful collider. Forty-eight years after it was predicted, the CMS and ATLAS collaborations presented conclusive evidence for the existence of a new elementary particle, the Higgs boson, the cornerstone of the electroweak Standard Model. 

“It took us several years to recover,” says CMS experimentalist Chiara Mariotti, who was co-convener of the collaboration’s Higgs group at the time. “For me there was a strong sense of ‘Higgs blues’ afterwards! On the other hand, the excitement was also productive. Immediately after the discovery we managed to invent a new method to measure the Higgs width, with a precision more than 200 times better than what we were thinking – a real breakthrough.”

Theoretically, the path to the Higgs boson had been paved by the early 1970s, building on foundations laid by the pioneers of quantum field theory and superconductivity. When Robert Brout and François Englert, and independently Peter Higgs, published their similarly titled papers on broken symmetry and the mass of gauge bosons in 1964, nobody took much notice. One of Higgs’s manuscripts was even rejected by an editor based at CERN. The profound consequences of the Brout–Englert–Higgs (BEH) mechanism – that the universe is pervaded by a scalar field responsible for breaking electroweak symmetry and giving elementary particles their mass (see “The Higgs, the universe and everything” panel) – only caught wider attention after further Nobel-calibre feats by Steven Weinberg, who incorporated the BEH mechanism into electroweak theory developed also by Abdus Salam and Sheldon Glashow, and by Gerard ’t Hooft and Martinus Veltman, who proved that the unified theory was mathematically consistent and capable of making testable predictions (see A triumph for theory). 

Over to CERN 

The first bridge linking the BEH mechanism to the real world was sketched out in CERN’s theory corridors in the form of a 50-page-long phenomenological profile of the Higgs boson by John Ellis, Mary Gaillard and Dimitri Nanopoulos published in 1976. The discovery of neutral currents in 1973 by Gargamelle at CERN, and of the charm quark at Brookhaven and SLAC in 1974, had confirmed that the Standard Model was on the right track. Despite their conviction that something like the Higgs boson had to exist, however, Ellis et al. ended their paper on a cautionary, somewhat tongue-in-cheek note: “We apologise to experimentalists for having no idea what is the mass of the Higgs boson… and for not being sure of its couplings to other particles, except that they are probably all very small. For these reasons we do not want to encourage big experimental searches for the Higgs boson, but we do feel that people performing experiments vulnerable to the Higgs boson should know how it may turn up”. 

As it turned out, discovering and measuring the electroweak bosons would drive three major projects at CERN spanning three decades: the SPS proton–antiproton collider, LEP and the LHC. Following Carlo Rubbia and Simon van der Meer’s ingenious modification of the SPS to collide protons and antiprotons, greatly increasing the available energy, the UA1 and UA2 collaborations confirmed the existence of the W boson on 25 January 1983. The discovery of the slightly heavier Z boson came a few months later. The discoveries made the case for the Higgs boson stronger, since all three bosons hail from the same scalar field (see panel). 

The Higgs, the universe and everything

The “Mexican hat”

The Higgs boson is the excitation of a featureless condensate that fills all space – a complex scalar field whose potential has a shape resembling a Mexican hat. The universe is pictured as being born in a symmetric state at the top of the hat: the electromagnetic and weak forces were one, and particles moved at the speed of light. A fraction of a nanosecond later, the universe transitioned to a less symmetric but more stable configuration in the rim of the hat, giving the field a vacuum expectation value of 246 GeV.

During this electroweak symmetry-breaking process, three of the BEH field’s components were absorbed to generate longitudinal polarisation states, and thus masses, for the W and Z bosons; the other component, corresponding to a degree of freedom “up and down” the rim of the hat, is the Higgs boson (see “Lifting the lid” image). The masses of the fermions are generated via Yukawa couplings to the BEH field, implying that mass is not an intrinsic property of elementary particles.

The roots of the BEH mechanism lie in the phenomenon of spontaneous symmetry breaking, which is inherent in superconductivity and superfluidity. In 1960, Yoichiro Nambu and then Jeffrey Goldstone introduced spontaneous symmetry breaking into particle physics, paving the way for taming the weak interaction using gauge theory, like electromagnetism before it. Four years later, Robert Brout and François Englert and, independently, Peter Higgs, showed that a mathematical obstacle called the Goldstone theorem, which implied the existence of unobserved massless particles, is a blessing rather than a curse for gauge theories: the degrees of freedom responsible for the troublesome massless states generate masses for the heavy gauge bosons that mediate the short-range weak interaction (see A triumph for theory).

LEP, along with the higher-energy Tevatron collider at Fermilab, offered Higgs hunters their first serious chance of a sighting. Dedicated analysis groups formed in the experiments. For a decade they saw nothing. Then, on 14 June 2000, in LEP’s final year of scheduled running, ALEPH reported a Higgs candidate at around 114–115 GeV, followed soon by a second and third event. LEP was granted a one-month extension. On 16 October, L3 announced a candidate. By 3 November ALEPH had notched up a 2.9σ excess. A request to extend LEP by one year was made, but there was deadlock at CERN. Five days later, Director-General Luciano Maiani announced that LEP had closed for the last time, so as not to delay the LHC. In addition to determining the properties of the W and Z bosons in detail and confirming the existence of electroweak radiative corrections, LEP had planted a flag marking the energy below which the Higgs would not be found.

Muscling a discovery 

In 1977, CERN Director-General John Adams had the foresight to make the LEP tunnel large enough to accommodate a TeV hadron collider capable of probing the scale of electroweak symmetry breaking. Spurred on by the W and Z discoveries, finding or ruling out the Higgs boson became the central goal of the LHC, greatly influencing the designs of the ATLAS and CMS detectors during the 1990s. Tens of millions of people worldwide watched as the first proton beams were threaded through the machine on 10 September 2008. While the LHC had other goals, the quest for the Higgs boson and the origin of mass resonated with non-experts and brought particle physics to the world. 

It was a bumpy start (see The bumpy ride to the bump), but high-energy LHC data began to flood in on 10 March 2010. By the time of the European Physical Society high-energy physics conference in Grenoble in July 2011, ATLAS and CMS were ready to offer a peek at their results. Practically, the search for the Higgs came down to a process of excluding mass ranges in which no signal had been seen. ATLAS and CMS had shrunk the allowed range and found a number of events hinting at a Higgs boson with a mass of about 142 GeV. “We both saw a bump at the same place, and we had champagne after the talks,” recalls Kyle Cranmer, co-coordinator of the ATLAS Higgs combination group at the time. “We weren’t confident then, but we were optimistic.” Fermilab’s Tevatron collider was also sensitive to a Higgs in the upper mass range and its CDF and D0 experiments pioneered many of the analysis methods that were used in ATLAS and CMS. Just four years earlier, they had hinted at a possible signal at 160 GeV, only for it to disappear with further data. Was the US machine about to make a last-gasp discovery and scoop the LHC?

2011 results from ATLAS and CMS

The media were hot on the sigma trail. On 13 December 2011, the LHC experiments updated their findings: ATLAS constrained the Higgs to lie in the range 116–130 GeV, and CMS to lie in the range 115–127 GeV. For some, a light Higgs boson was in the bag. Others were hesitant. “There was a three-sigma excess when combining all the channels, but there were also less significant excesses in other mass regions,” recalls Mariotti. “I maybe also wanted not to believe it, in order not to be biased when analysing the data in 2012. And maybe because somehow if the Higgs was not there, it would have been really thrilling, much more challenging for us all.”

The following year, with the LHC running at a slightly higher energy, the collaborations knew that they would soon be able to say something definitive about the low-mass excess of events. From that moment, CMS decided not to look at the data and instead to redesign its analyses on simulated events “blinded”. On the evening of 14 June, all the analysis groups met separately to “open the box”. The next day, they shared their results with the collaboration. The two-photon and four-lepton channels had a beautiful peak at the same place. “It was like a very strong punch in the stomach,” says Mariotti. “From that moment it was difficult to sleep, and it was hard not to smile!”

The quest for the Higgs boson and the origin of mass resonated with non-experts and brought particle physics to the world

Members of both collaborations were under strict internal embargoes concerning the details. ATLAS unblinded its di-photon results late on 31 May, revealing a roughly 2σ excess. By 19 June it had grown to 3.3σ. The four-lepton group saw a similar excess. “My student Sven Kreiss was the first person in ATLAS to combine the channels and see the curve cross the 5σ threshold,” says Cranmer. “That was on 24 June, and it started to sink in that we had really found it. But it was still not clear what we would claim or how we would phrase things.” Amazingly, he says, he was not aware of the CMS results. “I was also not going out of my way to find out. I was relishing the moment, the excitement, and the last days of uncertainty. I also had more important things to do in preparation for the talk.” 

With the rumour mill in overdrive, a seminar at CERN was called for 4 July, also the first day of the ICHEP conference in Melbourne. Peter Higgs and François Englert, and Carl Hagen and Gerald Guralnik (who, with Tom Kibble, also arrived at the mass-generating mechanism), were to be there. The collaborations were focused only on their presentations. It had to be a masterpiece, says Mariotti. The day before, the CMS and ATLAS Higgs conveners met for coffee. They revealed nothing. “It was really hard not to know. We knew we had it, but somehow if ATLAS did not have it, or had it but at a different mass, it all would have been a big disillusion.”

ICHEP 2012 and François Englert with Peter Higgs

Many at CERN decided to spend the night of 3 July in front of the auditorium so as not to miss the historic moment. CMS spokesperson Joe Incandela was first to guide the audience through the checks and balances behind the final plots. Fabiola Gianotti followed for ATLAS. When it was clear that both had seen a 5σ excess of events at around 125 GeV, the room erupted. Was it really the Higgs? All that was certain was that the particle was a boson, with a mass where the Standard Model expected it. Seizing the moment, and the microphone, Director-General Rolf Heuer announced: “As a layman, I would now say ‘I think we have it’, do you agree?” It was a spontaneous decision, he says. “For a short period between the unblindings and the seminar, I was one of the few people in the world, just with research director Sergio Bertolucci, in fact, who was aware of both results. We would not have announced a discovery had one experiment not come close to that threshold.”  

The summer of 2012 produced innumerable fantastic memories, says Marumi Kado, ATLAS Higgs-group co-convener at the time and now a deputy spokesperson. “The working spirit in the group was exceptional. Each unblinding, each combination of the channels was an incredible event. Of course, the 4 July seminar was among the greatest.” In CMS, says Mariotti, there was a “party-mood” for months. “Every person thought, correctly, that they had played a role in the discovery, which is important, otherwise very large experiments cannot be done.” 

The path from here 

Ten years later, ATLAS and CMS measurements have shown the Higgs boson to be consistent with the minimal version required by the Standard Model. Its couplings to the gauge bosons and the heaviest three fermions (top, bottom and tau) have been confirmed, evidence that it couples to a second-generation fermion (the muon) obtained, and first studies of Higgs–charm and Higgs–Higgs couplings reported (see The Higgs boson under the microscope). However, data from Run 3, the High-Luminosity LHC and a possible Higgs factory to follow the LHC are needed to fully test the Standard Model BEH mechanism (see The Higgs after LHC). 

Every person thought, correctly, that they had played a role in the discovery, which is important, otherwise very large experiments cannot be done

Events on 4 July 2012 brought one scientific adventure to a close, but opened another, fascinating chapter in particle physics with fewer theoretical signposts. What is clear is that precision measurements of the Higgs boson open a new window to explore several pressing mysteries. The field from which the Higgs boson hails governs a critical phase transition that might be linked to the cosmic matter–antimatter asymmetry (see Electroweak baryogenesis); as an elementary scalar, it offers a unique “portal” to dark or hidden sectors which might include dark matter (see Through the Higgs portal); as the arbiter of mass, it could hold clues to the puzzling hierarchy of fermion masses (see The origin of particle masses); and its interactions govern the ultimate stability of the universe (see The Higgs and the fate of the universe). The very existence of a light Higgs boson in the absence of new particles to stabilise its mass is paradoxical (see Naturalness after the Higgs). Like the discovery of the accelerating universe, Nima Arkani-Hamed told the Courier in 2019, it is profoundly “new” physics: “Both discoveries are easily accommodated in our equations, but theoretical attempts to compute the vacuum energy and the scale of the Higgs mass pose gigantic, and perhaps interrelated, theoretical challenges. While we continue to scratch our heads as theorists, the most important path forward for experimentalists is completely clear: measure the hell out of these crazy phenomena!”

A triumph for theory

Increasingly complex electroweak processes

Often in physics, experimentalists observe phenomena that theorists had not been able to predict. When the muon was discovered, theoreticians were confused: a particle had been predicted, but not this one. Isidor Rabi famously exclaimed: “Who ordered that?” The J/ψ is another special case: a particle was discovered with properties so different from those expected that the first guesses as to what it was were largely mistaken. Soon it became evident that it was a predicted particle after all, but its features turned out to be more exotic than foreseen. This was an experimental discovery requiring new twists in the theory, which we now understand very well. The Higgs particle also has a long and interesting history, but from my perspective, it was to become a triumph for theory. 

From the 1940s, long before any indications were seen in experiments, there were fundamental problems in all theories of the weak interaction. Then we learned from very detailed and beautiful measurements that the weak force seemed to have a vector-minus axial-vector (V-A) structure. This implied that, just as in Yukawa’s theory for the strong nuclear force, the weak force can also be seen as resulting from an exchange of particles. But here, these particles had to be the energy quanta of vector and axial-vector fields, so they must have spin one, with positive and negative parities mixed up. They also must be very heavy. This implied that, certainly in the 1960s, experiments would not be able to detect these intermediate particles directly. But in theory, we should be able to calculate accurately the effects of the weak interaction in terms of just a few parameters, as could be done with the electromagnetic force. 

Electromagnetism was known to be renormalisable – that is, by carefully redefining and rearranging the mass and interaction parameters, all observable effects would become calculable and predictable, avoiding meaningless infinities. But now we had a difficulty: the weak exchange particles differed from the electromagnetic ones (the photons) because they had mass. The mass was standing in the way when you tried to do what was well understood in electromagnetism. How exactly a correct formalism should be set up was not known, and the relationship between renormalisability and gauge invariance was not understood at all. Indeed, today we can say that the first hints were already there by 1954, when C N Yang and Robert Mills wrote a beautiful paper in which they generalised the principle of local gauge invariance to include gauge transformations that affect the nature of the particles involved. In its most basic form, their theory described photons with electric charge.

Thesis topic

In 1969 I began my graduate studies under the guidance of Martinus J G Veltman. He explained to me the problem he was working on: if photons were to have mass, then renormalisation would not work the same way. Specifically, the theory would fail to obey unitarity, a quantum mechanical rule that guarantees probabilities are conserved. I was given various options for my thesis topic, but they were not as fundamental as the issues he was investigating. “I want to work with you on the problem you are looking at now,” I said. Veltman replied that he had been working on his problem for almost a decade; I would need lots of time to learn about his results. “First, read this,” he said, and he gave me the Yang–Mills paper. “Why?” I asked. He said, “I don’t know, but it looks important.”

Making history

That, I could agree with. This was a splendid idea. Why can’t you renormalise this? I had convinced myself that it should be possible, in principle. The Yang–Mills theory was a relativistic quantised field theory. But Veltman explained that, in such a theory, you must first learn what the Feynman rules are. These are the prescriptions that you have to follow to get the amplitudes generated by the theory. From them you can read off whether the amplitudes are unitary, obey dispersion relations, and check that everything works out as expected.

Many people thought that renormalisation – even quantum field theory – was suspect. They had difficulties following Veltman’s manipulations with Feynman diagrams, which required integrations that do not converge. To many investigators, he seemed to be sweeping the difficulties with the infinities under the rug. Nature must be more clever than this! Yang–Mills seemed to be a divine theory with little to do with reality, so physicists were trying all sorts of totally different approaches, such as S-matrix theory and Regge trajectories. Veltman decided to ignore all that.

Solid-state inspiration

Earlier in the decade, some investigators had been inspired by results from solid-state physics. Inside solids, vibrating atoms and electrons were described by nonrelativistic quantum field theories, and those were conceptually easier to understand. Philip Anderson had learned to understand the phenomenon of superconductivity as a process of spontaneous symmetry breaking; photons would obtain a mass, and this would lead to a remarkable rearrangement of the electrons as charge carriers that would no longer generate any resistance to electric currents. Several authors realised that this procedure might apply to the weak force. In the summer of 1964, Peter Higgs submitted a manuscript to Physical Review Letters, where he noted that the mechanism of making photons massive should also apply to relativistic particle systems. But there was a problem. Jeffrey Goldstone had sound mathematical arguments to expect the emergence of massless scalar particles as soon as a continuous symmetry breaks down spontaneously. Higgs put forward that this theorem should not apply to spontaneously broken local symmetries, but critics were unconvinced.

The journal sent Higgs’s manuscript out to be peer reviewed. The reviewer did not see what the paper would add to our understanding. “If this idea has anything to do with the real world, would there be any possibility to check it experimentally?” The correct question would have been what the paper would imply for the renormalisation procedure, but this question was in nobody’s mind. Anyway, Higgs gave a clear and accurate answer: “Yes, there is a consequence: this theory not only explains where the photon mass comes from, but it also predicts a new particle, a scalar particle (a particle with spin zero), which unlike all other particles, forms an incomplete representation of the local gauge symmetry.” In the meantime, other papers appeared about the photon mass-generation process, not only by François Englert and Robert Brout in Brussels, but also by Tom Kibble, Gerald Guralnik and Carl Hagen in London. And Sheldon Glashow, Abdus Salam and Steven Weinberg were formulating their first ideas (all independently) about using local gauge invariance to create models for the weak interaction. 

I started to study everything from the ground up

At the time spontaneous symmetry breaking was being incorporated into quantum field theory, the significance of renormalisation and the predicted scalar particles were hardly mentioned. Certainly, researchers were not able to predict the mass of such particles. Personally, although I had heard about these ideas, I also wasn’t sure I understood what they were saying. I had my own ways of learning how to understand things, so I started to study everything from the ground up. 

If you work with quantum mechanics, and you start from a relativistic classical field theory to which you add the Copenhagen procedure to turn it into quantum mechanics, then you should get a unitary theory. The renormalisation procedure amounts to arranging that all expressions which threaten to become infinite, due to divergence of the integrals, apply only to unobservable quantities of particles and fields, such as their “bare mass” and “bare charge”. If you understand how to get such things under control, then your theory should become a renormalised description of massive particles. But there were complications.

The infinities that require a renormalisation procedure to tame them originate from uncontrolled behaviour at very tiny distances, where the effective energies are large and consequently the effects of mass terms for the particles become insignificant. This revealed that you first have to renormalise the theory without any mass terms, a limit in which the spontaneous breakdown of the local symmetry also becomes insignificant. You had to get the particle book-keeping right. A massless photon has only two observable field components (they can be left- or right-rotating), whereas a massive particle with the same spin can rotate in three different ways. One degree of freedom did not match. This was why an extra field was needed. If you wanted massive photons with electric charges +, 0 or –, you would need a scalar field with four components; one of these would represent the total field strength, and would behave as an extra, neutral, spin-0 particle – the observable particle that Higgs had talked about – but the others would turn the number of spinning degrees of freedom of the three other bosons from two to three each (see “Dynamical” figure).
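This book-keeping can be made explicit. Counting field degrees of freedom before and after spontaneous symmetry breaking in the electroweak case gives (a standard exercise, sketched here rather than taken from the text above):

```latex
\underbrace{4 \times 2}_{\text{4 massless gauge fields}}
\;+\; \underbrace{4}_{\text{scalar-doublet components}} \;=\; 12
\quad\longrightarrow\quad
\underbrace{3 \times 3}_{W^+,\,W^-,\,Z\ \text{(massive)}}
\;+\; \underbrace{2}_{\text{photon}}
\;+\; \underbrace{1}_{\text{Higgs boson}} \;=\; 12 .
```

Three of the four scalar components supply the missing third spin states of the W⁺, W⁻ and Z; the fourth survives as the neutral spin-0 particle.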

One question

In 1970 Veltman sent me to a summer school organised by Maurice Lévy in a new science institute at Cargèse on the French island of Corsica. The subject would be the study of the Gell–Mann–Lévy model for pions and nucleons, in particular its renormalisation and the role of spontaneous symmetry breaking. Will renormalisation be possible in this model, and will it affect its symmetry? The model was very different from what I had just started to study: Yang–Mills theory with spontaneous breaking of its symmetry. There were quite a few reputable lecturers besides Lévy himself: Benjamin Lee and Kurt Symanzik had specialised in renormalisation. Shy as I was, I only asked one question to Lee, and the same to Symanzik: does your analysis apply to the Yang–Mills case?

Both gave me the same answer: if you are Veltman’s student, ask him. But I had, and Veltman did not believe that these topics were related. I thought that I had a better answer, and I fantasised that I was the only person on the planet who knew how to do it right. It was not obvious at all; I had two German roommates at the hotel where I had been put, who tried to convince me that renormalisation of Feynman graphs where lines cross each other would be unfathomably complicated.

Spin-1 particles

Veltman had not only set up detailed, fully running machinery to handle the renormalisation of all sorts of models, but he had also designed a futuristic computer program to do the enormous amount of algebra required to handle the numerous Feynman diagrams that appear to be relevant for even the most basic computations. I knew he had those programs ready and running. He was now busy with some final checks: if his present attempts to check the unitarity of his renormalised model still failed, we should seriously consider giving this up. Yang–Mills theories for the weak interactions would not work as required.

But Veltman had not thought of putting a spin-zero, neutral particle in his model, certainly not if it wasn’t even in a complete representation of the gauge symmetry. Why should anyone add that? After returning from Cargèse I went to lunch with Veltman, during which I tried to persuade him. Walking back to our institute, he finally said, “Now look, what I need is not an abstract mathematical idea, what I want is a model, with a Lagrangian, from which I can read off the Feynman diagrams to check it with my program…”. “But that Lagrangian I can give you,” I said. Next, he walked straight into a tree! A few days after I had given him the Lagrangian, he came to me, quite excited. “Something strange,” he said, “your theory isn’t right because it still isn’t unitary, but I see that at several places, if the numbers had been a trifle different, it could have worked out.” Had he copied those factors ¼ and ½ that I had in my Lagrangian, I wondered? I knew they looked odd, but they originated from the fact that the Higgs field has isospin ½ while all other fields have isospin one.

No, Veltman had thought that those factors came from a sloppy notation I must have been using. “Try again,” I asked. He did, and everything fell into place. Most of all, we had discovered something important. This was the beginning of an intensive but short collaboration. My first publication “Renormalization of massless Yang–Mills fields”, published in October 1971, concerned the renormalisation of the Yang–Mills theory without the mass terms. The second publication that year, “Renormalizable Lagrangians for massive Yang–Mills fields,” where it was explained how the masses had to be added, had a substantial impact. 

There was an important problem left wide open, however: even if you had the correct Feynman diagrams, the process of cancelling out the infinities could still leave finite, non-vanishing terms that ruin the whole idea. These so-called “anomalies” must also cancel out. We found a trick called dimensional regularisation, which would guarantee that anomalies cancel except in the case where particles spin preferentially in one direction. Fortunately, as charged leptons tend to rotate in the opposite direction compared with quarks, it was discovered that the effects of the quarks would cancel those of the leptons. 
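One representative cancellation condition, written schematically for a single generation, is that the electric charges sum to zero once the three quark colours are counted (a simplified illustration; the full anomaly conditions involve traces of products of charges):

```latex
\sum_{\text{one generation}} Q \;=\;
\underbrace{3\left(\tfrac{2}{3} - \tfrac{1}{3}\right)}_{\text{u- and d-type quarks, 3 colours}}
\;+\; \underbrace{0}_{\nu} \;+\; \underbrace{(-1)}_{e} \;=\; 0 .
```

The quark and lepton contributions cancel only generation by generation, which is one reason each quark species needs a leptonic counterpart.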

The fourth component

Within only a few years, a complete picture of the fundamental interactions became visible, where experiment and theory showed a remarkable agreement. It was a fully renormalisable model where all quarks and all leptons were represented as “families” that were only complete if each quark species had a leptonic counterpart. There was an “electroweak force”, where electromagnetism and the weak force interfere to generate the force patterns observed in experiments, and the strong force was tamed at almost the same time. Thus the electroweak theory and quantum chromodynamics were joined into what is now known as the Standard Model.

Be patient, we are almost there, we have three of the four components of this particle’s field

This theory agreed beautifully with observations, but it did not predict the mass of the neutral, spin-0 Higgs particle. Much later, when the W and the Z bosons were well-established, the Higgs was still not detected. I tried to reassure my colleagues: be patient, we are almost there, we have three of the four components of this particle’s field. The fourth will come soon.

As the theoretical calculations and the experimental measurements became more accurate during the 1990s and 2000s, it became possible to derive the most likely mass value from indirect Higgs-particle effects that had been observed, such as those concerning the top-quark mass. On 4 July 2012 a new boson was directly detected close to where the Standard Model said the Higgs would be. After these first experimental successes, it was of utmost importance to check whether this was really the object we had been expecting. This has kept experimentalists busy for the past 10 years, and will continue to do so for the foreseeable future. 

The discovery of the Higgs particle is a triumph for high technology and basic science, as well as accurate theoretical analyses. Efforts spanning more than half a century paid off in the summer of 2012, and a new era of understanding the particles, their masses and interactions began.

The origin of particle masses

For thousands of years, humans have asked “what are the building blocks of nature?” To those not familiar with the wonders of relativistic quantum mechanics, the question might seem equivalent to asking “what are the smallest particles known?” However, we know that the size of atoms is quantised, and has negligible dependence on the size of nuclei. In fact, atomic size is essentially inversely proportional to the mass of the electron. It is therefore the electron mass, in addition to the rules of quantum mechanics, that essentially controls the inner structure of all the elements. Furthermore, the masses and sizes of nuclei, protons and neutrons cannot simply be obtained by “adding up” smaller degrees of freedom; they are instead dictated by the coupling constant of the strong force, which below a certain energy scale, ΛQCD, becomes so large that the force between two particles becomes approximately independent of their distance, inducing confinement.
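That inverse proportionality can be read off from the Bohr radius, which sets the scale of atomic structure (the numerical values below are the standard ones, not taken from the text):

```latex
a_0 \;=\; \frac{\hbar}{m_e c\,\alpha}
\;\approx\; \frac{197\ \text{MeV\,fm}}{0.511\ \text{MeV} \times \tfrac{1}{137}}
\;\approx\; 5.3\times 10^{-11}\ \text{m} ,
```

so halving the electron mass would double the size of every atom, while the nuclear mass and radius enter only through tiny reduced-mass and finite-size corrections.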

The above description suggests that “all” that is required to understand the basic structure of matter is to understand the origin of the electron mass and to study quantum chromodynamics. But this misses the bigger picture revealed by the Standard Model (SM). Protons, neutrons and other light, long-lived baryons interact via the pion field, which is constructed from the ultra-light u and d quarks, and perhaps also s quarks. This reveals the profound importance of the values of the fermion masses: increasing the u and d mass difference by less than 10 MeV (that is, about 1% of the proton mass), for instance, would make hydrogen and its isotopes unstable, thereby preventing the formation of almost all the elements in the early universe. Indeed, there are only certain regions in the vast quark-mass and ΛQCD parameter space that enable the universe as we know it to form.

Artistic representation of the Higgs boson

Having established that the structure of the masses of the elementary particles is an existential issue, what does this have to do with the discovery of the Higgs boson? While the Higgs boson carries a cosmological background value called the vacuum expectation value (VEV), which is associated with the spontaneous breaking of the electroweak symmetry, the VEV is not necessarily the source of the actual value and/or the pattern of fermion masses. The reason is that, in addition to baryonic charge (or number), all the elementary charged particles carry “chiral charge” – they are either left- or right-handed – which is conserved in the absence of the Brout–Englert–Higgs (BEH) field. What is fascinating about the BEH mechanism is that with the appropriate choice of coupling, the product of the field and its coupling-strength to the fermions effectively becomes a source of chiral charge, allowing the fermions to interact with it; the VEV is merely the constant of proportionality that induces the masses of the fermions (and of the weak-force mediators). This is a very minimal setup! In other known symmetry-breaking frameworks – for instance models based on technicolour/QCD-like dynamics or on superconductivity, where the electromagnetic symmetry inside a superconductor is broken by a condensate of electrons denoted Cooper pairs – there is no direct link to the generation of fermion masses. 

Standard Model couplings

The BEH mechanism might be minimal, but it still involves many parameters. The origin of fundamental masses requires switching on nine trilinear couplings, organised across the three generations of fundamental particles: three involving the u-type left- and right-handed quarks (u, c, t), three involving the d-type left- and right-handed quarks (d, s, b) and three involving the left- and right-handed charged leptons (e, µ, τ). Each is a “Yukawa” coupling of the Higgs boson to a pair of fermions, which implies that all the charged fermions acquire a mass proportional to the VEV of the BEH field. In other words, there is a linear relation between the Yukawa coupling and the fermion mass. Strikingly, the observed fermion masses encoded in the Yukawa couplings span some five orders of magnitude, with all but some members of the third generation being extremely small – leading to the fermion mass-hierarchy puzzle.
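As an illustration of that linear relation, the Yukawa couplings can be recovered from the measured masses. The sketch below uses the tree-level SM relation m_f = y_f v/√2 with rounded, approximate mass values (the numbers are assumptions of this sketch, for orientation only):

```python
import math

# Tree-level Standard Model relation: m_f = y_f * v / sqrt(2),
# so each Yukawa coupling is y_f = sqrt(2) * m_f / v.
v = 246.0  # BEH-field vacuum expectation value, in GeV

# Approximate charged-fermion masses in GeV (rounded)
masses = {
    "electron": 0.000511, "muon": 0.1057, "tau": 1.777,
    "charm": 1.27, "bottom": 4.18, "top": 172.7,
}

yukawas = {f: math.sqrt(2) * m / v for f, m in masses.items()}

# The top coupling comes out close to 1, the electron's near 3e-6:
# the couplings span roughly five orders of magnitude.
```

Printing the dictionary makes the hierarchy of the figure's diagonal line immediately visible.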

Relationship between the fundamental masses and their Yukawa couplings to the BEH field

The coupling between the Higgs boson and the fermions can be pictured as a new force – one that is radically different to the SM gauge forces. Given that this force only acts between two particles that are closer than around 10⁻¹⁸ m – i.e. 1000 times smaller than the proton radius – it is not relevant to any everyday experimental setup. The Higgs–Yukawa couplings do, however, conceal two interesting aspects related to our existence. The first is that increasing the VEV by a factor of a few would increase the neutron–proton mass splitting to the point where all nuclei are unstable. The second, pointed out by Giuseppe Degrassi and co-workers in 2013, is that the top-quark Yukawa interaction is close to its maximal size: increasing it by as little as 10% would push the VEV to fantastically large values, rendering our current universe unstable (see The Higgs and the fate of the universe). 
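The quoted range is simply the reduced Compton wavelength of the Higgs boson. A quick estimate (the constants ħc ≈ 197 MeV fm and the proton radius are inputs of this sketch, not values given in the text):

```python
# Range of a force mediated by a particle of mass m: lambda = hbar*c / (m*c^2)
hbar_c = 197.327    # MeV * fm
m_higgs = 125.0e3   # Higgs mass, ~125 GeV, expressed in MeV

range_fm = hbar_c / m_higgs   # reduced Compton wavelength, in fm
range_m = range_fm * 1e-15    # in metres: about 1.6e-18 m

proton_radius_m = 0.84e-15    # approximate proton charge radius
ratio = proton_radius_m / range_m  # proton is several hundred times larger
```

The ratio comes out at a few hundred, consistent at the order-of-magnitude level with the "1000 times smaller" quoted above.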

Massive alternatives

The minimal BEH mechanism is not the only way to understand the fermion mass hierarchy. This is illustrated by two radically different options. In the first, proposed in 2008 by Gian Giudice and Oleg Lebedev, the Yukawa couplings are assumed to depend on the BEH field, therefore avoiding hierarchies in the Yukawa couplings. The idea postulates a variation of chiral symmetry (in which the lighter the fermion the more chiral charge it carries) that forbids lighter particles from coupling to the Higgs linearly, but instead generates their masses through appropriate powers of the VEV (see “In line” figure, blue curve). The other extreme possibility, discussed more recently by the present author and colleagues, is where the masses of the light fermions instead come from their interaction with a subdominant additional source of electroweak symmetry breaking, similar to the technicolour framework. This new source replaces the Higgs boson’s role as the carrier of the light-generation chiral-charge, causing the light fermion-Higgs couplings to vanish (see figure, red curve). Both cases lead to an alternative understanding of the mass hierarchy puzzle and to the establishment of new physics.

The conclusion is that measuring the fermion-Higgs couplings at higher levels of precision will significantly improve our understanding of the origin of masses in nature. It took a few years after the Higgs-boson discovery, around 2018, for ATLAS and CMS to establish that the standard BEH mechanism is behind the third-generation fermion masses. This is a legacy result from the LHC experiments that is sometimes overlooked by our community (CERN Courier September/October 2020 p41). While significant, however, it told us little about the origin of the matter in the universe, which is almost exclusively made out of first-generation fermions with extremely small couplings to the Higgs boson. So far, we only have indirect information, via Higgs-boson couplings to the gauge bosons, about the origin of mass of the first and second generations. But breakthroughs are imminent. In the past two years, ATLAS and CMS have found signs that the Higgs boson contributes to both the second-generation muon and charm masses, which would exclude models leading to both the blue and red curves in the figure. Measuring the smallest electron Yukawa coupling is only possible at a future collider, whereas for the u and d quarks there is no clear experimental pathway.

Experimental novelties

A recent, unexpected way to tackle the mystery of fermion masses involves dark matter, specifically a class of models in which the dark-matter particle is ultra-light and its field value oscillates with time. Such particles would couple to fermions in a way that echoes the Higgs–Yukawa coupling, though with an extremely low interaction strength, and lead to a variation of the masses of the fundamental fermions with time. This feeble effect cannot be searched for at colliders, but it can be probed with quantum sensors such as atomic clocks, or future nuclear clocks, that reach sensitivities of one part in 10¹⁹ or better. These tabletop experiments are most sensitive to a variation of the electron mass.

It is now a priority to directly test the mass-generating mechanism of the first two generations

The discovery of the Higgs boson has opened a new window on the origin of masses, and consequently on the structure of the basic building blocks of nature, with profound links to our existence. ATLAS and CMS have made several breakthroughs, including the observation that the third-generation masses originate from the SM’s minimal BEH mechanism, and evidence concerning part of the second generation. It is now a priority to directly test the mass-generating mechanism of the first two generations, and to determine all the Higgs couplings at higher precision, in search of possible chinks in the SM armour. 

The Higgs and the fate of the universe

Transition after electroweak symmetry breaking

A vacuum is ordinarily pictured as an empty region containing no particles, atoms or molecules of matter, as in outer space. To a particle physicist, however, it is better defined as the lowest energy state that can be attained when no physical particles are present. Even in empty space there are fields that are invisible to the naked eye but nevertheless influence the behaviour of matter, while quantum mechanics ensures that, even if particles are not physically present, they continually fluctuate spontaneously in and out of existence. 

In the Standard Model (SM), in addition to the familiar gravitational and electromagnetic fields, there is the Brout–Englert–Higgs (BEH) field that is responsible for particle masses. It is usually supposed to have a constant value throughout the universe, namely the value that it takes at the bottom of its “Mexican hat” potential (see “New depths” figure). However, as was first pointed out by several groups in 1979, and revisited by many theorists subsequently, the Mexican hat is subject to quantum effects that change its shape. For example, the BEH field has self-interactions that tend to curl the brim of the hat upwards, but there are additional quantum effects, due to the interactions with the fundamental particles to which the BEH field gives mass, that tend to curl the brim downwards. The most important of these is the heaviest matter particle: the top quark.

Push and pull

The upward push of the Higgs boson’s self-interaction and the downward pressure of the top quark are very sensitive to their masses, and also to the strong interactions, which modify the effect of the top quark. Experiments at the LHC have already determined the mass of the Higgs boson with a precision approaching 0.1%, and CMS recently measured the mass of the top quark with an accuracy of almost 0.2%, while the strong coupling strength is known to better than 1%. The latest calculations of the quantum effects of the Higgs boson and the top quark indicate that the brim of the Mexican hat turns down when the BEH field exceeds its value today by 10 orders of magnitude, implying that the current value is not the lowest energy and hence not the true vacuum of the SM. A consequence is that the current BEH value is not stable, because quantum fluctuations would inevitably cause it to decay into a lower-energy state. The universe as we know it would be doomed (see “On the cusp” figure).
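The competition can be summarised in the renormalisation-group running of the Higgs self-coupling λ. Keeping only the two dominant one-loop terms (a schematic form; gauge and mixed contributions are omitted):

```latex
\frac{d\lambda}{d\ln\mu} \;\simeq\; \frac{1}{16\pi^2}
\left( \underbrace{24\lambda^2}_{\text{self-interaction: pushes the brim up}}
\;-\; \underbrace{6\,y_t^4}_{\text{top-quark loop: pulls it down}} \right) .
```

With the measured masses the top term wins, and λ(μ) runs negative some ten orders of magnitude above the electroweak scale – the turn-down of the brim described above.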

However, there is no immediate need to panic. First, the universe is metastable with an estimated lifetime before it decays that is many, many orders of magnitude longer than its age so far. Second, one could perhaps cling to the increasingly forlorn hope that the prediction of a lower-energy state of the SM vacuum is somehow mistaken. Perhaps an experimental measurement going into the calculation has an unaccounted-for uncertainty, or perhaps some ingredient is missing from the theoretical calculation of the shape of the Mexican hat? 

Absolute stability, metastability and instability of the SM vacuum

If you simply take the calculation at face value and humbly accept the eventual demise of the universe as we know it, however, a further problem arises. Since quantum and thermal fluctuations in the BEH field were probably much larger when the universe was young and much hotter than today, the overwhelming majority of the universe would have been driven into the lower-energy state. Only an infinitesimal fraction would be in the metastable state we find ourselves in today, where the value of the BEH field is relatively small. Of course, one could argue anthropically that this good luck was inevitable, as we could not live in any other “vacuum” state. 

To me, this argument reeks of special pleading. Instead, my instinct is to argue that some physics beyond the SM must appear below the turn-down scale and stabilise the vacuum that we live in. This argument is not specific about the type of new physics or the scale at which it appears. One extension of the SM that fits the bill is supersymmetry, but the stability argument offers no guarantee that this or any other extension of the SM is within reach of current experiments.

It used to be said that the nightmare scenario for the LHC would be to discover the Higgs boson and nothing else. However, the measured masses of the Higgs boson and the top quark may be hinting that there must be physics beyond the SM that stabilises the vacuum. Let us take heart from this argument, and keep looking for new physics, even if there is no guarantee of immediate success.

The Higgs boson under the microscope

On 4 July 2012, the ATLAS and CMS collaborations jointly announced their independent discoveries of a new particle directly related to the Brout–Englert–Higgs field that gives mass to all other particles in the Standard Model (SM). The LHC and its two general-purpose experiments were designed and built, among other things, with the aim of detecting or ruling out the SM Higgs boson. Within three years of the LHC startup, the two experiments detected a signal consistent with a Higgs boson with a mass of about 125 GeV, which was perfectly consistent with indications from precision measurements carried out at the electron–positron colliders LEP and SLC, and at the Tevatron proton–antiproton collider.

Higgs encounters

The discovery was made mainly by detecting decays of the new particle into two photons or two Z bosons (each of which decays into a pair of electrons or muons), for which the invariant mass can be reconstructed with high resolution. The search for the Higgs boson was also performed in other channels, and all results were found to be consistent with the SM expectations. A peculiar feature of the Higgs boson is that it has zero spin. At the time of the discovery, it was already excluded that the particle was a standard vector boson: a spin-1 particle cannot decay into two photons, leaving only spin-0 or spin-2 as the allowed possibilities. 

Ten years ago, the vast majority of high-energy physicists were convinced that a Higgs boson had been detected. The only remaining question was whether it was the boson predicted by the SM or part of an extended Higgs sector.

Basic identity  

The mass of the Higgs boson is the only parameter of the Higgs sector that is not predicted by the SM. A high-precision measurement of the mass is therefore crucial because, once it is known, all the couplings and production cross sections can be predicted in the SM and then compared with experimental measurements. The mass measurement is carried out using the H → γγ and H → ZZ → 4ℓ channels, with a combined ATLAS and CMS measurement based on Run 1 data obtaining a value of 125.09 ± 0.24 GeV. Results with a precision at the level of one part per thousand have since been obtained by ATLAS and CMS using partial datasets from Run 2.
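The idea behind such a combination can be illustrated with inverse-variance weighting, the textbook way of averaging independent measurements. The channel values below are hypothetical placeholders, not the actual ATLAS/CMS inputs, and a real combination must also treat correlated systematic uncertainties, which this sketch ignores.

```python
# Inverse-variance weighted average of independent measurements.
# Input values are illustrative, not the real ATLAS/CMS channel results,
# and correlations between systematic uncertainties are neglected.
def combine(measurements):
    """measurements: list of (value, uncertainty) pairs, assumed uncorrelated."""
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = sum(weights) ** -0.5
    return value, sigma

# Hypothetical diphoton and four-lepton mass results, in GeV:
m, dm = combine([(125.4, 0.4), (124.8, 0.5)])
print(f"combined: {m:.2f} +/- {dm:.2f} GeV")
```

The combined uncertainty is always smaller than the best single input, which is why adding even a less precise channel sharpens the result.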

The width of the Higgs boson, unlike its mass, is well predicted at approximately 4 MeV. Since this is much smaller than the ATLAS and CMS detector resolutions, a precise direct measurement can only be carried out at future electron–positron colliders. At the LHC it is possible to indirectly constrain the width by studying the production of di-boson pairs (ZZ or WW) via the exchange of off-shell Higgs bosons: under some reasonable assumptions, the off-shell cross section at high mass relative to the on-shell cross section increases proportionally to the width. A recent result from CMS constrains the Higgs-boson width to be between 0.0061 and 2.0 times the SM prediction at 95% confidence level. Finding the width to be smaller than the SM prediction would mean that some of the couplings are smaller than predicted, while a larger measured width could reflect additional decay channels beyond the SM, or larger branching fractions for the decays predicted by the SM.
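Under the stated assumptions, the off-shell method reduces to a linear relation between the off-shell signal strength and the total width, so the quoted interval translates directly into MeV. The SM width of 4.1 MeV used below is a commonly quoted value for a 125 GeV Higgs boson, taken here for illustration.

```python
# Under the off-shell method's assumptions the ratio of off-shell to
# on-shell signal strength scales linearly with the total width, so
# Gamma_H = mu_offshell * Gamma_SM. Translating the quoted 95% CL interval
# on Gamma_H / Gamma_SM into MeV, with an illustrative SM width:
GAMMA_SM = 4.1          # MeV, SM prediction for m_H ~ 125 GeV
lo, hi = 0.0061, 2.0    # quoted interval on Gamma_H / Gamma_SM
print(f"{lo * GAMMA_SM:.3f} MeV < Gamma_H < {hi * GAMMA_SM:.1f} MeV (95% CL)")
```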

This is the first strong suggestion that the Higgs boson also couples to fermions from generations other than the third

The spin and charge-parity (CP) properties of the Higgs boson are other key quantities. The SM predicts that the Higgs boson is a scalar (spin-0 and positive CP) particle, but in extended Higgs models it could be a superposition of positive and negative CP states, for example. The spin and CP properties can be probed using angular distributions of the Higgs-boson decay products, and several decay channels were exploited by ATLAS and CMS: H → γγ, ZZ, WW and ττ. All results to date indicate consistency with the SM and exclude most other models at more than 3σ confidence level, including all models with spin different from zero. 

Couplings to others 

One of the main tools for characterising the Higgs boson is the measurement of its production processes and decays. Thanks to growing datasets, improved analysis techniques, more accurate theoretical tools and better modelling of background processes, ATLAS and CMS have made remarkable progress in this crucial programme over the past decade. 

Using Run 1 data recorded between 2010 and 2012, the gluon-fusion and vector-boson fusion production processes were established, as were the decays to pairs of bosons (γγ, WW* and ZZ*) and to a τ-lepton pair from the combination of ATLAS and CMS data. With Run 2 data (2015–2018), both ATLAS and CMS observed the decay to a pair of b quarks. Although it is the preferred decay mode of the Higgs boson, this channel suffers from larger backgrounds and is mainly accessible in the associated production of the Higgs boson with a vector boson. The rarer production mode of the Higgs boson in association with a t-quark pair was also observed using a combination of different decay modes, providing direct proof of the Yukawa coupling between the Higgs boson and the top quark. The existence of the Yukawa couplings between the Higgs boson and third-generation fermions (t, b, τ) is thus established.

Mass spectra

The collaborations also investigated the coupling of the Higgs boson to the second-generation fermions, in particular the muon. With the full Run 2 dataset, CMS reported evidence at the level of 3σ over the background-only hypothesis that the Higgs boson decays into μ⁺μ⁻, while ATLAS supported this finding with a 2σ excess. This is the first strong suggestion that the Higgs boson also couples to fermions from generations other than the third, again in accordance with the SM. Research is also ongoing to constrain the Higgs boson’s coupling to charm quarks via the decay H → cc. This is a much more difficult channel but, thanks to improved detectors and analysis methods, including extensive use of machine learning, ATLAS and CMS recently achieved a sensitivity beyond expectations, excluding an H → cc branching fraction larger than O(10) times the SM prediction. The possibility that the Higgs boson’s coupling to charm is at least as large as its coupling to bottom quarks is excluded by a recent ATLAS analysis at 95% confidence level.

The accuracy of the production cross-section times decay branching-fraction measurements in the bosonic decay channels (diphoton, ZZ and WW) with the full Run 2 dataset is around 10%, allowing measurements in more restricted kinematic regions that can be sensitive to physics beyond the SM. In all probed phase-space regions, the measured cross sections are compatible with the SM expectations (data used for some of the measurements are shown in the “Mass spectra” figure). 

Ten years after the discovery of a new elementary boson, considerable progress has been made toward understanding this particle

The combination of all measurements in the different production and decay processes can be used to further constrain the measured couplings between the Higgs boson and the other particles. The production cross section for vector-boson-fusion production, for example, is directly proportional to the square of the coupling strengths between the Higgs boson and W or Z bosons. A modification of these couplings will also affect the rate at which the Higgs boson decays to various final states. Assuming no contribution beyond the SM to Higgs decays and that only SM particles contribute to Higgs-boson vertices involving loops, couplings to t, b and τ are currently determined with uncertainties of around 10%, and couplings to W and Z bosons with uncertainties of about 5%. 

The relation between the mass of a particle and its coupling to the Higgs boson is as expected from the SM, in which the particle masses originate from their coupling to the Brout–Englert–Higgs field (see “Couplings” figure). These measurements thus set bounds on specific new-physics models that predict deviations of the Higgs-boson couplings from the SM. The impact of new physics at a high energy scale is also probed in effective-field-theory frameworks, introducing all possible operators that describe couplings of the Higgs boson to SM particles. No deviations from predictions are observed. 
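The expected relation is simple enough to write down: in the SM, fermion Yukawa couplings are y_f = √2 m_f / v and the Higgs coupling to a gauge boson scales as 2m_V²/v, with v ≈ 246 GeV the vacuum expectation value of the BEH field. The sketch below evaluates these tree-level relations with approximate world-average masses; it reproduces the linear trend of the “Couplings” figure, not the measured values themselves.

```python
# Tree-level SM couplings of the Higgs boson, from particle masses.
# Fermions: y_f = sqrt(2) * m_f / v; gauge bosons: g_V = 2 * m_V^2 / v.
# Masses are approximate world averages in GeV; v is the BEH-field VEV.
import math

V = 246.0  # GeV
fermions = {"top": 172.8, "bottom": 4.18, "tau": 1.777, "muon": 0.1057}
bosons = {"W": 80.4, "Z": 91.2}

yukawas = {name: math.sqrt(2) * m / V for name, m in fermions.items()}
gauge = {name: 2 * m**2 / V for name, m in bosons.items()}

for name, y in yukawas.items():
    print(f"y_{name} = {y:.4f}")        # dimensionless
for name, g in gauge.items():
    print(f"g_{name} = {g:.1f} GeV")
```

The top Yukawa coming out very close to 1 is one reason the top quark dominates quantum corrections to the Higgs sector.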

New physics 

The Higgs boson is the only known elementary particle with spin 0. However, an extended Higgs sector is among the simplest extensions of the SM and is predicted by many theories, such as those based on supersymmetry. These extensions predict several neutral or charged spin-0 particles: one is the observed 125 GeV Higgs boson; the others would preferentially couple to heavier SM particles. Searches for heavier scalar (or pseudo-scalar) particles have been carried out in a variety of final states, but no evidence for such particles has been found. For example, the search for heavy scalar or pseudo-scalar particles decaying to a pair of τ leptons excludes masses up to 1–1.5 TeV. The extended Higgs sector can also include lighter scalar or pseudo-scalar particles into which the observed Higgs boson could decay. A wide range of final states has been investigated but no evidence found, setting stringent constraints on the corresponding Higgs-boson decay branching fractions.

Couplings

The Higgs sector could also play a role linking the SM to new physics that explains the presence of dark matter in the form of new neutral, weakly interacting particles. If their mass is less than half that of the Higgs boson, the Higgs boson could decay to a pair of these neutral particles. Since the particles would be invisible in the detector, this process can be detected by observing missing transverse momentum as the Higgs boson recoils against visible particles. The most sensitive processes are those in which the Higgs boson is produced in association with other particles: vector-boson fusion, and associated production with a vector boson or a top-quark pair. No evidence of such decays has been found, setting upper limits on the invisible decay branching fraction of the Higgs boson at the level of 10%, and providing constraints complementary to those from direct dark-matter detection experiments.

Self-interaction

In addition to its couplings to other bosons and to fermions, the structure of the Brout–Englert–Higgs potential predicts a self-coupling of the Higgs boson that is related to electroweak symmetry breaking (see Electroweak baryogenesis). By studying Higgs-boson pair production at the LHC, it is possible to directly probe this self-coupling. 

The two main challenges of this measurement are the tiny cross section for Higgs-boson pair production (about 1000 times smaller than the production of a single Higgs boson) and the interference between processes that involve the self-coupling and those that do not. Final states with a favourable combination of the expected signal yield and signal-to-background ratio are exploited. The most sensitive channels are those with one Higgs boson decaying to a b-quark pair and the other decaying either to a pair of photons, τ leptons or b quarks. Upper limits of approximately three times the predicted cross section have been obtained with the Run 2 dataset. These searches can also be used to set constraints on the Higgs-boson self-coupling relative to its SM value. 
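Because of the interference, the pair-production cross section is a quadratic function of the self-coupling modifier κ_λ (the self-coupling in units of its SM value), so a cross-section limit excludes κ_λ outside an interval. The quadratic coefficients below are invented for illustration only (chosen so that κ_λ = 1 gives the SM rate and the destructive interference is roughly reproduced); they are not the official ATLAS or CMS parameterisation.

```python
# Illustrative mapping of a cross-section limit onto the self-coupling
# modifier kappa_lambda (kl). The coefficients a, b, c are invented for
# this sketch, normalised so that kl = 1 gives the SM rate; they are NOT
# the official experimental parameterisation.
import math

a, b, c = 0.289, -1.389, 2.1  # sigma_HH(kl) / sigma_HH(SM) ~ a*kl^2 + b*kl + c

def allowed_interval(limit):
    """Boundaries of the kl region where the predicted rate stays below
    the limit, from the roots of a*kl^2 + b*kl + (c - limit) = 0."""
    disc = b**2 - 4 * a * (c - limit)
    lo = (-b - math.sqrt(disc)) / (2 * a)
    hi = (-b + math.sqrt(disc)) / (2 * a)
    return lo, hi

lo, hi = allowed_interval(3.0)  # upper limit of ~3x the SM cross section
print(f"allowed: {lo:.1f} < kappa_lambda < {hi:.1f}")
```

The destructive interference between the self-coupling (triangle) and box amplitudes is what makes the rate smallest near κ_λ of a few, and hence the allowed interval asymmetric around 1.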

The sensitivities achieved for Higgs-boson pair production searches with the Run 2 dataset are significantly better than expected before the start of Run 2, thanks to several improvements in object reconstruction and analysis techniques. These searches are mostly limited by the size of the dataset and thus will improve further with the Run 3 and much larger High-Luminosity LHC (HL-LHC) datasets.

Going further

Ten years after the discovery of a new elementary boson, considerable progress has been made toward understanding this particle. All measurements so far point to properties that are very consistent with the SM Higgs boson. All main production and decay modes have been observed by ATLAS and CMS, and the couplings to vector bosons and third-generation fermions are probed with 5 to 10% accuracy, confirming the pattern expected from the Brout–Englert–Higgs mechanism for electroweak symmetry breaking and the generation of the masses of elementary particles. Still, there is ample room for improvement in the forthcoming Run 3 and HL-LHC phases, to reduce the uncertainty in the coupling measurements down to a few per cent, to establish couplings to second-generation fermions (muons) and to investigate the Higgs-boson self-coupling. Improved measurements will also significantly expand the sensitivity to a possible extended Higgs sector or new dark sector. 

To reach the ultimate accuracy in the measurements of all Higgs-boson properties (including its self-coupling), to remove the assumptions in the determination of the Higgs couplings at the LHC, and to considerably extend the search for new physics in the Higgs sector, new colliders – such as an e⁺e⁻ collider and a future hadron collider – will be required.

Naturalness after the Higgs

Artwork from Peter Higgs’ Nobel diploma

When Victor Weisskopf sat down in the early 1930s to compute the energy of a solitary electron, he had no way of knowing that he’d ultimately discover what is now known as the electroweak hierarchy problem. Revisiting a familiar puzzle from classical electrodynamics – that the energy stored in an electron’s own electric field diverges as the radius of the electron is taken to zero (equivalently, as the energy cutoff of the theory is taken to infinity) – in Dirac’s recently proposed theory of relativistic quantum mechanics, he made a remarkable discovery: the contribution from a new particle in Dirac’s theory, the positron, cancelled the divergence from the electron itself and left a quantum correction to the self-energy that was only logarithmically sensitive to the cutoff. 

The same cancellation occurred in any theory of charged fermions. But when Weisskopf considered the case for charged scalar particles in 1939, the problem returned. To avoid the need for finely-tuned cancellations between this quantum correction and other contributions to a scalar’s self-energy, he posited that the cutoff energy for scalars should be close to their observed self-energy, heralding the appearance of new features that would change the calculation and render the outcome “natural”. 

Nearly 30 years would pass before Weisskopf’s prediction about scalars was put to the test. The charged pion, a pseudoscalar, suffered the very same divergent self-energy that he had computed. As the neutral pion is free from this divergence, Weisskopf’s logic suggested that the theory of charged and neutral pions should change at around 800 MeV, the cutoff scale implied by the observed difference in their self-energies. Lo and behold, the rho meson appeared at 775 MeV. Repeating the self-energy calculation with the rho meson included, the divergence in the charged pion’s self-energy disappeared. 
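The estimate behind that 800 MeV figure is short enough to reproduce. The electromagnetic self-energy shifts the charged pion’s squared mass by roughly (3α/4π)Λ², where Λ is the cutoff; equating this to the observed π±–π⁰ splitting gives the scale at which something new had to appear.

```python
# Weisskopf-style cutoff estimate from the pion mass splitting:
# the electromagnetic self-energy shifts the charged pion's squared mass
# by roughly (3*alpha / 4*pi) * Lambda^2. Masses in MeV.
import math

alpha = 1 / 137.036
m_pi_charged, m_pi_neutral = 139.57, 134.98
dm2 = m_pi_charged**2 - m_pi_neutral**2          # observed splitting in MeV^2
cutoff = math.sqrt(dm2 / (3 * alpha / (4 * math.pi)))
print(f"estimated cutoff ~ {cutoff:.0f} MeV")    # the rho meson sits at 775 MeV
```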

This same logic would predict something new. It had been known for some time that the self-energy difference between the neutral kaons KL and KS diverged due to contributions from the weak interactions in a theory containing only the known up, down and strange quarks. Matching the observed difference suggested that the theory should change at around 3 GeV. Repeating the calculation in 1974 with the addition of the recently proposed charm quark, Mary K Gaillard and Ben Lee discovered that the self-energy difference became finite, which allowed them to predict that the charm quark should lie below 1.5 GeV. The discovery of the charm quark at 1.2 GeV later that year promoted Weisskopf’s reasoning from an encouraging consistency check to a means of predicting new physics.
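An order-of-magnitude version of the Gaillard–Lee estimate can be run in a few lines. In the GIM mechanism the KL–KS mass difference scales as (G_F²/6π²) f_K² m_K m_c² sin²θ_c cos²θ_c, so the measured mass difference can be inverted for the charm mass. The hadronic "bag" factor is set to 1 here, so this is a rough estimate only.

```python
# Rough Gaillard-Lee estimate of the charm mass from the K_L - K_S mass
# difference, dm_K ~ (G_F^2 / 6 pi^2) * f_K^2 * m_K * m_c^2 * (sc*cc)^2,
# with the hadronic bag factor set to 1. All dimensionful inputs in GeV.
import math

G_F = 1.166e-5                              # Fermi constant, GeV^-2
f_K, m_K = 0.16, 0.497                      # kaon decay constant and mass
dm_K = 3.48e-15                             # measured K_L - K_S mass difference
sc = 0.2245                                 # sine of the Cabibbo angle
cc = math.sqrt(1 - sc**2)

prefactor = (G_F**2 / (6 * math.pi**2)) * f_K**2 * m_K * (sc * cc)**2
m_charm = math.sqrt(dm_K / prefactor)
print(f"estimated charm mass ~ {m_charm:.1f} GeV")
```

That the inversion lands near the right answer, despite the O(1) hadronic uncertainties, is exactly why the 1974 discovery gave naturalness reasoning its reputation.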

Higgs, we have a problem

Around the same time, Ken Wilson recognised that the coupling between the Higgs boson and other particles of the Standard Model (SM) leads to yet another divergent self-energy, for which the logic of naturalness implied new physics at around the TeV scale. Thus the electroweak hierarchy problem was born – not as a new puzzle unique to the Higgs, but rather the latest application of Weisskopf’s wildly successful logic (albeit one for which the answer is not yet known). 

History suggested two possibilities. As a scalar, the Higgs could only benefit from the sort of cancellation observed among fermions if there is a symmetry relating bosons and fermions, namely supersymmetry. Alternatively, it could be a light product of compositeness, just as the pions and kaons are light bound states of the strong interactions. These solutions to the hierarchy problem came to dominate expectations for physics beyond the SM, with a sharp target – the TeV scale – motivating successive generations of collider experiments. Indeed, when the physics case for the LHC was first developed in the mid-1980s, it was thought that new particles associated with supersymmetry or compositeness would be much easier to discover than the Higgs itself. But while the Higgs was discovered, no signs of supersymmetry or compositeness were to be found.

In the meantime, other naturalness problems were brewing. The vacuum energy – Einstein’s infamous cosmological constant – suffers a divergence of its own, and even the finite contributions from the SM are many orders of magnitude larger than the observed value. Although natural expectations for the cosmological constant fail, an entirely different line of reasoning seems to succeed in its place. To observe a small cosmological constant requires observers, and observers can presumably arise only if gravitationally bound structures are able to form. As Steven Weinberg and others observed in the 1980s, such anthropic reasoning leads to a prediction that is remarkably close to the value ultimately measured in 1998. To have predictive power, this requires a multitude of possible universes across which the cosmological constant varies; only the ones with sufficiently small values of the cosmological constant produce observers to bear witness.

The electroweak hierarchy problem

An analogous argument might apply to the electroweak hierarchy problem: the nuclear binding energy is no longer sufficient to stabilise the neutron within typical nuclei if the Higgs vacuum expectation value (VEV) is increased well above its observed value. If the Higgs VEV varies across a landscape of possible universes while its couplings to fermions are kept fixed, only universes with sufficiently small values of the Higgs VEV would lead to complex atoms and, presumably, observers. Although anthropic reasoning for the hierarchy problem requires stronger assumptions than for the cosmological-constant problem, its compatibility with null results at the LHC is enough to raise questions about the robustness of natural reasoning. 

Amidst all of this, another proposed scalar particle entered the picture. The observed homogeneity and isotropy of the universe point to a period of exponential expansion of spacetime in the early universe driven by the inflaton. While the inflaton may avoid naturalness problems of its own, the expansion of spacetime and the quantum fluctuations of fields during inflation lead to qualitatively new effects that are driving new approaches to the hierarchy problem at the intersection of particle physics, cosmology and gravitation.

Perhaps the most prominent of these new approaches came, surprisingly enough, from a failed solution to the cosmological constant problem. Around the same time as the first anthropic arguments for the cosmological constant were taking form, Laurence Abbott proposed to “relax” the cosmological constant from a naturally large value by the evolution of a scalar field in the early universe. Abbott envisioned the scalar evolving along a sloping, bumpy potential, much like a marble rolling down a wavy marble run. As it did so, this scalar would decrease the total value of the cosmological constant until it reached the last bump before the cosmological constant turned negative. Although the universe would crunch away into nothingness if the scalar evolved to negative values of the cosmological constant, it could remain poised at the last bump for far longer than the age of the observed universe. 

Despite the many differences among the new approaches, they share a common tendency to leave imprints on the Higgs boson

While this fails for the cosmological constant (the resulting metastable universe is largely devoid of matter), analogous logic succeeds for the hierarchy problem. As Peter Graham, David Kaplan and Surjeet Rajendran pointed out in 2015, a scalar evolving down a potential in the early universe can also be used to relax the Higgs mass from naturally large values. Of course, it needs to stop close to the observed mass. But something interesting happens when the Higgs mass-squared passes from positive values to negative values: the Higgs acquires a VEV, which gives mass to quarks, which induces bumps in the potential of a particular type of scalar known as an axion (proposed to explain the unreasonably good conservation of CP symmetry by the strong interactions). So if the relaxing scalar is like an axion – a relaxion, you might say – then it will encounter bumps in its potential when it relaxes the Higgs mass to small values. If the relaxion is rolling during an inflationary period, the expansion of spacetime can provide the “friction” necessary for the relaxion to stop when it hits these bumps and set the observed value of the weak scale. The effective coupling between the relaxion and the Higgs that induces bumps in the relaxion potential is large enough to generate a variety of experimental signals associated with a new, light scalar particle that mixes with the Higgs.

The success of the relaxion hypothesis in solving the hierarchy problem hinges on an array of other questions involving gravity. Whether the relaxion potential can remain sufficiently smooth over the vast trans-Planckian distances in field space required to set the value of the weak scale is an open question, one that is intimately connected to the fate of global symmetries in a theory of quantum gravity (itself the target of active study in what is known as the Swampland programme).

Models abound 

In the meantime, the recognition that cosmology might play a role in solving the hierarchy problem has given rise to a plethora of new ideas. For instance, in Raffaele D’Agnolo and Daniele Teresi’s recent paradigm of “sliding naturalness”, the Higgs is coupled to a new scalar whose potential features two minima. In the true minimum, the cosmological constant is large and negative, and the universe would crunch away into oblivion if it ended up in this vacuum. In the second, local minimum, the cosmological constant is safely positive (and can be made compatible with the small observed value of the cosmological constant by Weinberg’s anthropic selection). The Higgs couples to this scalar in such a way that a large value of the Higgs VEV destabilises the “safe” minimum. During the inflationary epoch, only universes with suitably small values of the Higgs VEV can grow and expand, while those with large values of the Higgs VEV crunch away. A second scalar coupled analogously to the Higgs can explain why the VEV is small but non-zero. Depending on how these scalars are coupled to the Higgs, experimental signatures range from the same sort of axion-like signals arising from the relaxion, to extra Higgs bosons at the LHC.

Alternatively, in the paradigm of “Nnaturalness” proposed by Nima Arkani-Hamed and others, the multitude of SMs over which the Higgs mass varies occur in one universe, rather than many. The fact that the universe is predominantly composed of one copy of the SM with a small Higgs mass can be explained if inflation ends and reheats the universe through the decay of a single particle. If this particle is sufficiently light, it will preferentially reheat the copy of the SM with the smallest non-zero value of the Higgs VEV, even if it couples symmetrically to each copy. The sub-dominant energy density deposited in other copies of the SM leaves its mark in the form of dark radiation that could be detected by the Simons Observatory or the upcoming CMB-S4 facility. 

Finally, Gian Giudice, Matthew McCullough and Tevong You have recently shown that inflation can help to understand the electroweak hierarchy problem by analogy with self-organised criticality. Just as adding individual grains of sand to a sandpile induces avalanches over diverse length scales – a hallmark of critical behaviour, obtained without tuning parameters – so too can inflation drive scalar fields close to critical points in their potential. This may help to understand why the observed Higgs mass lies so close to the boundary between the unbroken and broken phases of electroweak symmetry without fine tuning.

Going the distance 

Underlying Weisskopf’s natural reasoning is a long-standing assumption about relativistic theories of quantum mechanics: physics at short distances (the ultraviolet, or UV) is decoupled from physics at long distances (the infrared, or IR), making it challenging to apply a theory involving a large energy scale to a much smaller one without fine tuning. This suggests that loopholes may be found in theories that mix the UV and the IR, as is known to occur in quantum gravity. 

While the connection between this type of UV/IR mixing and the mass of the Higgs remains tenuous, there are encouraging signs of progress. For instance, Panagiotis Charalambous, Sergei Dubovsky and Mikhail Ivanov recently used it to solve a naturalness problem involving so-called “Love numbers” that characterise the tidal response of black holes. The surprising influence of quantum gravity on the parameter space of effective field theories implied by the Swampland programme also has a flavour of UV/IR mixing to it. And UV/IR mixing may even provide a new way to understand the apparent violation of naturalness by the cosmological constant.

We have come a long way since Weisskopf first set out to understand the self-energy of the electron. The electroweak hierarchy problem is not the first of its kind, but rather the one that remains unresolved. The absence of supersymmetry or compositeness at the TeV scale beckons us to search for new solutions to the hierarchy problem, rather than turning our backs on it. In the decade since the discovery of the Higgs, this search has given rise to a plethora of novel approaches, building new bridges between particle physics, cosmology and gravity along the way. Despite the many differences among these new approaches, they share a common tendency to leave imprints on the Higgs boson. And so, as ever, we must look to experiment to show the way. 

 
