The Swampland program formulates general constraints that effective theories must satisfy to be compatible with quantum gravity, thereby delineating the Landscape of consistent theories. The program is rapidly becoming central to our understanding of open questions in particle physics and cosmology, ranging from the hierarchy of fundamental scales in nature to the origin and ultimate fate of the universe.
The aim of this workshop is to gather leading experts in the field, to discuss the recent developments in our understanding of the Swampland and its implications for cosmology and particle physics.
This meeting, normally held every three years, is intended to promote fruitful collaboration between experimentalists and theorists working in the areas of:
Searches for New Physics including the Dark Sector
Phenomenology of Physics Beyond the Standard Model
Beauty and Charm physics
Kaon physics
Tau and Muon physics
Neutrino physics
CP violation
Rare decays
Future facilities
from institutions across the world, by bringing together a limited number of particle physicists in beautiful and inspiring surroundings. Particular emphasis will be placed on searches for new physics that complement direct studies at the LHC.
Expert speakers who will introduce the different areas include:
Jorge Camalich (Instituto de Astrofísica de Canarias)
Mu-Chun Chen (UC Irvine)
Giancarlo D’Ambrosio (INFN Naples)
Javier Fuentes (Universidad de Granada)
Stefania Gori (UC Santa Cruz)
Shoji Hashimoto (KEK)
Mikolaj Misiak (University of Warsaw)
Phillip Urquijo (University of Melbourne)
Roman Zwicky (University of Edinburgh)
In addition to plenty of discussion, we will produce a jointly authored paper during the conference that summarises these topics and the state of the field. We look forward to the participation of those who can actively engage in the discussions and paper-writing sessions, and believe this will make for a rewarding and fruitful experience for all.
The workshop focuses on physics at the low-energy, high-precision frontier without neglecting complementary approaches. It aims to highlight present activities and future developments.
Scientific Topics
Low energy precision tests of the Standard Model
Experiments with muons, pions, neutrons, antiprotons, other particles and atoms
Searches for permanent electric dipole moments
Searches for symmetry violations and new forces
Precision measurements of fundamental constants
Exotic atoms and molecules
New tools and facilities
The Paul Scherrer Institut (PSI) itself offers unique opportunities for experiments in this realm: it houses the world’s most powerful proton cyclotron, the highest-intensity low-momentum pion and muon beams, and a new ultracold neutron source.
The 9th International Conference on Quarks and Nuclear Physics will be held at FSU in Tallahassee, FL, USA, on 5–9 September 2022. This conference follows the series of meetings previously held in Adelaide, Jülich, Bloomington, Madrid, Beijing, Palaiseau, Valparaíso and Tsukuba. Experimentalists and theorists will discuss recent developments in the field of hadron and nuclear physics.
To explain the large matter–antimatter asymmetry in the universe, the laws of nature need to be asymmetric under a combination of charge-conjugation (C) and parity (P) transformations. The Standard Model (SM) provides a mechanism for CP violation, but it is insufficient to explain the observed baryon asymmetry in the universe. Thus, searching for new sources of CP violation is important.
The non-invariance of the fundamental forces under CP can lead to different decay rates for a particle and its antiparticle. CP violation in the decay of a particle is quantified through the parameter ACP, equal to the relative difference between the decay rate of a process and that of its CP-conjugated process. Three years ago, the LHCb collaboration reported the first observation of CP violation in the decay of charmed hadrons by measuring the difference between the time-integrated ACP in D0 → K–K+ and D0 → π–π+ decays, ΔACP. This difference was found to lie at the upper end of the SM expectation, prompting renewed interest in the charm-physics community. There is now an ongoing effort to understand whether this signal is consistent with the SM or a sign of new physics.
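Written out explicitly, the asymmetry described above is defined in terms of the time-integrated decay rates of the D0 meson and its CP conjugate (this is the standard convention, consistent with how ACP and ΔACP are used in the text):

```latex
A_{CP}(f) \;=\; \frac{\Gamma(D^0 \to f) \;-\; \Gamma(\overline{D}{}^{0} \to f)}
                     {\Gamma(D^0 \to f) \;+\; \Gamma(\overline{D}{}^{0} \to f)},
\qquad
\Delta A_{CP} \;=\; A_{CP}(K^-K^+) \;-\; A_{CP}(\pi^-\pi^+).
```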
At the 41st ICHEP conference in Bologna on 7 July, the LHCb collaboration announced a new measurement of the individual time-integrated CP asymmetry in the D0 → K–K+ decay using the data sample collected during LHC Run 2. The measured value, ACP(K–K+) = [6.8±5.4(stat)±1.6(syst)]×10–4, is almost three times more precise than the previous LHCb determination obtained with Run 1 data. This was thanks not only to a larger data sample but also to the inclusion of the additional control channels Ds+ → K–K+π+ and Ds+ → Ks0K+. Together with the previous control channels, D+ → K–π+π+ and D+ → Ks0π+, these decays allow tiny CP-asymmetry signals to be separated from the much larger biases due to asymmetric meson production and instrumental effects.
Combining these measured values with the previous LHCb determinations of ACP(K–K+) and ΔACP allowed the direct CP asymmetries in the D0 → π–π+ and D0 → K–K+ decays to be extracted: [23.2±6.1]×10–4 and [7.7±5.7]×10–4, respectively, with correlated uncertainties (ρ = 0.88). This is the first evidence of direct CP violation in an individual charm–hadron decay (D0 → π–π+), with a significance of 3.8σ.
The sum of the two direct asymmetries, which is expected to vanish in the limit of s–d quark symmetry (called U-spin symmetry), is equal to [30.8±11.4]×10–4. This corresponds to a departure from U-spin symmetry of 2.7σ. The result is also essential to the theory community in the quest to clarify the theoretical picture of CP violation in the charm system. Since the measurement is statistically limited, its precision will improve with the larger dataset collected during Run 3.
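As a quick sanity check, the quoted significances follow from elementary error propagation. The sketch below uses only the central values, uncertainties and correlation coefficient ρ = 0.88 given in the text; small rounding differences with respect to the published numbers are expected.

```python
import math

# Direct CP asymmetries in units of 1e-4, as quoted in the text,
# with correlation coefficient rho = 0.88 between them.
a_pipi, s_pipi = 23.2, 6.1   # D0 -> pi- pi+
a_kk,   s_kk   = 7.7,  5.7   # D0 -> K- K+
rho = 0.88

# Significance of the individual pi-pi asymmetry
sig_pipi = a_pipi / s_pipi                      # ~3.8 sigma

# U-spin test: sum of the two asymmetries; correlated
# uncertainties combine as s^2 = s1^2 + s2^2 + 2*rho*s1*s2
a_sum = a_pipi + a_kk
s_sum = math.sqrt(s_pipi**2 + s_kk**2 + 2 * rho * s_pipi * s_kk)
sig_sum = a_sum / s_sum                         # ~2.7 sigma

print(f"A_CP(pi pi) significance: {sig_pipi:.1f} sigma")
print(f"sum = {a_sum:.1f} +/- {s_sum:.1f} (x1e-4), {sig_sum:.1f} sigma")
```

Note that the positive correlation inflates the uncertainty on the sum (it would deflate the uncertainty on the difference ΔACP), which is why the 2.7σ departure is smaller than a naive uncorrelated combination would suggest.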
The cryogenic infrastructure of the Large Hadron Collider (LHC) at CERN is the most complex helium refrigeration system of all the world’s research facilities.
The operation of the LHC’s cryogenic system began in 2008, after reception testing and a first cooldown to 1.9 K. This webinar will cover the design, operational experience and main challenges of the accelerator’s cryogenics, along with the physics requirements.
During the first stage, the operation team had to learn about the response and limitations of the system. They then had to manage stable operation by maintaining the necessary conditions for the superconducting magnets, RF cavities, electrical feed boxes, power links and detector devices, thus contributing to the physics programme and the discovery of the Higgs boson in 2012.
One of the most challenging parameters impacting the cryogenics was the beam-induced heat load, which became significant during the second operation period (Run 2) of the LHC from 2015, with increased beam parameters. A complex optimisation of the cryogenic system’s configuration was successfully applied to cope with these requirements.
Run 3 (preparation for which started in 2020) required the handling of several hundred magnet training quenches towards the nominal beam energy for physics production.
Now, after several years of operational experience with steady state and transient handling, the cryogenic system is being optimised to provide the necessary refrigeration, whilst incorporating the all-important aspect of energy preservation.
In conclusion, there will be a brief discussion of the next four years of operation.
Krzysztof Brodzinski is a senior staff member in the cryogenics group at the technology department at CERN. He is a mechanical engineer with a specialisation in refrigeration equipment, and graduated from Cracow University of Technology in Poland. Krzysztof joined the LHC cryogenic design team in 2001, has been a member of the cryogenic operation team since 2009 and in 2019 was mandated as a section leader of the cryogenic operation team for the LHC, ATLAS and CMS. He is also involved in the engineering of the cryogenic system for the HiLumi LHC RF deflecting cavities project, as well as participating in the ongoing FCC cryogenics study.
Ten years ago, a few small bumps in ATLAS and CMS data confirmed a 48-year-old theoretical prediction, and particle physics hasn’t been the same since. Behind those sigmas was the hard work, dedication, competence and team spirit of thousands of experimentalists and accelerator physicists worldwide. Naturally it was a triumph for theory, too. Peter Higgs, François Englert, Carl Hagen and Gerald Guralnik received a standing ovation in the CERN auditorium on 4 July 2012, although Higgs insisted it was a day to celebrate experiment, not theory. The Nobel prize for Englert and Higgs came a year later. Straying from tradition for elementary-particle discoveries, the citation explicitly acknowledged the experimental effort of ATLAS and CMS, the LHC and CERN.
The implications of the Higgs-boson discovery are still being understood. Ten years of precision measurements have shown the particle to be consistent with the minimal version required by the Standard Model. Combined with the no-show of non-Standard Model particles that were expected to accompany the Higgs, theorists are left scratching their heads. As we celebrate the collective effort of high-energy physicists in discovering the Higgs boson and determining its properties, another intriguing journey has opened up.
Marvelously mysterious
As “a fragment of vacuum” with the barest of quantum numbers, the Higgs boson is potentially connected to many open questions in fundamental physics. The field from which it hails governs the nature of the electroweak phase transition in the early universe, which might be connected with the observed matter–antimatter asymmetry; as the only known elementary scalar particle, it could serve as a portal to other, hidden sectors relevant to dark matter; its couplings to matter particles — representing a new interaction in nature — may hold clues to the puzzling hierarchy of fermion masses; and its interactions with itself have implications for the ultimate stability of the universe.
With the LHC and its high-luminosity upgrade, physicists have 20 years of Higgs exploration to look forward to. But to fully understand the shape of the Brout–Englert–Higgs potential, the couplings of the Higgs boson to Standard Model particles and its possible connections to new physics, a successor collider will be needed. It is fascinating to picture future generations of particle physicists working as one with astroparticle physicists, cosmologists, quantum technologists and others to fill out the details of this potential new vista, with colliders driving progress alongside astrophysical, cosmological and gravitational-wave observatories. Future colliders aren’t just about generating knowledge, argues Anna Panagopoulou of the European Commission, but are “moonshots” delivering a competitive edge in technology, innovation, education and training — opening adventures that inspire young people to enter science in the first place.
Nobody knows what the Higgs boson has in store. Perhaps further studies will confirm the scenario of a Standard-Model Higgs and nothing else. The sheer number and profundity of known unknowns in the universe would suggest otherwise, think theorists. The good news is that, in the Higgs boson, physicists have clear measurement targets – and in principle the necessary theoretical and experimental machinery – to explore such mysteries, building upon the events of 4 July 2012 to reach the next level of understanding in fundamental physics.
Many of the most arbitrary aspects of the Standard Model of particle physics (SM) are intimately connected to the scalar sector of the theory. The SM comprises just one scalar particle, the Higgs boson, and assumes a specific scalar potential (the famous “Mexican hat”) to define the dynamics of electroweak (EW) interactions. But the fact that the Higgs boson acquires a non-zero vacuum expectation value that defines the mass scale of EW interactions (around 100–200 GeV) is assumed, not explained, by the SM. Indeed, why the Higgs-boson mass is constrained to be at the EW scale, while quantum corrections should push it to much higher values (the so-called naturalness problem, see Naturalness after the Higgs), is not justified by any symmetry of the SM. At the same time, the SM assumes that fermion masses are generated via arbitrary Yukawa-type interactions with the scalar field but it does not explain the hierarchy of couplings or masses that we observe, nor the specific flavour structure that arises from the presence of just one scalar field.
The scalar sector of the SM may therefore be seen as a messenger of a more fundamental theory that replaces the SM at energies beyond the EW scale and turns apparent arbitrariness into logical consequences. After all, the mechanism of EW symmetry breaking as realised in the SM via the Brout–Englert–Higgs (BEH) field is just the simplest possible way to generate massive EW gauge bosons and fermions while preserving gauge symmetry. The scalar potential could be more complicated, for example involving multiple scalar fields, as is common in many beyond-the-SM (BSM) theories. This would result in a richer pattern of stable and metastable minima and influence the nature of the EW phase transition. A first-order phase transition, together with extra sources of CP violation beyond what is implied by the SM, could explain the origin of the matter–antimatter asymmetry of the universe via EW baryogenesis (see Electroweak baryogenesis). Understanding the origin of the EW scale is thus key to connecting very different realms of particle physics and cosmology, and it is the question we face as we look to the future of collider physics.
Game changer
The discovery of the Higgs boson during Run 1 of the LHC has been a game changer in the exploration of new physics beyond the EW scale. The measurement of the Higgs-boson mass has added the last missing input parameter to precision global fits of the SM, which now provide a very powerful tool to constrain BSM scenarios. Thanks to an unprecedented level of precision reached in both theory and experiment, the measurement of Higgs-boson couplings to the EW gauge bosons (W, Z) and to the heaviest quarks and leptons (t, b, τ, µ) from Run 2 data has already constrained their deviations from SM expectations to within 5–20%, with the best accuracy reached for the couplings to the gauge bosons. Based on these results, the High-Luminosity LHC (HL-LHC) is projected to constrain the effects of new physics on Higgs-boson couplings to EW gauge bosons to 1–2%, and to heavy quarks and leptons to 3–5%. If no anomalies are found, this level of accuracy will push the lower bound on the scale of new physics into the TeV range. Conversely, the detection of anomalies may point to the presence of new physics at the TeV scale, possibly just around the corner.
On the other hand, testing the SM scalar potential will still be challenging even during the HL-LHC era. The shape of the BEH potential can be tested by measuring the Higgs-boson self-interactions corresponding to its cubic and quartic terms. In the SM, these interactions are strictly proportional to the Higgs-boson mass via the vacuum expectation value of the BEH field. Deviations from the SM are searched for via Higgs pair production and radiative corrections to single-Higgs measurements. Although the LHC and HL-LHC promise to provide evidence for di-Higgs production, the extraction of the Higgs self-coupling from such measurements will be statistically limited.
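The proportionality stated above can be made explicit with the standard textbook expansion: writing the minimal BEH potential and expanding around its minimum (vacuum expectation value v) fixes the cubic and quartic self-couplings entirely in terms of the Higgs mass, so any measured deviation would signal a non-minimal potential:

```latex
V(\Phi) = -\mu^2\,\Phi^\dagger\Phi + \lambda\,(\Phi^\dagger\Phi)^2
\;\xrightarrow{\;\Phi \to (v+h)/\sqrt{2}\;}\;
\tfrac{1}{2} m_h^2\, h^2 \;+\; \frac{m_h^2}{2v}\, h^3 \;+\; \frac{m_h^2}{8v^2}\, h^4,
\qquad m_h^2 = 2\lambda v^2 .
```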
Future colliders are vital to push the precision Higgs programme to the next level. While the type and concept of the next collider is yet to be decided, all proposed facilities would deliver a huge number of Higgs bosons over their lifetime, operating at different and well targeted centre-of-mass energies (see “At a glance” figure). They can complement one another and, staggered over a period of the next few decades, provide the missing elements of the EW puzzle.
Among the future lepton colliders under study, circular e+e– colliders (CEPC, FCC-ee) are expected to operate at energies between 90 and 350 GeV with very high luminosities, while linear e+e– colliders (ILC, C3, CLIC) offer both low- and high-energy phases, generally with slightly lower luminosities. Combined with data from the HL-LHC, these “Higgs factories” would enable the SM, including most Higgs couplings, to be stress-tested below the per-cent level and in some cases at or below the per-mille level. In particular, FCC-ee operating at the s-channel Higgs resonance (125 GeV) has the capability to provide bounds on couplings as small as the electron Yukawa coupling, while linear e+e– colliders operating at 550–600 GeV and above could substantially improve on the top-quark Yukawa coupling with respect to the HL-LHC. A possible muon collider, operated either as a Higgs factory at 125 GeV or as a high-energy discovery machine at 3–10 TeV, is estimated to reach precisions on Higgs couplings similar to those of e+e– machines.
Finally, high-energy lepton colliders (ILC 1000, CLIC 3000 and a 3–30 TeV muon collider) and very high-energy hadron colliders (FCC-hh at 100 TeV) would reach enough statistics and energy to measure the Higgs self-coupling and investigate the nature of the BEH potential, either via di-Higgs or single-Higgs production (see “Self-coupling” figure). With an aggressive Higgs physics programme they may also reach enough sensitivity to probe the cubic and quartic terms in the BEH potential separately.
Almost half a century after it was predicted, the LHC delivered the Higgs boson in spectacular style on 4 July 2012. Over the next 15–20 years, the machine and its luminosity upgrade will continue to enable ATLAS and CMS to make great strides in understanding the Higgs boson’s properties. But to fully exploit the discovery of the Higgs boson and explore its mysterious relation to new physics beyond the EW scale, we will need a successor collider.
Referring to the field equation of general relativity Rμν – ½ Rgμν = κTμν, Einstein is reported to have said that the left-hand side, constructed from space–time curvature, is “a palace of gold”, while the right-hand side, which parameterises the energy and momentum of matter, is by comparison “a hovel of wood”. Present-day physics has arrived at much more concrete ideas about the right-hand side than were available to Einstein. It is fair to say that some of it has come to look quite palatial, fully worthy to stand alongside the left-hand side. These are the terms that involve field kinetic energy and gauge bosons, as described by the Standard Model (SM). Their form follows logically, within the framework of relativistic quantum field theory, directly from the principles of local gauge symmetry and relativity. Mathematically, they also speak the same geometric language as the left-hand side. The gauge bosons are avatars of curvature in “internal spaces”, similar to how gravitons are the avatars of space–time curvature. Internal spaces parameterise ways in which fields can vary – and thus, in effect, move – independently of ordinary motion in space–time. In this picture, the strong, weak and electromagnetic interactions arise from the influence of internal-space curvature on internal-space motion, similar to how gravity arises from the influence of space–time curvature on space–time motion.
The other contributions to Tμν, all of which involve the Higgs particle, do not yet reach that standard. We can aspire to do better! They are of three kinds. First, there are the many Yukawa-like terms from which quark and lepton masses and mixings arise. Then there is the Higgs self-coupling and finally a term representing its mass. These contributions to Tμν contain almost two dozen dimensionless coupling parameters that present-day theory does not enable us to calculate or even much constrain. It is therefore important to investigate experimentally, through quantitative studies of Higgs-particle properties and interactions, whether this ramshackle structure describes nature accurately.
Higgs potential
The Higgs boson is special among the elementary particles. As the quantum of a condensate that fills all space, it is metaphorically “a fragment of vacuum”. Speaking more precisely, the Higgs particle has no spin, no electric or colour charge and, at the level of strong and electromagnetic interactions, normal charge conjugation and parity. Thus, it can be emitted singly and without angular momentum barriers, and it can decay directly into channels free of colour and electromagnetically charged particles, which might otherwise be difficult to access. For these and other, more technical, reasons, the Higgs particle has the potential to reveal new physical phenomena of several kinds.
A unique aspect of the Higgs mass term is especially promising for revealing possible shortcomings in the SM. In quantum field theory, an important property of an interaction is the “mass dimension” of the operator that implements it – a number that in an important sense indicates its complexity. Scalar and gauge fields have mass dimension 1 as do space–time derivatives, whereas fermion fields have mass dimension 3/2. More complicated operators are built up by multiplying these, and the mass dimension of a product is the sum of the mass dimensions of its factors. Interactions associated with operators whose mass dimension is greater than 4 are problematic because they lead to violent quantum fluctuations and mathematical divergences. Whereas all the other terms in the SM Lagrangian arise from operators of mass dimension 4, the Higgs mass term has mass dimension 2. Thus it is uniquely open to augmentation by couplings to hypothetical new SU(3) × SU(2) × U(1) singlet scalar fields, because the mass dimension of the augmented interaction can be 3 or 4 – i.e. still “safe”. The Higgs particle is the only portal connecting normal matter to such phantom fields.
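Concretely, for a hypothetical SM-singlet scalar φ, the renormalisable couplings to normal matter described above run exclusively through the dimension-2 operator H†H:

```latex
\mathcal{L}_{\text{portal}} \;\supset\; -\,\mu\,\phi\,\bigl(H^\dagger H\bigr)
\;-\; \lambda\,\phi^2\,\bigl(H^\dagger H\bigr),
```

where the first operator has mass dimension 1 + 2 = 3 and the second 2 + 2 = 4, so both remain “safe” in the sense defined above. No analogous renormalisable singlet coupling exists for the dimension-4 gauge and Yukawa terms.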
Why is this an interesting observation? There are three main reasons: two broadly theoretical, one pragmatic. First of all, the particles that are generally considered part of the SM carry a variety of charge assignments under the gauge groups SU(3) × SU(2) × U(1) that govern the strong and electroweak interactions. For example, the left-handed up quark is charged under all three groups, while the right-handed electron carries only U(1) hypercharge. Thus it is not only logically possible, but reasonably plausible, that there could be particles that are neutral under all three groups. Such phantom particles might easily escape detection, since they do not participate in the strong or electroweak interactions. Indeed, there are several examples of well-motivated candidate particles of that kind. Axions are one. Since they are automatically “dark” in the appropriate sense, phantom particles could contribute to the astronomical dark matter, and might even dominate it, as model-builders have not failed to notice. Also, many models of unification bring in scalar fields belonging to representations of a unifying gauge group that contains SU(3) × SU(2) × U(1) singlets, as do models with supersymmetry. Only phantom scalars are directly accessible through the Higgs portal, but phantoms of higher spin, including right-handed neutrinos, could cascade from real or virtual scalars.
Mysterious values
Second, the empirical value of the Higgs mass term is somewhat mysterious and even problematic, given that quantum corrections should push it to a value many orders of magnitude higher. This is the notorious “hierarchy problem” (see Naturalness after the Higgs). Given this situation, it seems appropriate to explore the possibility that part (or all) of the effective mass-term of the SM Higgs particle arises from more fundamental couplings upon condensation of SU(3) × SU(2) × U(1) singlet scalar fields, i.e. the emergence of a non-zero space-filling field, as occurs in the Brout–Englert–Higgs mechanism.
Third, the portal idea leads to concrete proposals for directions of experimental exploration. These are of two basic kinds: one involves the observed strength of conventional Higgs couplings, the other the kinematics of Higgs production and decay. Couplings of the Higgs field to singlets that condense will lead to mixing, altering numerical relationships among Higgs-particle couplings and masses of gauge bosons, and of fermions from their minimal SM values. Also, the Higgs-field couplings to gauge bosons and fermions will be divided among two or more mass eigenstates. Since existing data indicates that deviations from the minimal model are small, the coupling of normal matter to the “mostly but not entirely” singlet pieces could be quite small, perhaps leading to very long lifetimes (as well as small production rates) for those particles. Whether or not the phantom particles contribute significantly to cosmological dark matter, they will appear as missing energy or momentum accompanying Higgs particle decay or, through Bremsstrahlung-like processes, when they are produced.
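A minimal two-state sketch of the mixing just described (an illustration, not drawn from the source): if the SM-like state h and a condensed singlet s mix with angle θ, the mass eigenstates are

```latex
h_1 \;=\; h\cos\theta + s\sin\theta,
\qquad
h_2 \;=\; -\,h\sin\theta + s\cos\theta,
```

so every SM coupling g of h1 is rescaled to g cos θ, while the “mostly singlet” state h2 couples to normal matter only through the small factor sin θ – hence its suppressed production rate and potentially very long lifetime.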
We introduced the term “Higgs portal” to describe this circle of ideas in 2006, triggering a flurry of theoretical discussion. Now that the portal is open for business, and with larger data samples in store at the LHC, we can think more concretely about exploring it experimentally.
Often in physics, experimentalists observe phenomena that theorists had not been able to predict. When the muon was discovered, theoreticians were confused; a particle had been predicted, but not this one. Isidor Rabi responded with his famous outcry: “Who ordered that?” The J/ψ is another special case. A particle was discovered with properties so different from those of the particles that were expected that the first guesses as to what it was were largely mistaken. Soon it became evident that it was a predicted particle after all, but its features turned out to be more exotic than foreseen. This was an experimental discovery requiring new twists in the theory, which we now understand very well. The Higgs particle also has a long and interesting history, but from my perspective, it was to become a triumph for theory.
From the 1940s, long before any indications were seen in experiments, there were fundamental problems in all theories of the weak interaction. Then we learned from very detailed and beautiful measurements that the weak force seemed to have a vector-minus axial-vector (V-A) structure. This implied that, just as in Yukawa’s theory for the strong nuclear force, the weak force can also be seen as resulting from an exchange of particles. But here, these particles had to be the energy quanta of vector and axial-vector fields, so they must have spin one, with positive and negative parities mixed up. They also must be very heavy. This implied that, certainly in the 1960s, experiments would not be able to detect these intermediate particles directly. But in theory, we should be able to calculate accurately the effects of the weak interaction in terms of just a few parameters, as could be done with the electromagnetic force.
Electromagnetism was known to be renormalisable – that is, by carefully redefining and rearranging the mass and interaction parameters, all observable effects would become calculable and predictable, avoiding meaningless infinities. But now we had a difficulty: the weak exchange particles differed from the electromagnetic ones (the photons) because they had mass. The mass was standing in the way when you tried to do what was well understood in electromagnetism. How exactly a correct formalism should be set up was not known, and the relationship between renormalisability and gauge invariance was not understood at all. Indeed, today we can say that the first hints were already there by 1954, when C N Yang and Robert Mills wrote a beautiful paper in which they generalised the principle of local gauge invariance to include gauge transformations that affect the nature of the particles involved. In its most basic form, their theory described photons with electric charge.
Thesis topic
In 1969 I began my graduate studies under the guidance of Martinus J G Veltman. He explained to me the problem he was working on: if photons were to have mass, then renormalisation would not work the same way. Specifically, the theory would fail to obey unitarity, a quantum mechanical rule that guarantees probabilities are conserved. I was given various options for my thesis topic, but they were not as fundamental as the issues he was investigating. “I want to work with you on the problem you are looking at now,” I said. Veltman replied that he had been working on his problem for almost a decade; I would need lots of time to learn about his results. “First, read this,” he said, and he gave me the Yang–Mills paper. “Why?” I asked. He said, “I don’t know, but it looks important.”
That, I could agree with. This was a splendid idea. Why couldn’t you renormalise this? I had convinced myself that it should be possible, in principle. The Yang–Mills theory was a relativistic quantised field theory. But Veltman explained that, in such a theory, you must first learn what the Feynman rules are. These are the prescriptions that you have to follow to get the amplitudes generated by the theory. From them you can read off whether the amplitudes are unitary, obey dispersion relations, and check that everything works out as expected.
Many people thought that renormalisation – even quantum field theory – was suspect. They had difficulties following Veltman’s manipulations with Feynman diagrams, which required integrations that do not converge. To many investigators, he seemed to be sweeping the difficulties with the infinities under the rug. Nature must be more clever than this! Yang–Mills seemed to be a divine theory with little to do with reality, so physicists were trying all sorts of totally different approaches, such as S-matrix theory and Regge trajectories. Veltman decided to ignore all that.
Solid-state inspiration
Earlier in the decade, some investigators had been inspired by results from solid-state physics. Inside solids, vibrating atoms and electrons were described by nonrelativistic quantum field theories, and those were conceptually easier to understand. Philip Anderson had learned to understand the phenomenon of superconductivity as a process of spontaneous symmetry breaking; photons would obtain a mass, and this would lead to a remarkable rearrangement of the electrons as charge carriers that would no longer generate any resistance to electric currents. Several authors realised that this procedure might apply to the weak force. In the summer of 1964, Peter Higgs submitted a manuscript to Physical Review Letters, where he noted that the mechanism of making photons massive should also apply to relativistic particle systems. But there was a problem. Jeffrey Goldstone had sound mathematical arguments to expect the emergence of massless scalar particles as soon as a continuous symmetry breaks down spontaneously. Higgs put forward that this theorem should not apply to spontaneously broken local symmetries, but critics were unconvinced.
The journal sent Higgs’s manuscript out to be peer reviewed. The reviewer did not see what the paper would add to our understanding. “If this idea has anything to do with the real world, would there be any possibility to check it experimentally?” The correct question would have been what the paper would imply for the renormalisation procedure, but this question was in nobody’s mind. Anyway, Higgs gave a clear and accurate answer: “Yes, there is a consequence: this theory not only explains where the photon mass comes from, but it also predicts a new particle, a scalar particle (a particle with spin zero), which unlike all other particles, forms an incomplete representation of the local gauge symmetry.” In the meantime, other papers appeared about the photon mass-generation process, not only by François Englert and Robert Brout in Brussels, but also by Tom Kibble, Gerald Guralnik and Carl Hagen in London. And Sheldon Glashow, Abdus Salam and Steven Weinberg were formulating their first ideas (all independently) about using local gauge invariance to create models for the weak interaction.
I started to study everything from the ground up
At the time spontaneous symmetry breaking was being incorporated into quantum field theory, the significance of renormalisation and the predicted scalar particles were hardly mentioned. Certainly, researchers were not able to predict the mass of such particles. Personally, although I had heard about these ideas, I also wasn’t sure I understood what they were saying. I had my own ways of learning how to understand things, so I started to study everything from the ground up.
If you start from a relativistic classical field theory and apply the Copenhagen procedure to turn it into a quantum theory, then you should get a unitary theory. The renormalisation procedure amounts to transforming all expressions that threaten to become infinite due to divergence of the integrals, to apply only to unobservable quantities of particles and fields, such as their “bare mass” and “bare charge”. If you understand how to get such things under control, then your theory should become a renormalised description of massive particles. But there were complications.
The infinities that require a renormalisation procedure to tame them originate from uncontrolled behaviour at very tiny distances, where the effective energies are large and consequently the effects of mass terms for the particles should become insignificant. This meant that you first have to renormalise the theory without any masses in it, where also the spontaneous breakdown of the local symmetry becomes insignificant. You had to get the particle book-keeping right. A massless photon has only two observable field components (they can be left- or right-rotating), whereas a massive particle with the same spin can rotate in three different ways. One degree of freedom did not match. This was why an extra field was needed. If you wanted massive photons with electric charges +, 0 or –, you would need a scalar field with four components; one of these would represent the total field strength, and would behave as an extra, neutral, spin-0 particle – the observable particle that Higgs had talked about – but the others would turn the number of spinning degrees of freedom of the three other bosons from two to three each (see “Dynamical” figure).
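The book-keeping described above can be written out as a simple tally (a standard count, spelled out here for clarity): three massless gauge fields carry two polarisations each, and the four-component scalar field supplies exactly the difference needed for three massive vector bosons plus one physical Higgs particle.

```latex
\underbrace{3 \times 2}_{\text{massless gauge fields}}
\;+\;
\underbrace{4}_{\text{scalar components}}
\;=\; 10 \;=\;
\underbrace{3 \times 3}_{\text{massive vector bosons}}
\;+\;
\underbrace{1}_{\text{physical Higgs particle}}
```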
One question
In 1970 Veltman sent me to a summer school organised by Maurice Lévy in a new science institute at Cargèse on the French island of Corsica. The subject would be the study of the Gell–Mann–Lévy model for pions and nucleons, in particular its renormalisation and the role of spontaneous symmetry breaking. Will renormalisation be possible in this model, and will it affect its symmetry? The model was very different from what I had just started to study: Yang–Mills theory with spontaneous breaking of its symmetry. There were quite a few reputable lecturers besides Lévy himself: Benjamin Lee and Kurt Symanzik had specialised in renormalisation. Shy as I was, I only asked one question to Lee, and the same to Symanzik: does your analysis apply to the Yang–Mills case?
Both gave me the same answer: if you are Veltman’s student, ask him. But I had, and Veltman did not believe that these topics were related. I thought that I had a better answer, and I fantasised that I was the only person on the planet who knew how to do it right. It was not obvious at all; I had two German roommates at the hotel where I had been put, who tried to convince me that renormalisation of Feynman graphs where lines cross each other would be unfathomably complicated.
Veltman had not only set up detailed, fully running machinery to handle the renormalisation of all sorts of models, but he had also designed a futuristic computer program to do the enormous amount of algebra required to handle the numerous Feynman diagrams that appear to be relevant for even the most basic computations. I knew he had those programs ready and running. He was now busy with some final checks: if his present attempts to check the unitarity of his renormalised model still failed, we should seriously consider giving this up. Yang–Mills theories for the weak interactions would not work as required.
But Veltman had not thought of putting a spin-zero, neutral particle in his model, certainly not if it wasn’t even in a complete representation of the gauge symmetry. Why should anyone add that? After returning from Cargèse I went to lunch with Veltman, during which I tried to persuade him. Walking back to our institute, he finally said, “Now look, what I need is not an abstract mathematical idea, what I want is a model, with a Lagrangian, from which I can read off the Feynman diagrams to check it with my program…”. “But that Lagrangian I can give you,” I said. Next, he walked straight into a tree! A few days after I had given him the Lagrangian, he came to me, quite excited. “Something strange,” he said, “your theory isn’t right because it still isn’t unitary, but I see that at several places, if the numbers had been a trifle different, it could have worked out.” Had he copied those factors ¼ and ½ that I had in my Lagrangian, I wondered? I knew they looked odd, but they originated from the fact that the Higgs field has isospin ½ while all other fields have isospin one.
No, Veltman had thought that those factors came from a sloppy notation I must have been using. “Try again,” I asked. He did, and everything fell into place. Most of all, we had discovered something important. This was the beginning of an intensive but short collaboration. My first publication “Renormalization of massless Yang–Mills fields”, published in October 1971, concerned the renormalisation of the Yang–Mills theory without the mass terms. The second publication that year, “Renormalizable Lagrangians for massive Yang–Mills fields,” where it was explained how the masses had to be added, had a substantial impact.
There was an important problem left wide open, however: even if you had the correct Feynman diagrams, the process of cancelling out the infinities could still leave finite, non-vanishing terms that ruin the whole idea. These so-called “anomalies” must also cancel out. We found a trick called dimensional regularisation, which would guarantee that anomalies cancel except in the case where particles spin preferentially in one direction. Fortunately, as charged leptons tend to rotate in opposite directions compared to quarks, it was discovered that the effects of the quarks would cancel those of the leptons.
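The quark–lepton cancellation mentioned above can be illustrated by its simplest consequence (a standard condition, added here for clarity): the electric charges in one complete family must sum to zero, which works out only because each quark species comes in three colours.

```latex
\sum_{\text{family}} Q
\;=\;
\underbrace{3\left(+\tfrac{2}{3}\right) + 3\left(-\tfrac{1}{3}\right)}_{\text{up- and down-type quarks}}
\;+\;
\underbrace{0 + (-1)}_{\text{neutrino and charged lepton}}
\;=\; 0
```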
The fourth component
Within only a few years, a complete picture of the fundamental interactions became visible, where experiment and theory showed a remarkable agreement. It was a fully renormalisable model where all quarks and all leptons were represented as “families” that were only complete if each quark species had a leptonic counterpart. There was an “electroweak force”, where electromagnetism and the weak force interfere to generate the force patterns observed in experiments, and the strong force was tamed at almost the same time. Thus the electroweak theory and quantum chromodynamics were joined into what is now known as the Standard Model.
Be patient, we are almost there, we have three of the four components of this particle’s field
This theory agreed beautifully with observations, but it did not predict the mass of the neutral, spin-0 Higgs particle. Much later, when the W and the Z bosons were well-established, the Higgs was still not detected. I tried to reassure my colleagues: be patient, we are almost there, we have three of the four components of this particle’s field. The fourth will come soon.
As the theoretical calculations and the experimental measurements became more accurate during the 1990s and 2000s, it became possible to derive the most likely mass value from indirect Higgs-particle effects that had been observed, such as those concerning the top-quark mass. On 4 July 2012 a new boson was directly detected close to where the Standard Model said the Higgs would be. After these first experimental successes, it was of utmost importance to check whether this was really the object we had been expecting. This has kept experimentalists busy for the past 10 years, and will continue to do so for the foreseeable future.
The discovery of the Higgs particle is a triumph for high technology and basic science, as well as accurate theoretical analyses. Efforts spanning more than half a century paid off in the summer of 2012, and a new era of understanding the particles, their masses and interactions began.