Nearly 70 years ago, before CERN was established, two models for European collaboration in fundamental physics were on the table: one envisaged opening up national facilities to researchers from across the continent, the other the creation of a new, international research centre with world-leading facilities. Discussions were lively, until one delegate pointed out that researchers would go to wherever the best facilities were.
From that moment on, CERN became an accelerator laboratory aspiring to be at the forefront of technology to enable the best science. It was a wise decision, and one that I was reminded of while listening to the presentations at the open symposium of the European Strategy for Particle Physics in Granada, Spain, in May. Because among the conclusions of this very lively meeting was the view that providing world-leading accelerator and experimental facilities is precisely the role the community needs CERN to play today.
There was huge interest in the symposium, as witnessed by the 600-plus participants, including many from the nuclear and astroparticle physics communities. The vibrancy of the field was fully on display, with future hadron colliders offering the biggest leap in energy reach for direct searches for new physics. Precision electroweak studies at the few-per-cent level, particularly of the Higgs particle, will probe similar mass scales. The LHC, and soon the High-Luminosity LHC, will go a long way towards achieving that goal of precision. Indeed, it’s remarkable how far the LHC experiments have come in overturning the old adage that hadrons are for discovery and leptons for precision – the LHC has established itself as a precision tool, and this is shaping the debate as to what kind of future we can expect.
Nevertheless, however precise proton–proton physics becomes, it will still fall short in some areas. To pin down the absolute width of the Higgs, for example, a lepton machine will be needed, and no fewer than four implementations were discussed. So, one key conclusion is that if we are to cover all bases, no single facility will suffice. One way forward was presented by the chair of the Asian Committee for Future Accelerators, Geoff Taylor, who advocated a lepton machine for Asia while Europe would focus on advancing the hadron frontier.
Interest in muon colliders was rekindled, not least because of a recent reconsideration of muon cooling (CERN Courier July/August 2018 p19). The great recent progress of plasma-wakefield accelerators, including AWAKE at CERN, calls for further research to render the technology usable for particle physics. Methods of dark-matter searches abound and are an important element of the discussion on physics beyond colliders, using single beams at CERN.
The Granada meeting was a town meeting on physics. Yet, it is clear to all that we can’t make plans solely on the basis of the available technology and a strong physics case, but must also consider factors such as cost and societal impact in any future strategy for European particle physics. With all the available technology options and open questions in physics, there’s no doubt that the future should be bright. The European Strategy Group, however, has a monumental challenge in plotting an affordable course to propose to the CERN Council in March next year.
There were calls for CERN to diversify and lend its expertise to other areas of research, such as gravitational waves: one speaker even likened interferometers to accelerators without beams. In terms of the technologies involved, that statement stands up well to scrutiny, and it is true that technology developed for particle physics at CERN can help the advancement of other fields. CERN already formally collaborates with organisations like ITER and the ESS, sharing our innovation and expertise. However, for me, the strongest message from Granada is that it is CERN’s focus on remaining at the forefront of particle physics that has enabled the Organization to contribute to a diverse range of fields. CERN needs to remain true to that founding vision of being a world-leading centre for accelerator technology. That is the starting point. From it, all else follows.
This article was originally published in the CERN Bulletin.
In the mid-1970s, particle physics was hot. Quarks were in. Group theory was in. Field theory was in. And so much progress was being made that it seemed like the fundamental theory of physics might be close at hand. Right in the middle of all this was Murray Gell-Mann – responsible for not one, but most, of the leaps of intuition that had brought particle physics to where it was. There’d been other theories, but Murray’s, with their somewhat elaborate and abstract mathematics, were always the ones that seemed to carry the day.
It was the spring of 1978 and I was 18 years old. I’d been publishing papers on particle physics for a few years. I was in England, but planned to soon go to graduate school in the US, and was choosing between Caltech and Princeton. One weekend afternoon, the phone rang. “This is Murray Gell-Mann”, the caller said, then launched into a monologue about why Caltech was the centre of the universe for particle physics at the time. Perhaps not as star-struck as I should have been, I asked a few practical questions, which Murray dismissed. The call ended with something like, “Well, we’d like to have you at Caltech”.
I remember the evening I arrived, wandering around the empty fourth floor of Lauritsen Lab – the centre of Caltech theoretical particle physics. There were all sorts of names I recognised on office doors, and there were two offices that were obviously the largest: “M. Gell-Mann” and “R. Feynman”. In between them was a small office labelled “H. Tuck”, which by the next day I’d realised was occupied by the older but very lively departmental assistant.
I never worked directly with Murray but I interacted with him frequently while I was at Caltech. He was a strange mixture of gracious and gregarious, together with austere and combative. He had an expressive face, which would wrinkle up if he didn’t approve of what was being said. Murray always grouped people and things into those he approved of and those he didn’t – and to the latter he would often give disparaging nicknames. (He would always refer to solid-state physics as “squalid-state physics”.) Sometimes he would pretend that things he did not like simply did not exist. I remember once talking to him about something in quantum field theory called the beta function. His face showed no recognition of what I was talking about, and I was getting slightly exasperated. Eventually I blurted out, “But, Murray, didn’t you invent this?” “Oh”, he said, suddenly much more charming, “You mean g times the psi function. Why didn’t you just say that? Now I understand.”
I could never quite figure out what it was that made Murray impressed by some people and not others. He would routinely disparage physicists who were destined for great success, and would vigorously promote ones who didn’t seem so promising, and didn’t in fact do well. So when he promoted me, I was on the one hand flattered, but on the other hand concerned about what his endorsement might really mean.
Feynman interactions
The interaction between Murray Gell-Mann and Richard Feynman was an interesting thing to behold. Both came from New York, but Feynman relished his “working class” New York accent while Gell-Mann affected the best pronunciation of words from any language. Both would make surprisingly childish comments about the other. I remember Feynman insisting on telling me the story of the origin of the word “quark”. He said he’d been talking to Murray one Friday about these hypothetical particles, and in their conversation they’d needed a name for them. Feynman told me he said (no doubt in his characteristic accent), “Let’s call them ‘quacks’”. The next Monday, he said, Murray came to him very excited and said he’d found the word “quark” in a novel by James Joyce. In telling this to me, Feynman then went into a long diatribe about how Murray always seemed to think the names for things were so important. “Having a name for something doesn’t tell you a damned thing,” Feynman said. Feynman went on, mocking Murray’s concern for things like what different birds are called. (Murray was an avid bird watcher.) Meanwhile, Feynman had worked on particles that seemed (and turned out to be) related to quarks. Feynman had called them “partons”. Murray insisted on always referring to them as “put-ons”.
He was a strange mixture of gracious and gregarious
Even though in terms of longstanding contributions to particle physics, Murray was the clear winner, he always seemed to feel as if he was in the shadow of Feynman, particularly with Feynman’s showmanship. When Feynman died, Murray wrote a rather snarky obituary, saying of Feynman: “He surrounded himself with a cloud of myth, and he spent a great deal of time and energy generating anecdotes about himself.” I never quite understood why Murray – who could have gone to any university in the world – chose to work at Caltech for 33 years in an office two doors down from Feynman.
Murray cared a lot about what people thought of him, but wasn’t particularly good at reading other people. Yet, alongside the brush-offs and the strangeness, he could be personally very gracious. I remember him inviting me several times to his house. He also did me quite a few favours in my career. I don’t know if I would call Murray a friend, though, for example, after his wife Margaret died, he and I would sometimes have dinner together, at random restaurants around Pasadena. It wasn’t so much that I felt of a different generation from him (which of course I was). It was more that he exuded a certain aloof tension, that made one not feel very sure about what the relationship really was.
Murray Gell-Mann had an amazing run. For 20 years he had made a series of bold conjectures about how nature might work – strangeness, V-A theory, SU(3), quarks, QCD – and in each case he had been correct, while others had been wrong. He had one of the more remarkable records of repeated correct intuition in the whole history of science.
He tried to go on. He talked about “grand unification being in the air”, and (along with many other physicists) discussed the possibility that QCD and the theory of weak interactions might be unified in models based on groups such as SU(5) and SO(10). He considered supersymmetry. But quick validations of these theories didn’t work out, though even now it’s still conceivable that some version of them might be correct.
I have often used Murray as an example of the challenges of managing the arc of a great career. From his twenties to his forties, Murray had the golden touch. His particular way of thinking had success after success, and in many ways he defined physics for a generation. By the time I knew him, the easy successes were over. Perhaps it was Murray; more likely, it was just that the easy pickings from his approach were now gone. He so wanted to succeed as he had before, not just in physics but in other fields and endeavours. But he never found a way to do it – and always bore the burden of his early success.
Though Murray is now gone, the physics he discovered will live on, defining an important chapter in the quest for our understanding of the fundamental structure of our universe.
In April 1960, Prince Philip, husband of Queen Elizabeth II, piloted his Heron airplane to Geneva for an informal visit to CERN. Having toured the laboratory’s brand new “25 GeV” Proton Synchrotron (PS), he turned to his host, president of the CERN Council François de Rose, and struck at the heart of fundamental exploration: “What have you got in mind for the future? Having built this machine, what next?” he asked. De Rose replied that this was a big problem for the field: “We do not really know whether we are going to discover anything new by going beyond 25 GeV,” he said. Unbeknown to de Rose and everyone else at that time, the weak gauge bosons and other phenomena that would transform particle physics were lying not too far above the energy of the PS.
This is a story repeated throughout elementary particle physics, and one of which CERN Courier, celebrating its 60th anniversary this summer, offers a bite-sized glimpse.
The first issue of the Courier was published in August 1959, just a few months before the PS switched on, at a time when accelerators were taking off. The PS was the first major European machine, quickly reaching an energy of 28 GeV, only to be surpassed the following year by Brookhaven’s Alternating Gradient Synchrotron. The March 1960 issue of the Courier described a meeting at CERN where 245 scientists from 28 countries had discussed “a dozen machines now being designed or constructed”. Even plasma-based acceleration techniques – including a “plasma betatron” at CERN – were on the table.
A time gone by
The picture is not so different today (see Granada symposium thinks big), though admittedly thinner on projects under construction. Some things remain eerily pertinent: swap “25 GeV” for “13 TeV” in de Rose’s response to Prince Philip, and his answer still stands with respect to what lies beyond the LHC’s energy. Other things are of a time gone by. The third issue of the Courier, in October 1959, proudly declared that “elementary particles number 32” (by 1966 that number had grown to more than 50 – see “Not so elementary”). Another early issue likened the 120 million Swiss Franc cost of the PS to “10 cigarettes for each of the 220 million inhabitants of CERN’s 12 Member States”.
The general situation of elementary particle physics back then, argued the August 1962 issue, could be likened to atomic physics in 1924 before the development of quantum mechanics. Summarising the 1962 ICHEP conference held at CERN, which attracted an impressive 450 physicists from 158 labs in 39 countries, the leader of the CERN theory division Léon Van Hove wrote: “The very fact that the variety of unexpected findings is so puzzling is a promise that new fundamental discoveries may well be in store at the end of a long process of elucidation.” Van Hove was right, and the 1960s brought the quark model and electroweak theory, laying a path to the Standard Model. Not that this paradigm shift is much apparent when flicking through issues of the Courier from the period; only hindsight can produce the neat logical history that most physics students learn.
Within a few years of PS operations, attention turned to a machine for the 1970s. A report on the 24th session of the CERN Council in the July 1963 issue noted ECFA’s recommendation that high priority be given to the construction in Europe of two projects: a pair of intersecting storage rings (ISR, which would become the world’s first hadron collider) and a new proton accelerator of a very high energy “probably around 300 GeV”, which would be 10 times the size of the PS (and eventually renamed the Super Proton Synchrotron, SPS). Mervyn Hine of the CERN directorate for applied physics outlined in the August 1964 issue how this so-called “Summit program” should be financed. He estimated the total annual cost (including that of the assumed national programmes) to be about 1100 million Swiss Francs by 1973, concluding that this was in step with a minimum growth for total European science. He wrote boldly: “The scientific case for Europe’s continuing forcefully in high-energy physics is overwhelming; the equipment needed is technically feasible; the scientific manpower needed will be available; the money is trivial. Only conservatism or timidity will stop it.”
The development of science
Similar sentiments exist now in view of a post-LHC collider. There is also nothing new, as the field grows ever larger in scale, in attacks on high-energy physics from outside. In an open letter published in the Courier in April 1964, nuclear physicist Alvin Weinberg argued that the field had become “remote” and that few other branches of science were “waiting breathlessly” for insights from high-energy physics without which they could not progress. Director-General Viki Weisskopf, writing in April 1965, concluded that the development of science had arrived at a critical stage: “We are facing today a situation where it is threatened that all this promising research will be slowed down by constrained financial support of high-energy physics.”
Deciding where to build the next collider and getting international partners on board was also no easier in the past, if the SPS is any guide. The September 1970 issue wrote that the “present impasse in the 300 GeV project” was due to the difficulty of selecting a site and: “At the same time it is disturbing to the traditional unity of CERN that only half the Member States (Austria, Belgium, Federal Republic of Germany, France, Italy and Switzerland) have so far adopted a positive attitude towards the project.” That the SPS, soon afterwards chosen to be built at CERN, would half a century later be feeding protons into a 27 km-circumference hadron collider with a centre-of-mass energy of 13 TeV was then unthinkable.
A giant LEP for mankind
An editorial in the January/February 1990 issue of the Courier titled “Diary of a dramatic decade” summed up a crucial period that had the Large Electron Positron (LEP) collider at its centre: Back in 1980, it said, the US was the “mecca” of high-energy physics. “But at CERN, the vision of Carlo Rubbia, the invention of new beam ‘cooling’ techniques by Simon van der Meer, and bold decisions under the joint Director-Generalship of John Adams and Léon Van Hove had led to preparations for a totally new research assault – a high-energy proton–antiproton collider.” The 1983 discoveries of the W and Z bosons had, it continued, “nudged the centroid of particle physics towards Europe,” and, with LEP and also HERA at DESY operating, Europe was “casting off the final shackles of its war-torn past”.
Despite involving what at that time was Europe’s largest civil-engineering project, LEP didn’t appear to attract much public attention. It was planned to be built within a constant CERN budget, but there were doubts as to whether this was possible (see Lessons from LEP). The September 1983 issue reported on an ECFA statement noting that reductions in CERN’s budget had put its research programme under “severe stress”, impairing the lab’s ability to capitalise on its successful proton–antiproton programme. “The European Laboratories have demonstrated their capacity to lead the world in this field, but the downward trend of support both for CERN and in the Member States puts this at serious risk,” it concluded. At the same time, following a famous meeting in Lausanne, the ECFA report noted that proton–proton collision energies of the order of 20 TeV could be reached with superconducting magnets in the LEP tunnel and “recommends that this possibility be investigated”.
Physicists were surprisingly optimistic about the possibility of such a large hadron collider. In the October 1981 issue, Abdus Salam wrote: “In the next decade, one may envisage the possible installation of a pp̅ collider in the LEP tunnel and the construction of a supertevatron… But what will happen to the subject 25 years from now?” he asked. “Accelerators may become as extinct as dinosaurs unless our community takes heed now and invests efforts on new designs.” Almost 40 years later, the laser-based acceleration schemes that Salam wrote of, and others, such as muon colliders, are still being discussed.
Accelerator physicist Lee Teng, in an eight-page long report about the 11th International Conference on High Energy Accelerators in the September 1980 issue, pointed out that seven decades in energy had been mastered in 50 years of accelerator construction. Extrapolating to the 21st century, he envisaged “a 20 TeV proton synchrotron and 350 GeV electron–positron linear colliders”. On CERN’s 50th anniversary in September 2004, former Director-General Luciano Maiani predicted what the post-2020 future might look like, asserting that “a big circular tunnel, such as that required by a Very Large Hadron Collider, would have to go below Lake Geneva or below the Jura (or both). Either option would be simply too expensive to consider. This is why a 3–5 TeV Compact Linear Collider (CLIC) would be the project of choice for the CERN site.” It is the kind of decision that the current CERN management is weighing up today, 15 years later.
Driving success
This collider-centric view of 60 years of CERN Courier does little justice to the rest of the magazine’s coverage of fixed-target physics, neutrino physics, cosmology and astrophysics, detector and accelerator physics, computing, applications, and broader trends in the field. It is striking how much the field has advanced and specialised. Equally, it is heartening to find so many parallels with today. Some are sociological: in October 1995 a report on an ECFA study noted “much dissatisfaction” with long author lists and practical concerns about the size of the even bigger LHC-experiment collaborations over the horizon. Others are more strategic.
It is remarkable to read through back issues of the Courier from the mid-1970s to find predictions for the masses of the W and Z bosons that turned out to be correct to within 15%. This drove the success of the Spp̅S and LEP programmes and led naturally to the LHC – the collider to hunt down the final piece of the electroweak jigsaw, the “so-called Higgs mesons” as a 1977 issue of the Courier put it. Following the extraordinary episode that was the development and completion of the Standard Model, we find ourselves in a similar position as we were in the PS days regarding what lies over the energy frontier. Looking back at six decades of fundamental exploration as seen through the imperfect lens of this magazine, it would take a bold soul to claim that it isn’t worth a look.
Murray Gell-Mann’s scientific career began at the age of 15, when he received a scholarship from Yale University that allowed him to study physics. Afterwards he went to the Massachusetts Institute of Technology and worked under Victor Weisskopf. He completed his PhD in 1951, at the age of 21, and became a postdoc at the Institute for Advanced Study in Princeton.
The following year, Gell-Mann joined the research group of Enrico Fermi at the University of Chicago. He was particularly interested in the new particles that had been discovered in cosmic rays, such as the six hyperons and the four K-mesons. Nobody understood why these particles were created easily in collisions of nucleons, yet decayed rather slowly. To understand the peculiar properties of the new hadrons, Gell-Mann introduced a quantum number, which he called strangeness (S): nucleons were assigned S = 0; the Λ hyperon and the three Σ hyperons were assigned S = –1; the two Ξ hyperons had S = –2; and the negatively charged K-meson had S = –1.
Strange assumptions
Gell-Mann assumed that the strangeness quantum number is conserved in the strong and electromagnetic interactions, but violated in the weak interactions. The decays of the strange particles into particles without strangeness could only proceed via the weak interaction.
The idea of strangeness thus explained, in a simple way, the production and decay rates of the newly discovered hadrons. A new particle with S = –1 could be produced by the strong interaction together with a particle with S = +1 – e.g. a negatively charged Σ can be produced together with a positively charged K meson. However, a positively charged Σ could not be produced together with a negatively charged K meson, since both particles have S = –1.
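The bookkeeping can be made explicit with a standard textbook pair of reactions (an illustration added here, not drawn from the original article):

```latex
% Associated production: strangeness conserved, so the strong interaction allows it
\pi^- + p \;\to\; K^+ + \Sigma^- \,, \qquad S:\; 0 + 0 \;\to\; (+1) + (-1) = 0
% Forbidden as a strong process: strangeness not conserved
\pi^- + p \;\to\; K^- + \Sigma^+ \,, \qquad S:\; 0 + 0 \;\to\; (-1) + (-1) = -2
% Weak decay: strangeness changes by one unit, hence the long lifetime
\Lambda \;\to\; p + \pi^- \,, \qquad S:\; -1 \;\to\; 0 + 0 \quad (\Delta S = 1)
```

The first two reactions conserve electric charge equally well; only the strangeness assignments distinguish the copious process from the absent one.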
In 1954 Gell-Mann and Francis Low published details of the renormalisation of quantum electrodynamics (QED). They introduced a new method, the renormalisation group, which Kenneth Wilson (a former student of Gell-Mann) later used to describe phase transitions in condensed-matter physics. Specifically, Gell-Mann and Low calculated the energy dependence of the renormalised coupling constant. In QED the effective coupling constant increases with energy. This was measured at the LEP collider at CERN, and found to agree with the theoretical prediction.
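At one loop, and keeping a single charged-lepton species for simplicity, the running of the QED coupling takes the familiar form (a standard textbook result, added here for illustration):

```latex
\alpha(Q^2) \;=\; \frac{\alpha(\mu^2)}{\,1 \;-\; \dfrac{\alpha(\mu^2)}{3\pi}\,\ln\dfrac{Q^2}{\mu^2}\,}
```

The denominator shrinks as Q² grows, so α increases logarithmically with energy – from roughly 1/137 at low energies to about 1/128 at the Z mass, the trend confirmed at LEP.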
In 1955 Gell-Mann went to Caltech in Pasadena, at the invitation of Richard Feynman, and was quickly promoted to full professor – the youngest in Caltech’s history. In 1957, Gell-Mann started to work with Feynman on a new theory of the weak interaction in terms of a universal Fermi interaction given by the product of two currents and the Fermi constant. These currents were both vector and axial-vector currents, and the lepton current was the product of a charged lepton field and an antineutrino field. The “V–A” theory showed that since the electrons emitted in beta-decay are left-handed, the emitted antineutrinos are right-handed – thus parity is not a conserved quantum number. Some experiments were in disagreement with the new theory. Feynman and Gell-Mann suggested in their paper that these experiments were wrong, and so it turned out to be.
In 1960 Gell-Mann invented a new symmetry to describe the new baryons and mesons found in cosmic rays and in various accelerator experiments. He used the unitary group SU(3), which is an extension of the isospin symmetry based on the group SU(2). The two nucleons and the six hyperons are described by an octet representation of SU(3), as are the eight mesons. Gell-Mann often described the SU(3) symmetry as the “eightfold way” in reference to the eightfold path of Buddhism. At that time, it was known that there exist four Δ resonances, three Σ resonances and two Ξ resonances. There is no SU(3) representation with nine members, but there is a decuplet representation with 10 members. Gell-Mann predicted the existence and the mass of a negatively charged tenth particle with strangeness S = –3, which he called the Ω particle.
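The mass prediction rested on the roughly equal spacing of the decuplet masses with each unit of strangeness (the Gell-Mann–Okubo rule). With today’s rounded values, added here for illustration:

```latex
M_{\Sigma^*} - M_{\Delta} \;\approx\; M_{\Xi^*} - M_{\Sigma^*} \;\approx\; 150\ \text{MeV}
% Delta(1232), Sigma*(1385), Xi*(1530): extrapolating one more step gives
M_{\Omega} \;\approx\; M_{\Xi^*} + 150\ \text{MeV} \;\approx\; 1680\ \text{MeV}
```

The Ω was found at 1672 MeV, remarkably close to the extrapolation.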
The Ω is unique in the decuplet: due to its strangeness it could only decay by the weak interaction, and so would have a relatively long lifetime. This particle was discovered in 1964 by Nicholas Samios and his group at Brookhaven National Laboratory, at the mass Gell-Mann had predicted. The SU(3) symmetry was very successful and in 1969 Gell-Mann received the Nobel Prize in Physics “for his contributions and discoveries concerning the classification of elementary particles and their interactions.”
In 1962 Gell-Mann proposed the algebra of currents, which led to many sum rules for cross sections, such as the Adler sum rule. Current algebra was the main topic of research in the following years and Gell-Mann wrote several papers with his colleague Roger Dashen on the topic.
Quark days
In 1964 Gell-Mann discussed the triplets of SU(3), which he called “quarks”. He proposed that quarks were the constituents of baryons and mesons, with fractional electric charges, and published his results in Physics Letters. Feynman’s former PhD student George Zweig, who was working at CERN, independently made the same proposal. But the quark model was not considered seriously by many physicists. For example, the Ω is a bound state of three strange quarks placed symmetrically in an s-wave, which appeared to violate the Pauli principle since the wave function was not antisymmetric. In 1968, quarks were found indirectly in deep-inelastic electron–proton experiments performed at SLAC.
By then it had been proposed, by Oscar Greenberg and by Moo-Young Han and Yoichiro Nambu, that quarks possess additional properties that keep the Pauli principle intact. By imagining the quarks in three “colours” – which later came to be called red, green and blue – hadrons could be considered as colour singlets, the simplest being the bound states of a quark and an antiquark (meson) or of three quarks (baryon). Since baryon wave functions are antisymmetric in the colour index, there is no problem with the Pauli principle. Taking the colour quantum number as a gauge quantum number, like the electric charge in QED, yields a gauge theory of the strong interactions: colour symmetry is an exact symmetry and the gauge bosons are massless gluons, which transform as an octet of the colour group. Nambu and Han had essentially arrived at quantum chromodynamics (QCD), but in their model the quarks carried integer electrical charges.
The quark model was not considered seriously by many physicists
I was introduced to Gell-Mann by Ken Wilson in 1970 at the Aspen Center for Physics, and we worked together for a period. In 1972 we wrote down a model in which the quarks had fractional charges, proposing that, since only colour singlets occur in the spectrum, fractionally charged quarks remain unobserved. The discovery in 1973 by David Gross, David Politzer and Frank Wilczek that the self-interaction of the gluons leads to asymptotic freedom – whereby the gauge coupling constant of QCD decreases as the energy is increased – strongly suggested that quarks are forever confined. It was rewarded with the 2004 Nobel Prize in Physics, although a rigorous proof of quark confinement is still missing.
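The sign of the effect can be read off the leading-order QCD coupling (a standard textbook formula, included here for illustration):

```latex
\alpha_s(Q^2) \;=\; \frac{12\pi}{\,(33 - 2 n_f)\,\ln(Q^2/\Lambda^2)\,}
```

For any realistic number of quark flavours n_f the coefficient 33 – 2n_f is positive, so α_s falls as Q² grows (asymptotic freedom) and grows strong at low energies – the heuristic behind confinement.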
Gell-Mann did not just contribute to the physics of strong interactions. In 1979, along with Pierre Ramond and Richard Slansky, he wrote a paper discussing details of the “seesaw mechanism” – a theoretical proposal, introduced a couple of years earlier, to account for the very small values of the neutrino masses. After 1980 he also became interested in string theory. His wide-ranging interests in languages and other areas beyond physics are also well documented.
I enjoyed working with Murray Gell-Mann. We had similar interests in physics, and we worked together until 1976 when I left Caltech and went to CERN. He visited often. In May 2019, during a trip to Los Alamos Laboratory, I was fortunate to have had the chance to visit Murray at his house in Santa Fe one last time.
One of the 20th century’s most amazing brains has stopped working. Nobel laureate Murray Gell-Mann died on 24 May at the age of 89. It is impossible to write a complete obituary of him, since he had so many dimensions that some will always be forgotten or neglected.
Murray was the leading particle theorist in the 1950s and 1960s in a field that had attracted the brightest young stars of the post-war generation. But he was also a polyglot who could tell you any noun in at least 25 languages, a walking encyclopaedia, a nature lover and a protector of endangered species, who knew all the flowers and birds. He was an early environmentalist, but he was so much more. It has been one of the biggest privileges in my life to have worked with him and to have been a close friend of his.
Murray Gell-Mann was born into a Jewish immigrant family in New York six weeks before the stock-market crash of October 1929. He was a late child, with a brother who was nine years older and relatively old parents. He used to joke that he had been born by accident. His father had failed in his studies and, after Murray’s birth, worked as a guard in a bank vault. Murray was never particularly close to his father, but often talked about him.
Child prodigy
According to family legend, the first words that Murray spoke were “The lights of Babylon”, when he was looking at the night sky over New York at the age of two. At three, he could read and multiply large numbers in his head. At five he could correct older people’s language in discussions. His interest in numismatics had already begun: when a friend of the family showed him what he claimed was a coin from Emperor Tiberius’ time, Murray corrected the pronunciation and said it was not from that time. At the age of seven, he participated in – and won – a major annual spelling competition in New York for students up to the age of 12. The last word that only he could spell and explain was “subpoena”, also citing its Latin origins and correcting the pronunciation of the moderator.
By the age of nine he had essentially memorised the Encyclopaedia Britannica. The task sounds impossible, but some of us tested it behind his back once in the 1970s. The late Myron Bander had looked up and studied an obscure word and steered the lunch discussion on to it. Of course Murray knew the word. He even recalled the previous and subsequent entries on the page.
Murray’s parents didn’t know what to do with him, but his piano teacher (music was not his strong suit) persuaded them to apply for a scholarship so that he could start at a good private school. He was three years younger than his classmates, yet they always looked to him to see if he approved of what the teachers said. His tests were faultless, with rare exceptions. Once he came home having scored “only” 97%, to which his father asked: “How could you miss this?” His brother, who was more “normal”, was a great nature lover and became a nature photographer and later a journalist. He taught Murray about birds and plants, which would become a lifelong passion.
At the age of 15, he finished high school and went to Yale. He did not know which subject to choose as a major, since he was interested in so many. It became physics, partly to please his father, who had insisted on engineering so that Murray could get a good job. He then went to MIT for his doctoral studies, with the legendary Victor “Viki” Weisskopf as his advisor. Murray wanted to do something pioneering, but did not succeed. He tried for a whole semester while also studying Chinese, learning enough characters to read texts. He finally decided to present a thesis in nuclear physics, which was approved but which he never wanted to talk about. When Weisskopf, later in life, was asked what his biggest contribution to physics was, he answered: “Murray Gell-Mann”.
At the age of 21 Murray was ready to fly and went to the Institute for Advanced Study (IAS) as one of Robert Oppenheimer’s young geniuses. The next year he moved to the University of Chicago under Enrico Fermi, first as an instructor, becoming an associate professor within a few years. Even though he had not yet produced outstanding work, he arrived in Chicago branded as a genius. At the IAS he had started to work on particle physics. He collaborated with Francis Low on renormalisation and realised that the coupling constant in a renormalisable quantum field theory runs with energy. As would happen so often, he delayed publication until 1954, by which time Petermann and Stückelberg had published the result.
This was in the aftermath of QED’s triumph, and Gell-Mann wanted to attack the strong interactions. He began his odyssey to classify all the new particles and introduced the concept of “strangeness” to characterise the kaons and the corresponding baryons, a step also taken independently by Kazuhiko Nishijima. When he was back at the IAS in 1955, Murray solved the problem of the two decay modes of the neutral kaons – KL and KS in modern language. According to him, he showed this to Abraham Pais, who said, “Why don’t we publish it?”, which they did. They were never friends after that. Murray also once told me that this was the hardest problem he ever solved.
A cavalcade of results
Aged 26, he lectured at Caltech on his renormalisation and kaon work. Richard Feynman, the greatest physicist of the day, said that he had thought he knew everything, but these things he did not know. Feynman immediately insisted that Murray come to Caltech and dragged him to the dean. A few weeks later, he was a full professor. A cavalcade of new results began to come out. Because he had difficulty letting go of his papers, they numbered just a few a year. But they were like cathedrals, with so many new details, and he came to dominate modern particle physics.
After the ground-breaking work of T D Lee and C N Yang on parity violation in the weak interactions, Gell-Mann started to work on a dynamical theory – as did Feynman. In the end the dean of the faculty forced them to publish together, and the V–A theory was born. George Sudarshan and Robert Marshak published the same result, and there was a long-lasting fight about who had told whom first. Murray’s part of the paper, the second half, is also a first sketch of the Standard Model, and every sentence is worth reading carefully. It takes students of exegetics to unveil all the glory of Murray’s texts. Murray was to physics writing what Joseph Conrad was to novel writing!
Sometimes there are people born with all the neurons in the right place
Murray then turned back to the strong interactions and, with Maurice Lévy, developed the non-linear sigma model for pion physics to formulate the partially conserved axial vector current (PCAC). This was published within days of Yoichiro Nambu’s ground-breaking paper in which he understood pion physics and PCAC in terms of spontaneous breaking of chiral symmetry. In a note added in proof they introduced a “funny” angle to describe the decay of 14O, which a few years later became the Cabibbo angle in Nicola Cabibbo’s universal treatment of the weak interactions.
Gell-Mann then made the great breakthrough when he classified the strongly interacting particles in terms of families of SU(3), a discovery also made by Yuval Ne’eman. The paper was never published in a journal and he used to joke that one day he would find out who rejected it. With this scheme he could predict the existence of the triply strange Ω– baryon, which was discovered in 1964 right where he predicted it would be. It paved the way for Gell-Mann’s suggestion in 1963 that all the baryons were made up of three fundamental particles, which in the published form he came to call quarks, after a line in James Joyce’s Finnegans Wake, “three quarks for Muster Mark”. The same idea was also put forward by George Zweig who called them “aces”. It was a very difficult thing for Murray to propose such a wild idea, and he formulated it extremely carefully to leave all doors open. Again, his father’s approval loomed in the background.
With the introduction of current algebra he had laid the ground for the explosion in particle theory during the 1970s. In 1966, Weisskopf’s 60th birthday was celebrated, and somehow Murray failed to show up. When he later received the proceedings, he was so ashamed that he did not open it. Had he done so, he would have found Nambu’s suggestion of a non-abelian gauge field theory with coloured quarks for the strong interactions. Nambu did not like fractional charges so he had given the quarks integer charges. Murray later said that, had he read this paper, he would have been able to formulate QCD rather quickly.
Legacy
When, at the age of 40 in 1969, he received the Nobel Prize in Physics as the sole recipient, he had been a heavily nominated candidate for the previous decade. Next year the Nobel archives for this period will be opened, and scholars will be able to study the material leading up to the prize. Unfortunately, his father had died a few weeks before the prize announcement. Murray once said to me, “If my father had lived two weeks longer, my life would have been different.”
During the 1950s and 1960s Gell-Mann had often been described in the press as the world’s most intelligent man. With a Nobel Prize in his pocket, the attraction to sit on various boards and committees became too strong to resist. His commitment to conserving endangered species also took up more of his time. Murray had also become a great collector of pre-Columbian artefacts and these were often expensive and difficult to obtain.
In the 1970s, he was displaced from the throne by the next generation. Murray was still the one invited to give the closing lectures at major conferences, but his own research started to suffer somewhat. In the mid-1970s, I came to Caltech as a young postdoctoral fellow. I had met him before in company, but trembled like an aspen leaf when I first met him there. He had, of course, found out where in Sweden I came from, pronounced my name just right, and demanded that everyone else in the group do so too. Pierre Ramond also arrived as a postdoc at that time, having been convinced by Murray to leave his position at Yale. After a few months we started to work together on supergravity. We did the long calculations, since Murray was often away. But he always contributed and could spot any weak link in our work immediately. Once, when we had been in the middle of solving a problem for several days, he came in, looked at what we had done and wrote the answer on the board. Two days later we arrived at exactly that result. John Schwarz, who was a world champion in such calculations, was impressed and humbled.
When I left Caltech, Murray gave me carte blanche to return as often as I wanted, and on those visits I worked with Schwarz and Michael Green developing string theory. Murray was always very positive about our work, which few others were. It was entirely thanks to him that we could develop the theory. Eventually, I could not go to the US quite as often. Murray had lost his wife in the early 1980s and never really recovered. In the mid-1980s he got the chance to set up a new institute in Santa Fe, which became completely interdisciplinary. He loved nature in New Mexico, and there he could work on the issues that he now preferred, such as linguistics and large-scale order in nature. He dropped particle physics but remained interested in what happened in the field. Edward Witten had taken over the leadership of fundamental physics, and Murray could not compete there.
Being considered the world’s most intelligent person did not make Murray very happy. He had trouble finding real friends among his peers. They were simply afraid of him. I often saw people looking away. The post-war research world is a single great world championship. For us who were younger, it was so obvious that he was intellectually superior to us that we were not disturbed by it. All the time, though, the shadow of his father was sitting on his shoulder, which led him too often to show off when he did not need to.
Sometimes people are born with all the neurons in the right place. We sometimes hear about telephone-directory geniuses, or people who know railway timetables by heart but who otherwise are intellectually normal, if not rather limited. That a few people every century also get the neurons that make them intellectually superior is amazing. Among all Nobel laureates in physics, Murray Gell-Mann stands out. Others have perhaps done just as much in their physics research and may be remembered longer, but I do not think that anyone had such breadth of knowledge. John von Neumann, the Hungarian–American mathematician who, among other things, was a pioneer of the modern computer, was another such universal genius. He could show off his knowledge of Goethe, and on his deathbed he recited to his brother the first sentence of each page of Faust. Murray was certainly a pain for American linguists, as he could say so many words in so many languages that he could always gain control of a discussion.
There are so many more stories that I could tell. Once he told me “Just think what I could have done if I had worked more with physics.” His almost crazy interest in so many areas took a lot of time away from physics. But he will still be remembered, I hope, as one of the great geniuses of the 20th century.
The 10th edition of the CERN–Latin-American School of High-Energy Physics (CLASHEP) hosted 75 students from 13 to 26 March in Villa General Belgrano in the Argentinian province of Córdoba. CLASHEP is a biennial series that takes place in different Latin-American locations. Since the first school in 2001, there has been a dramatic increase in the involvement of Latin-American groups in experimental HEP, including collaboration in the ALICE, ATLAS, CMS and LHCb experiments at CERN. The schools have played an important role in fostering this increased interest and participation in HEP in the region, as well as reinforcing existing activities and training young scientists.
The first schools in 2001 and 2003 took place in Brazil and Mexico, two countries in Latin America that already had substantial involvement in experimental HEP, followed by Argentina in 2005. María Teresa Dova of the Universidad Nacional de La Plata (UNLP) recalled that this first Argentinian school was a “strong catalyst” for Latin-American groups joining the LHC experimental programme. In due course, both UNLP and the Universidad de Buenos Aires formally joined ATLAS with support from the national funding agencies ANPCyT and CONICET.
The fourth school, in Chile in 2007, gave unprecedented visibility to CERN and the LHC in a country that, until then, had no experimental HEP activity. Claudio Dib, the local director of the school, remarked that this was a key event in reaching agreements for the inclusion of Chile in the ATLAS experiment; the CERN and ATLAS representatives present were personally introduced to the authorities of the universities and the national funding agency, CONICYT. Following the fifth event in Colombia, in 2009, where there were also constructive meetings with the national funding agency and universities, the school returned to Brazil for a second time in 2011.
The Pontificia Universidad Católica del Perú hosted the seventh school, in Peru in 2013, marking it with a special supplement of the university magazine dedicated to the work of local school director Alberto Gago’s group, which participates in the ALICE experiment and in neutrino experiments at Fermilab. Gago commented that the impact of the school had been “impressive and far beyond [his] expectations”. Similarly, discussions connected with the eighth school in Ecuador in 2015 were very important in stimulating interest in HEP within the universities and government agencies. This advanced the plans for the Escuela Politécnica Nacional and the Universidad San Francisco de Quito (USFQ) to join the CMS collaboration, supported by the national funding agency, Senescyt. USFQ’s rector Carlos Montúfar Freile described the school as a milestone for physics in Ecuador. In 2017 the school returned to Mexico for a second time, with strong interest and encouragement from the national funding agency, CONACyT.
There has been a dramatic increase in the involvement of Latin-American groups in experimental HEP
The 75 students attending this year’s school were of 17 different nationalities and more than 30% were women. Most came from universities in Latin America, while 15 were from European institutes. Lectures on HEP theory and experiment were given by leading scientists from both sides of the Atlantic, with special lectures on gravitational waves and cosmological collider physics by prominent Argentinian physicists Gabriela González (spokesperson of LIGO when gravitational waves were discovered in 2016) and Juan Martín Maldacena (winner of the 2012 Breakthrough Prize in Fundamental Physics). In addition to 50 hours reserved for plenary lectures, parallel group discussions were held for 90 minutes most afternoons. CERN Director-General Fabiola Gianotti took part in a lively Q&A session by video link.
The school also received visits from senior representatives of the Universidad Nacional de Córdoba (UNC), including Gustavo Monti, who is president of the Argentinean Physical Society, and Francisco Tamarit, a director of the national research council CONICET.
Building on the tradition of the last few schools in the series, outreach activities were organised at UNC in the city of Cordoba. María Teresa Dova from UNLP, again the local director of the school, explained experimental particle physics to a general audience, and Juan Martín Maldacena, who was awarded an honorary doctorate, talked about black holes and the structure of space–time.
On 20 May, 144 years after the signing of the Metre Convention in 1875, the kilogram was given a new definition based on the Planck constant, h. Long tied to the International Prototype of the Kilogram (IPK) – a platinum–iridium cylinder kept in Paris – the kilogram was the last SI base unit still defined by a human-made artefact rather than by fundamental constants or atomic properties.
The dimensions of h are m2 kg s–1. Since the second and the metre are defined in terms of a hyperfine transition in caesium-133 and the speed of light, knowledge of h allows the kilogram to be set without reference to the IPK.
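Written out – a sketch of the official formulation, using the exact fixed values of c, ΔνCs and h – the new definition amounts to:

\[
1\ \mathrm{kg} \;=\; \frac{h}{6.626\,070\,15\times10^{-34}}\ \mathrm{m}^{-2}\,\mathrm{s}
\;=\; \frac{(299\,792\,458)^{2}}{(6.626\,070\,15\times10^{-34})\,(9\,192\,631\,770)}\,\frac{h\,\Delta\nu_{\mathrm{Cs}}}{c^{2}},
\]

where ΔνCs is the caesium-133 hyperfine frequency defining the second and c the speed of light defining the metre; the combination hΔνCs/c² indeed carries the dimension of mass.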
Measuring h to a suitably high precision of 10 parts per billion required decades of work by international teams across continents. In 1975 British physicist Bryan Kibble proposed a device, then known as a watt balance and now renamed the Kibble balance in his honour, which linked h to the unit of mass. A coil is placed inside a precisely calibrated magnetic field and a current driven through it such that an electromagnetic force on the coil counterbalances the force of gravity. The experiment is then repeated thousands of times over a period of months in multiple locations. The precision required is such that the strength of the gravitational field, which varies across the laboratory, must be measured before each trial.
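A minimal numerical sketch (with purely hypothetical values for the flux integral Bl, the coil velocity and the local g) shows how combining the balance’s two modes eliminates the hard-to-measure magnetic geometry:

```python
# Kibble-balance principle, illustrative numbers only (not real data).
# Weighing mode: the electromagnetic force on the coil balances gravity:
#     m * g = (B*l) * I
# Moving mode: moving the same coil at velocity v induces a voltage:
#     U = (B*l) * v
# Dividing the two relations eliminates the geometric factor B*l:
#     m = U * I / (g * v)

g = 9.80665      # local gravitational acceleration, m/s^2 (assumed)
Bl = 700.0       # flux integral B*l, in T.m (hypothetical)
m_true = 1.0     # test mass, kg

I = m_true * g / Bl   # balancing current in the weighing mode, A
v = 0.002             # coil velocity in the moving mode, m/s (hypothetical)
U = Bl * v            # induced voltage, V

m_measured = U * I / (g * v)  # mass recovered from electrical quantities
print(m_measured)
```

In a real balance, U and I are measured via the Josephson and quantum Hall effects, which is what ties the mass to h; the sketch above only shows why Bl drops out of the final result.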
Once the required precision was achieved, the value of h could be fixed and the definitions inverted, removing the kilogram’s dependence on the IPK. Following several years of deliberations, the new definition was formally adopted at the 26th General Conference on Weights and Measures in November last year. The 2019 redefinition of the SI base units came into force in May, and also sees the ampere, kelvin and mole redefined by fixing the numerical values for the elementary electric charge, the Boltzmann constant and the Avogadro constant, respectively.
“The revised SI future-proofs our measurement system so that we are ready for all future technological and scientific advances such as 5G networks, quantum technologies and other innovations that we are yet to imagine,” says Richard Brown, head of metrology at the UK’s National Physical Laboratory.
But the SI changes are controversial in some quarters. While heralding the new definition of the kilogram as “huge progress”, CNRS research director Pierre Fayet warns of possible pitfalls of fixing the value of the elementary charge: the vacuum magnetic permeability (μ0) then becomes an unfixed parameter to be measured experimentally, with the electrical units becoming dependent on the fine-structure constant. “It appears to me as a conceptual weakness of the new definitions of electrical units, even if it does not have consequences for their practical use,” says Fayet.
One way out of this, he suggests, is to embed the new SI system within a larger framework in which c = ħ = μ0 = ε0 = 1, thereby fixing the vacuum magnetic permeability and other characteristics of the vacuum (C. R. Physique 20 33). This would allow all the units to be expressed in terms of the second, with the metre and the joule identified as fixed numbers of seconds and reciprocal seconds, respectively. While likely attractive to high-energy physicists, Fayet accepts that it may be some time before such a proposal could be adopted.
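For example, under these conventions (the value of ħ is quoted here only for illustration):

\[
1\ \mathrm{m} \;=\; \frac{1}{299\,792\,458}\ \mathrm{s} \quad (c=1),
\qquad
1\ \mathrm{J} \;=\; \frac{1}{1.054\,571\,817\ldots\times10^{-34}}\ \mathrm{s}^{-1} \quad (\hbar=1),
\]

the first following from distance = c × time, the second from E = ħω, so that lengths become times and energies become frequencies.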
In the heart of Beirut in a five-storey house owned by the Lebanese national telecommunication company, floors are about to be coated to make them anti-static, walls and ceilings will be insulated, and cabling systems installed so wires don’t become tangled. These and other details are set to be complete by mid-2020, when approximately 3000 processor cores, donated by CERN, will arrive.
The High-Performance Computing for Lebanon (HPC4L) project is part of efforts by Lebanese scientists to boost the nation’s research capabilities. Like many other countries that have been through conflict and seen their highly skilled graduates leave to seek better opportunities, Lebanon is trying to stem its brain drain. Though the new facility will not be the only HPC centre in the country, it is different because it involves both public and private institutions and has the full support of the government. “There are a few small-scale HPC facilities in different universities here, but they suffer from being isolated and hence are quickly outdated and underused,” says physicist Haitham Zaraket of Lebanese University in Beirut. “This HPC project puts together the main players in the realm of HPC in Lebanon.”
Having joined the LHC’s CMS experiment in 2016, Lebanese physicists want to develop the new facility into a CMS Tier-2 computing centre. High-speed internet will connect it to universities around the world and HPC4L has a mandate to ensure operation, maintenance, and user-interfacing for smooth and effective running of the facility. “We’ve been working with the government, private and public partners to prepare not just the infrastructure but also the team,” explains HPC4L project coordinator Martin Gastal of CERN. “CERN/CMS’s expertise and knowledge will help set up the facility and train users, but the team in Lebanon will run it themselves.” The Lebanese facility will also be used for computational biology, oil and gas discovery, financial forecasting, genome analysis and the social sciences.
Nepal is another country striving for greater digital storage and computing power. In 2017 Nepal signed a cooperation agreement with CERN. The following year, around 2500 cores from CERN enabled an HPC facility to be established at the government-run IT Park, with experts from Kathmandu University forming its core team. Rajendra Adhikari, project leader of Nepal’s HPC centre (pictured, second from right), also won an award from NVIDIA of its latest graphics card, worth USD 3000, and added it to the system. Nepal has never had computing on such a scale before, says Adhikari. “With this facility, we can train our students and conduct research that requires high-performance computing and data storage, from climate modelling and earthquake simulations to medical imaging and basic research.”
The Nepal facility is planning to store health data from hospitals, which is often deleted because of lack of storage space, and tests are being carried out to process drone images taken to map topography for hydropower feasibility studies. Even in the initial phases of the new centre, says Adhikari, computing tasks that used to take 45 days can now be processed in just 12 hours.
The SESAME light source in Jordan, which itself received 576 cores from CERN in 2017, is also using its experience to assist neighbouring regions in setting up and maintaining HPC facilities. “High-performance computing is a strong enabler of research capacity building in regions challenged by limited financial resources and talent exodus,” says Gastal. “By supporting the setup of efficient data-processing and storage facilities, CERN, together with affiliated institutes, can assist fellow researchers in investing in the scientific potential of their own countries.”
An event held at CERN on 20–21 May revealed the 170 projects that have each been granted €100,000 of European Union (EU) funding to develop disruptive detection and imaging technologies. The successful projects, drawn from more than 1200 proposals from researchers in scientific and industrial organisations across the world, now have one year to prove the scientific merit and innovation potential of their ideas.
The 170 funded projects are part of the Horizon 2020 ATTRACT project funded by the EU and a consortium of nine partners, including CERN, the European Southern Observatory (ESO), European Synchrotron Radiation Facility (ESRF), European XFEL and Institut Laue-Langevin. The successful projects are grouped into four broad categories: data acquisition systems and computing; front-end and back-end electronics; sensors; and software and integration.
CERN researchers are involved in 19 of the projects, in areas from magnets and cryogenics to electronics and informatics. Several of the selected projects involve the design of sensors or signal-transmission systems that operate at very low temperatures or in the presence of radiation, and many target applications in medical imaging and treatment or in the aerospace sector. Others seek industrial applications, such as 3D printing of systems equipped with sensors, the inspection of operating cryostats or applications in environmental monitoring.
ESO’s astronomical technology and expertise will be applied to an imaging spectrograph suitable for clinical cancer studies and to single-photon visible-light imagers for adaptive optics systems and low-light-level spectroscopic and imaging applications. Among other projects connected with Europe’s major research infrastructures, four projects at the ESRF concern adaptive algebraic speckle tomography for clinical studies of osteoarticular diseases, a novel readout concept for 2D pixelated detectors, the transferral of indium-gallium-nitride epilayers onto substrates for full-spectrum LEDs, and artificial intelligence for the automatic segmentation of volumetric microtomography images.
“170 breakthrough ideas were selected based on a combination of scientific merit, innovation readiness and potential societal impact,” explained Sergio Bertolucci, chair of ATTRACT’s independent research, development and innovation committee. “The idea is to speed up the process of developing breakthrough technologies and applying them to address society’s key challenges.”
The outcomes of the ATTRACT seed-funding will be presented in Brussels in autumn 2020, and the most promising projects will receive further funding.
The open symposium of the European Strategy for Particle Physics (ESPP), which took place in Granada, Spain, from 13–16 May, revealed a vibrant field in flux as it grapples with how to attack the next big questions. Opening the event, chair of the ESPP strategy secretariat, Halina Abramowicz, remarked: “This is a very strange symposium. Normally we discuss results at conferences, but here we are discussing future results.” More than 10 different future-collider modes were under discussion, and the 130 or so talks and discussion sessions showed that elementary particle physics – in the wake of the discovery of the Higgs boson but so far no evidence of particles beyond the Standard Model (SM) – is transitioning into a new and less well-mapped realm of fundamental exploration.
Plain weird
Theorist Pilar Hernández of the University of Valencia described the SM as plain “weird”. The model’s success in describing elementary particles and their interactions is beyond doubt, but as an all-encompassing theory of nature it falls short. Why are the fermions arranged into three neat families? Why do neutrinos have an almost imperceptibly small mass? Why does the discovered Higgs boson fit the simplest “toy model” of itself? And what lies beneath the SM’s numerous free parameters? Similar puzzles persist about the universe at large: the mechanism of inflation; the matter–antimatter asymmetry; and the nature of dark energy and dark matter.
While initial results from the LHC severely constrain the most natural parameter spaces for new physics, said Hernández, the 10–100 TeV region is an interesting scale to explore. At the same time, she argued, there is a shift to more “bottom-up, rather than top-down” approaches to beyond-SM (BSM) physics. The new quarry includes axion-like and long-lived particles, and searches for hidden, dark and feebly interacting sectors – in addition to studying the Higgs boson, which has deep connections to many puzzles in the SM, with much greater precision. “Particle physics could be heading to crisis or revolution,” said Hernández.
Normally we discuss results at conferences, but here we are discussing future results
The accelerator, detector and computing technologies needed for future fundamental exploration are varied and challenging. Reviewing Higgs-factory programmes, Vladimir Shiltsev, head of Fermilab’s Accelerator Physics Center, weighed up the pros and cons of linear versus circular machines. The former include the International Linear Collider (ILC) and the Compact Linear Collider (CLIC); the latter a future circular electron–positron collider at CERN (FCC-ee) and the Circular Electron Positron Collider in China (CEPC). Linear colliders, said Shiltsev, are based on mature designs and organisation, are expandable to higher energies, and draw a wall-plug power similar to that of the LHC. On the other hand, they face challenges including their luminosity and number of interaction points. Circular Higgs factories offer higher luminosity and more interaction points than the linear options, but require R&D into high-efficiency RF sources and superconducting cavities, said Shiltsev.
For hadron colliders, the three current options – CERN’s FCC-hh (100 TeV), China’s SppC (75 TeV) and a high-energy LHC (27 TeV) – demand next-generation superconducting dipole magnets. Akira Yamamoto of CERN/KEK said that while a lepton collider could begin construction in the next few years, the dipoles necessary for a hadron collider might take 10 to 15 years of R&D before construction could start.
The symposium also saw much discussion about muon colliders, which offer an energy-frontier lepton collider but for which it was widely acknowledged the technology is not yet ready. Concerning more futuristic acceleration technologies based on plasma wakefields, impressive results at facilities such as BELLA at Berkeley and AWAKE at CERN were on show.
Thinking ahead
From colliders to fixed-target to astrophysics experiments, said Francesco Forti of INFN and the University of Pisa, detectors face a huge variety of operating conditions and employ technologies deeply entwined with developments in industry. Another difficulty, he said, is how to handle non-standard physics signals, such as long-lived particles and monopoles. Like accelerators, detectors require long time scales – it was the very early 1990s when the first conceptual design reports for the LHC detectors were written.
In terms of data processing, the challenges ahead are immense, said Simone Campana of CERN and the HEP Software Foundation. The high-luminosity LHC (HL-LHC) presents a particular challenge, but DUNE, FAIR, Belle II and other experiments will also create unprecedented data samples, on top of the need to generate ever-more Monte Carlo samples. At the same time, noted Campana, the rate of advance in hardware performance has slowed in recent years, forcing the community towards graphics processing units, high-performance computing and commercial cloud services. Forti and Campana both argued for better career opportunities and greater recognition for physicists who devote their time to detector and computing efforts.
The symposium also showed that the strategic importance of communications, education and outreach is becoming increasingly recognised.
Discussions in Granada revealed a community united in its desire for a post-LHC collider, but not in its choice of that collider’s form. Stimulating some heated exchanges, the ESPP saw proposals for future machines pitted against each other and against expectations from the HL-LHC in terms of their potential physics reach for key targets such as the Higgs boson.
Big questions
Gian Giudice, head of CERN’s Theory Department, said that the remaining BSM-physics space is “huge”, and pointed to four big questions for colliders: to what extent can we tell whether the Higgs is fundamental or composite? Are there new interactions or new particles around or above the electroweak scale? What cases of thermal relic WIMPs are still unprobed and can be fully covered by future collider searches? And to what extent can current or future accelerators probe feebly interacting sectors?
Though colliders dominated discussions, the enormous progress in neutrino physics since the previous ESPP was clear from numerous presentations. The open-symposium audience was reminded that neutrino masses, as established by neutrino oscillations, are the first particle-physics evidence for BSM phenomena. A vibrant programme is under way to fully measure the neutrino mixing matrix and in particular the neutrino mass ordering and CP violation phase, while other experiments are probing the neutrino’s absolute mass scale and testing whether they are of a Dirac or Majorana nature.
On 17 May in Granada, following the open symposium of the European Strategy for Particle Physics, the first meeting of a new international working group on the International Linear Collider (ILC) took place. The ILC is the most technologically mature of all current future-collider options, and was at the centre of discussions at the previous strategy update in 2013. Although its technology and costs have been revised since then, there is still no firm decision on the project's location, governance or funding model. The new working group was set up by Japan's KEK laboratory in response to a recent statement on the ILC from Japan's Ministry of Education, Culture, Sports, Science and Technology (MEXT) that called for further discussions on these thorny issues. Comprising two members from Europe, two from North America and three from Asia (including Japan), the group will investigate and update several points, including: cost sharing for construction and operation; organisation and governance of the ILC; and the international sharing of the remaining technical preparations. The working group will submit a report to KEK by the end of September 2019, and the final report will be used by MEXT for discussions with other governments.
Around a fifth of the 160 input documents to the ESPP were linked to flavour physics, which is crucial for new-physics searches because it is potentially sensitive to effects at scales as high as 10⁵ TeV, said Antonio Zoccoli of INFN. Summarising dark-matter and dark-sector physics, Shoji Asai of the University of Tokyo said that a shift was taking place from the old view, in which dark-matter candidates arose as a byproduct of beyond-SM approaches such as supersymmetry, to a new paradigm in which dark matter needs an explanation of its own. Asai called for more coordination and support between accelerator-based, direct-detection and indirect-detection dark-sector searches, as exemplified by the new European Center for Astro-Particle Theory.
Jorgen D’Hondt of Vrije Universiteit Brussel listed the many dedicated experiments in the strong-interaction arena and the open questions, including: how to reach adequate precision in perturbative and non-perturbative QCD predictions at the highest energies? And how to probe the quark–gluon plasma equation of state and establish whether there is a first-order phase transition at high baryon density?
Of all the scientific themes of the week, electroweak physics generated the liveliest discussions, especially concerning how well the Higgs boson’s couplings to fermions, to gauge bosons and to itself can be probed at current and future colliders. Summary speaker Beate Heinemann of DESY cautioned that such quantitative estimates are extremely difficult to make, though a few things stand out. One is the impressive estimated performance of the HL-LHC over the next 15 years or so; another is that a long-term physics programme based on successive machines in a 100 km-circumference tunnel offers the largest overall physics reach on the Higgs boson and other key parameters. There is broad agreement, however, that the next major collider after the LHC should collide electrons and positrons to fully explore the Higgs boson and make precision measurements of other electroweak parameters.
The big picture
The closer involvement of particle physics with astroparticle physics, in particular following the discovery of gravitational waves, was a running theme. It was argued that, in terms of technology, next-generation gravitational-wave detectors such as the Einstein Telescope are essentially “accelerators without beams”, and that CERN’s expertise in vacuum and cryogenics would help to make such facilities a reality. Inputs from the astroparticle- and nuclear-physics communities, in addition to dedicated perspectives from Asia and the Americas, brought into sharp focus the global nature of modern high-energy physics and the need for greater coordination at all levels.
The open symposium of the ESPP update was a moment for physicists to take stock of the field’s status and future. The community rose to the occasion, aware that the decisions ahead will impact generations of physicists yet to be born. A week of high-quality presentations and focused discussions proved how far things have moved on since the previous strategy update concluded in 2013. Discussions illuminated both the immensity of efforts to evaluate the physics reach of the HL-LHC and future colliders, and the major task faced by the European Strategy Group (ESG) in plotting a path to the future. It is clear that new thinking, from basic theory to instrumentation, computing, analysis and global organisation, is required to sustain progress in the field.
No decisions were taken in Granada, stresses Abramowicz. “During the open symposium we mainly discussed the science. Now comes the time to assess the capacity of the community to realise the proposed scientific goals,” she says. “The Physics Preparatory Group is preparing the briefing book, which will summarise the scientific aspirations of the community, including the physics case for them.”
The briefing book is expected to be completed in September. The ESG drafting session will take place on 20–24 January 2020 in Bad Honnef, Germany, and the update of the ESPP is due to be completed and approved by CERN Council in May 2020.