ALICE studies possible light tetraquark

Radius parameters versus average transverse kaon-pair momentum determined from K0S – K± correlations and identical-kaon correlations in central ALICE lead–lead collisions.


The a0(980) resonance is formally classified by the Particle Data Group as an ordinary light meson made of a quark and an antiquark, similar to the pion. However, it has long been considered a candidate tetraquark state made up of two quarks and two antiquarks. Existing experimental evidence based on the radiative decay of the φ meson has not been convincing, so the ALICE collaboration took a different approach to studying the a0, measuring K0S – K± correlations in lead–lead collisions at the LHC. Since the kaons are not identical there is no Hanbury–Brown–Twiss interferometry enhancement, and since the K0S is uncharged there is no Coulomb effect. Nevertheless, because the combined rest mass of the two kaons lies at the threshold for producing the a0, a strong final-state interaction between the two kaons is expected through the a0 resonant channel.


Both the radii and the emission strength from the K0S – K± analysis agree with the identical-kaon results, suggesting that the final-state interaction between the K0S and K± proceeds solely through the a0 resonance, without competing non-resonant channels. A tetraquark a0 is expected to couple more strongly to the two kaons, since it has the same quark content, whereas forming a quark–antiquark state requires the annihilation of the strange quarks, which is suppressed by geometric effects and a selection rule. Although there are no quantitative predictions for the magnitude of the suppression that would result for a quark–antiquark form of the a0, the qualitative expectation is that it would open up non-resonant channels competing with the a0 final-state interaction, making the extracted parameters smaller than the identical-kaon values. The ALICE result that the final-state interaction proceeds solely via the a0 thus favours the interpretation of the a0 as a tetraquark state.

First intermediate black-hole candidate

Since black holes were first predicted a century ago, numerous candidates have been found. They fall into two classes: stellar-mass black holes, weighing several times the mass of the Sun, and supermassive black holes (SMBHs), weighing millions to billions of solar masses, which are thought to occupy the centres of galaxies. While many candidates exist for both classes, candidates for black holes in the intermediate mass range have been lacking.

A group of researchers from Keio University in Japan has now shown strong evidence for the existence of an intermediate-mass black hole (IMBH) within the Milky Way, which could shed light on the formation of black holes and of our galaxy.

While there is a consensus that stellar-mass black holes form when massive stars die, the source of SMBHs – one of which is thought to be at the centre of the Milky Way – is not well known. It is believed that large galaxies such as the Milky Way grew to their current size by cannibalising smaller dwarf galaxies containing IMBHs at their centres. Finding a candidate IMBH would provide evidence for this theory.

First, using the Nobeyama radio telescope, the team detected a gas cloud in the Milky Way with a peculiar velocity profile, hinting that an IMBH exists near the centre of our galaxy. This prompted more precise observations of the area using the Atacama Submillimeter Telescope Experiment (ASTE) and the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile. The cloud, named CO-0.40-0.22, was found to consist of one dense cloud in the centre with a broad velocity profile, surrounded by 20 smaller clouds whose velocity profiles are aligned. The probability of these clouds being aligned by chance is less than one part in 10⁸, suggesting that some other object close to the cloud is interacting with it. Within the gas cloud the data also revealed a point source emitting weak electromagnetic radiation at submillimetre wavelengths and none at shorter wavelengths, ruling out a massive star cluster.

Based on these striking observations, the group simulated the gravitational interactions of the system of clouds and found that the measured velocity profiles are consistent with a gravitational kick from a dense object of 10⁵ solar masses. Combined with the lack of high-energy emission and the object’s spectrum measured at radio wavelengths, these observations match all the characteristics of an IMBH – the first one ever observed. Two further IMBH candidates are now under study. The finding opens a new research avenue for understanding both massive and supermassive black holes, and strengthens the hypothesis that our galaxy grew by cannibalising smaller ones.

The physicist’s guide to the universe

Teasing out the intricate measurements that separate the smallest components of matter in the universe often involves monumental machines and huge international scientific collaborations. So it’s important that particle physicists are on the same page – or, rather, pages – when it comes to particle-physics results. For the past 60 years, the definitive collection of particle-physics evaluations and reviews has been bound up in a weighty print volume called the Review of Particle Physics, which is published every other year. The latest (2016) edition of what is sometimes referred to as the “bible of particle physics” contains 1808 pages in the complete version published online, and features 117 review articles on topics ranging from the Higgs boson to the Big Bang, statistics and particle detectors. Its Particle Listings include evaluations of 3062 new measurements from 721 papers, in addition to 35,436 measurements from 9843 papers published in earlier editions. The staff behind it carefully evaluate data on around 8000 different quantities to provide averages, fits and best limits.

The Review is the all-time most highly cited publication in particle physics, with recent editions eventually reaching more than 6000 citations. It also has a companion 344-page booklet, the descendant of the “wallet cards” first issued in 1957, which summarises the particle data from the main Review. The PDG website (pdg.lbl.gov) features the complete content of the book, both as PDF files and in an interactive version, with downloadable figures and tables, as well as educational materials. The Review continues to grow as we learn more about the basic constituents of matter, and its history reflects a field that is continuously evolving.

Berkeley beginnings

The Review of Particle Physics and its associated publications and website are the products of the international Particle Data Group (PDG), which since its beginnings has been headquartered at the University of California Radiation Laboratory, now the Lawrence Berkeley National Laboratory (Berkeley Lab), in California. More than 200 authors around the globe currently contribute to the contents of the Review, including 3.5 full-time-equivalent physicists in the PDG group at Berkeley Lab who also co-ordinate the effort.

The story began towards the end of 1957 with a paper in the Annual Review of Nuclear Science authored by the late Arthur “Art” Rosenfeld and Murray Gell-Mann. The tables of particle masses and lifetimes associated with that article, which Rosenfeld prepared with Walter Barkas in an unpublished report, “Data for Elementary-Particle Physics,” are credited with PDG’s inception. “The damn thing just grew,” Rosenfeld said of the wallet-card summary of that first report, which now fills a spiral-bound booklet.

Rosenfeld said in 1975 that the motivation for the original 1957 report was to provide particle data for the early computer programs used to process data from new particle-physics experiments, including bubble-chamber experiments. The report was revised the following year and again in 1961, and during those first few years Rosenfeld and his colleagues intermittently distributed updates of the report to the particle-physics community, along with the updated wallet cards.

New discoveries in the field led to a growing need for particle-data resources, and Rosenfeld was clear that the 1963 edition should be the last attempted without the help of a computer. A separate effort by Finnish physicist Matts Roos called “Tables of Elementary Particles and Resonant States” also illustrated that it was no longer possible for a single person to compile data critically, reckoned Rosenfeld. So the two separate efforts joined forces, with five Berkeley authors and Roos publishing “Data on Elementary Particles and Resonant States” in 1964. This article, which appeared in the Reviews of Modern Physics journal, comprised 27 pages plus three wallet cards.

The group branded itself as the Particle Data Group in 1968 and published its first data booklet that year. By 1974 the report, by then called the Review of Particle Properties, had grown to 200 pages and had 13 authors, several of them based in Europe. An escalation of discoveries during the mid-1970s provided the cornerstones of the Standard Model of particle physics, which described the family of known and theorised particles and their properties. The heavy crush of particle data flowing into PDG during this period led the staff to adopt a new medium for distributing some of the data and to introduce additional quality-control measures. In 1973 a microfiche with references and backup material was included in an envelope at the back of the book.

Since then, the population of particle physicists worldwide has exploded and the print version of the Review and related booklets are currently distributed to thousands of physicists. INSPIRE, an information system that tracks published materials and experiments in the field of high-energy physics, now counts more than 1100 active experiments in the field, compared to about 300 in 1975, and the number of particle physicists has also increased from about 7000 in 1975 to an estimated 20,000 today. The print book was getting so big – growing at a rate of about 10% per year – that the PDG dropped its Listings from the print edition in 2016.

“We would have had to print two volumes if we continued to include the Listings. Given that there is likely no single person who wants to read through a major fraction of the Listings, this wasn’t justified,” recalls Juerg Beringer of Berkeley Lab, who became leader of the PDG in 2016. “Looking up data from the Listings online is anyway far more convenient.” The Listings are still available on the PDG website and included in the online journal publication.

Review articles

Many sections in the Listings are accompanied by review articles that provide further information on the data presented. Other review articles summarise major topics in particle physics or cosmology. Review articles can vary from about a page to tens of pages in length, and roughly two-thirds of review articles require updates in each edition. The first PDG review article on the Higgs boson, which appeared in 1988, was two pages long. Today, five years after the Higgs was discovered, the review is about 50 pages long and is the most viewed review on the website, with more than 50,000 downloads each year. PDG even delayed publication in 2012 to accommodate the Higgs boson’s discovery, with staff scrambling to add the Higgs addendum to the Review within eight days of the discovery’s announcement on 4 July 2012.

The scope and importance of PDG has grown substantially, especially during the past 30 years (figure 1). While the size of the PDG group at Berkeley Lab has remained essentially the same, a large number of physicists worldwide were recruited to keep up with the flood of publications in particle physics and write the many PDG review articles that now cover almost every aspect of particle physics. There are now 223 authors who contribute to the review articles or Listings and each will typically write a single review article or handle one Listings section. Collaborators outside Berkeley Lab are volunteers who usually spend only a small fraction of their time on the Review, while PDG group members at Berkeley Lab typically spend half of their time working on the PDG (see image). There is also a European-based PDG “meson team” of about a dozen members, which holds meetings twice a year at CERN, while another PDG sub-group called the baryon team is responsible for the data on baryon resonances.

Michael Barnett, a Berkeley Lab physicist who headed the PDG for 25 years, recalled his first experience of working on the Review’s production when he joined Berkeley Lab in 1984. “It was barely 300 pages long and still put together by hand,” Barnett said. “We used 20 rolls of Scotch Tape to stick pieces together for the camera-ready copy. The section on B mesons was a single page. These days the B-meson section alone is over 120 pages.” In earlier days, the data for the publications were stored on computer punch cards. The printed data back then appeared in uppercase letters only, with no mathematical symbols, because the punch cards couldn’t accommodate them. Under Barnett’s watch the design and layout became more reader-friendly: particle categories multiplied, with properties listed in detail, and many new reviews were added to help explain the content of the Listings sections.

Computing era

In the late 1980s a then-modern computing system was developed that served the PDG well for two decades. But a major upgrade eventually became inevitable, and the COMPAS group from the Institute of High Energy Physics in Protvino, Russia, which had been a PDG collaborator for many years, began working on prototypes for a new computing system. Working with COMPAS and experts from Berkeley Lab’s Computational Research Division, Beringer led the development of a new web-based computing platform that was supported by a special grant from the US Department of Energy (DOE). As a result, each collaborator can now directly add data to the PDG database rather than channelling it all through the PDG editor. This platform has made Review updates far more manageable. “The new system allows collaborators to see changes immediately, without waiting for the editor to go through thousands of e-mails with instructions on what to change,” says Piotr Zyla, who succeeded Betty Armstrong as PDG editor in 2003.

As with any large-scale, data-intensive publishing endeavour, there have been a few notable glitches. The 1994 booklet had a ruler with centimetre marks that were shrunk by the publisher so that each centimetre was actually 0.97 of a centimetre. The error was discovered too late to fix, but not too late to insert a disclaimer citing fictitious and comical explanations for why the centimetres fell a bit short: “The PDG feels it has the right to redefine anything it wants”; “The booklets were returned from the printer at 0.25 times the speed of light”; and “A theorist is in charge of the PDG.”

Barnett and his colleagues had considered publishing the Review on the internet since the early days of the World Wide Web – which, of course, was created at CERN in 1989 to more easily share research data around the world. The entire contents of the Review were made available on the web in 1995, and the interactive version, pdgLive, appeared with the 2006 edition. An increasingly sophisticated PDG web presence has been shaped by membership surveys asking readers whether, in the digital age, a printed book is essential, useful or altogether unnecessary. The first survey, in 2000, drew about 2450 responses: half of the respondents found the print version useful and well over a third found it essential. By 2014 the number of responses had tripled. While there was a clear trend in favour of online publication, many respondents still emphasised the importance of the printed book. As one respondent stated in the 2000 survey, “I could live without my right arm, but I don’t want to.”

“We expected older physicists to be the ones who valued the book and the younger ones, who grew up with the internet, not to care,” says Barnett. “We got it backwards. Everybody used the web, but more grad students and postdocs found the printed book essential.” Their comments told why: to those entering physics, the book was not merely a reference but an introduction to the unexplored dimensions of their field. The distribution scheme for the print publications has become fairly sophisticated to minimise shipping costs. There are now four separate distribution channels: in Switzerland, Japan, China and the US. Receiving the print materials is not automatic and recipients must specifically request each new edition. The audience is largely physicists, teachers, students and physics fans, with most mailings going out to high-energy physics centres and academic institutions.

The bulk of the funding for PDG comes from the Office of Science of the DOE and supports the co-ordination and production activities at Berkeley Lab. Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) contributes to these efforts via a US–Japan agreement on co-operative research and development. CERN supports the meson team, and in recent years CERN and the Institute of High Energy Physics of the Chinese Academy of Sciences have paid for most of the printing and shipping costs of books and booklets. Funding agencies in multiple countries, including INFN in Italy, MINECO in Spain, and IHEP in Russia provide travel and other support to PDG collaborators in their countries.

Until recently, the PDG group at Berkeley Lab was able to handle most of the PDG co-ordination tasks. But with the growth of PDG in recent years, combined with a challenging funding environment, even this has become increasingly difficult. Thankfully, INFN recently agreed to help Berkeley Lab in this area. A recent effort to streamline and automate many aspects of PDG’s operations is also providing necessary relief.

Contributing knowledge

The published results collected in the Listings provide best values and limits for a wide range of particle properties. The data can also be used to study how knowledge in particle physics evolves, for example by plotting the evolution of PDG best values over time (figure 2 and table).

Over the decades there have been occasional disputes and discrepancies for PDG staff to resolve. In one instance, discussions escalated to a threatened lawsuit over PDG’s refusal to include one researcher’s particle data in the Review’s Summary Tables. There was also a case in which experimental measurements of the mass squared of one particle (the electron neutrino) came out negative: since a negative mass squared is unphysical, the PDG editors adjusted the error margins to account for the problem. Another unusual episode concerned claims of pentaquark discoveries about a decade ago and the later experiments that discounted those claims. These ups and downs, including the latest measurements from the LHCb experiment, were covered in the reviews to keep readers up to date.

When various data are in substantial conflict, PDG sets error bars that range across the whole span of results, or, in some cases, provides no average at all. Also, about 20 years ago, the PDG instituted a new naming scheme that more logically renamed many particles. All of them stuck except for one – there was an international campaign against that name-change, so the PDG staff deferred in this one instance.

Only a subset of data collected by PDG is available in a downloadable format suitable for further processing. There is a demand for such access from researchers running Monte Carlo programs and others who want to, for example, investigate the statistical properties of the agreement between multiple measurements of the same quantity. “Making all PDG data available in a machine-readable format is a very high priority. We’ve wanted to do this for a long time as there are many uses and a lot of interest from the community. But we can barely keep up with the ongoing updates of the Review, and so the implementation of such new features takes much more time than we would like,” Beringer says.

Rosenfeld, in the conclusion of his 1975 paper assessing the work of PDG, noted challenges even then in supporting the data needs of the scientific community: “As we write this review we wonder if we have not been too modest in our requests for support…we feel that PDG is doing an effective job, but if we could spend, each year, one-fifth of the typical experiment [in those days the typical experiment cost about $3 million], it could provide broader and more timely services.”

The gradual transition from print to primarily online distribution is expected to continue, in line with the overall shift of publishers toward online publication but also, in part, because of the high cost of printing and mailing the books. Nevertheless, as long as there is continuing demand and adequate resources, PDG hopes to continue the printed book. “Producing and updating the Review of Particle Physics in modern formats will remain PDG’s core mission,” says Beringer.

Online access to PDG’s increasingly mobile-friendly web pages, or a PDG smartphone app with the complete contents of the Review in a mobile-friendly format, could in principle replace the PDG booklet. But especially for students, the PDG book and booklet also carry substantial symbolic value, and the booklets are often distributed in introductory particle-physics classes. “It is a landmark thing for some of the graduate students and postdocs,” Barnett remarks. “When you get your first book, and when you see your data appearing, you feel like you are a particle physicist.”

Birth of a symmetry

Weinberg’s paper “A Model of Leptons”, published in Physical Review Letters (PRL) on 20 November 1967, determined the direction of high-energy particle physics through the final decades of the 20th century. Just two and a half pages long, it is one of the most highly cited papers in the history of theoretical physics. Its contents are the core of the Standard Model of particle physics, now almost half a century old and still passing every experimental test.

Most particle physicists today have grown up with the Standard Model’s orderly account of the fundamental particles and interactions, but things were very different in the 1960s. Quantum electrodynamics (QED) had been well established as the description of the electromagnetic interaction, but there were no mature theories of the strong and weak nuclear forces. By the 1960s, experimental discoveries showed that the weak force exhibits some common features with QED, in particular that it might be mediated by a vector boson analogous to the photon. Theoretical arguments also suggested that QED’s underlying “U(1)” group structure could be generalised to the larger group SU(2), but there was a serious problem with such a scheme: the W boson suspected to mediate the weak force would have to be very massive empirically, whereas the mathematical symmetry of the theory required it to be massless like the photon.

The importance of symmetries in understanding the fundamental forces was already becoming clear at the time, in particular how nature might hide its symmetries. Could “hidden symmetry” lead to a massive W boson while preserving the mathematical consistency of the theory? It was arguably Weinberg’s developments, in 1967, that brought this concept to life.

Strong inspiration

Weinberg’s inspiration was an earlier idea of Nambu in which fermions – such as the proton or neutron – can behave like a left- or right-handed screw as they move. If mass is ignored, these two “chiral” states act independently and the theory leads to the existence of a particle with properties similar to those of the pion – specifically a pseudoscalar, which means that it has no spin and its wavefunction changes sign under mirror symmetry. Nambu’s original investigations, however, had not examined how the three versions of the pion, with positive, negative or zero charge, shared their common “pion-ness” when interacting with one another. This commonality, or symmetry, is mathematically expressed by the group SU(2), which had been known in nuclear physics since the 1930s and in mathematics for much longer.

It was this symmetry that Weinberg used as his point of departure in building a theory of the strong force, where nucleons interact with pions of all charges and the proton and neutron themselves form two “faces” of the underlying SU(2) structure. Empirical observations of the interactions between pions and nucleons showed that the underlying symmetry of SU(2) tended to act on the left- or right-handed chiral possibilities independently. The mathematical structure of the resulting equations to describe this behaviour, as Weinberg discovered, is called SU(2)×SU(2).

However, in nature this symmetry is not perfect because nucleons have mass. Had they been massless, they would have travelled at the speed of light, the left- and right-handed possibilities acting truly independently of one another and the symmetry left intact. That nucleons have a mass, so that the left and right states get mixed up when perceived by observers in different inertial frames, breaks the chiral symmetry. Nambu had investigated this effect as far back as 1959, but without the added richness of the SU(2)×SU(2) mathematical structure that Weinberg brought to the problem. Weinberg had been investigating this more sophisticated theory in around 1965, initially with considerable success. He derived theorems that explained the observed interactions of pions and nucleons at low energies, such as in nuclear physics. He was able to predict how pions behaved when they scattered from one another and, with a few well-defined assumptions, paved the way for a whole theory of hadronic physics at low energies.
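
The structure Weinberg was exploiting can be sketched in standard textbook form (an illustration for orientation, not taken from his papers): split a nucleon field into left- and right-handed pieces and note that only the mass term connects them.

\[
\psi_{L,R} = \tfrac{1}{2}\left(1 \mp \gamma_5\right)\psi, \qquad
\mathcal{L} = \bar\psi_L\, i\gamma^\mu \partial_\mu \psi_L
            + \bar\psi_R\, i\gamma^\mu \partial_\mu \psi_R
            - m\left(\bar\psi_L \psi_R + \bar\psi_R \psi_L\right).
\]

The kinetic terms allow independent rotations of the left- and right-handed components (the chiral SU(2)×SU(2) acting on the proton–neutron doublet), while the mass term couples the two, which is why a non-zero nucleon mass breaks the chiral symmetry.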

Meanwhile, in 1964, Brout and Englert, Higgs, Kibble, Guralnik and Hagen had demonstrated that the vector bosons of a Yang–Mills theory (one that is like QED but where attributes such as electric charge can be exchanged by the vector bosons themselves) put forward a decade earlier could become massive without spoiling the fundamental gauge symmetry. This “mass-generating mechanism” suggested that a complete Yang–Mills theory of the strong interaction might be possible. In addition to the well-known pion, examples of massive vector particles that feel the strong force had already been found, notably the rho-meson. Like the pion, this too occurs in three charged varieties: positive, negative and zero. Superficially these rho-mesons had the hallmarks of being the gauge bosons of the strong interactions, but they also have mass. Was the strong interaction the theatre for applying the mass-generating mechanism?

Despite at first seeming so promising, the idea failed to fit the data. Empirically, the SU(2)×SU(2) symmetry is broken for some phenomena, yet for others, where spin did not matter, it works perfectly. When these patterns were incorporated into the maths, the rho-meson stubbornly remained massless, contrary to reality.

Epiphany on the road

In the middle of September 1967, while driving his red Camaro to work at MIT, Weinberg realised that he had been applying the right ideas to the wrong problem. Instead of the strong interactions, for which the SU(2)×SU(2) idea refused to work, the massless photon and the hypothetical massive W boson of the electromagnetic and weak interactions fitted perfectly with this picture. To call this possibility “hypothetical” hardly does justice to the time: the W boson was not discovered until 1983, and in 1967 it was so disregarded as to receive at best a passing mention, if any, in textbooks.

Weinberg needed a concrete model to illustrate his general idea. The numerous strongly interacting hadrons that had been discovered in the 1950s and 1960s were, for him, a quagmire, so he restricted his attention to the electron and neutrino. Here too it is worth recalling the state of knowledge at the time. The constituent quark model with three flavours – up, down and strange – had been formulated in 1964, but was widely disregarded. The experiments at SLAC that would help establish these constituents were a year away from announcing their results, and Bjorken’s ideas of a quark model, articulated at conferences that summer, were not yet widely accepted either. Finally, with only three flavours of quark, Weinberg’s ideas would lead to empirically unwanted “strangeness-changing neutral currents”. All these problems would eventually be solved, but in 1967 Weinberg made a wise choice to focus on leptons and leave quarks well alone.

Following the discovery of parity violation in the 1950s, it was clear that the electron can spin like a left- or right-handed screw, whereas the massless neutrino is only left-handed. The left–right symmetry, which had been a feature of the strong interaction, was gone. Instead of two SU(2) groups, the mathematics now needed only one, the second being replaced by the unitary group U(1). So Weinberg set up the equations of SU(2)×U(1) – the same structure that, unknown to him, had been proposed by Sheldon Glashow in 1961 and by Abdus Salam and John Ward in 1964 in attempts to marry the electromagnetic and weak interactions. His theory, like theirs, required two massive electrically charged bosons – the W+ and W– carriers of the weak force – and two neutral bosons: the massless photon and a massive Z0. If correct, it would show that the electromagnetic and weak forces are unified, taking physics a step closer to the goal of a single theory of all fundamental interactions.
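
In modern textbook notation (which postdates the 1967 paper and is shown here only for orientation), the four gauge fields of SU(2)×U(1) sort themselves into exactly this set of carriers: two charged W bosons, a massive neutral Z and the massless photon, the neutral pair being mixed through the weak mixing angle θW.

\[
W^\pm_\mu = \frac{W^1_\mu \mp i\,W^2_\mu}{\sqrt{2}}, \qquad
\begin{pmatrix} Z_\mu \\ A_\mu \end{pmatrix} =
\begin{pmatrix} \cos\theta_W & -\sin\theta_W \\ \sin\theta_W & \cos\theta_W \end{pmatrix}
\begin{pmatrix} W^3_\mu \\ B_\mu \end{pmatrix}, \qquad
\tan\theta_W = g'/g .
\]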

“The history of attempts to unify weak and electromagnetic interactions is very long, and will not be reviewed here.” So began the first footnote in Steven Weinberg’s seminal November 1967 paper, which led to him being awarded the 1979 Nobel Prize in Physics with Salam and Glashow. Weinberg’s footnote mentioned Fermi’s primitive idea for unification in 1934, and also the model that Glashow proposed in 1961.

Clarity of thought

Weinberg started his paper by articulating the challenge of unifying the electroweak forces as both an opportunity and a threat. He focused on the leptons – those fermions, such as the electron and neutrino, which do not feel the strong force. “Leptons interact only with photons, and with the [weak] bosons that presumably mediate weak interactions. What could be more natural than to unite these spin-one bosons [the photon and the weak bosons] into a multiplet,” he pondered. That was the opportunity. The threat was that “standing in the way of this synthesis are the obvious differences in the masses of the photon and [weak] boson.”

Weinberg then suggests a solution: perhaps “the symmetries relating the weak and electromagnetic interactions are exact [at a fundamental level] but are [hidden in practice]”. He then draws attention to the ideas of Higgs, Brout, Englert, Guralnik, Hagen and Kibble, and uses these to give masses to the W and Z in his model. In a further important insight, Weinberg shows how this symmetry-breaking mechanism leaves the photon massless.

His opening paragraph ended with the prescient observation that: “The model may be renormalisable.” The argument upon which this remark is based appears at the very end of the paper, although with somewhat less confidence than the promise hinted at in the opening. He begins the final paragraph with a question: “Is this model renormalisable?” The extent of his intuition is revealed in his argument: although the presence of a massive vector boson had hitherto been a scourge, the theory with which he had begun had no such mass and, as such, was “probably renormalisable”. So, he pondered: “The question is whether this renormalisability is lost [by the spontaneous breaking of the symmetry].” And the conclusion: “If this model is renormalisable, what happens when we extend it…to the hadrons?”

By speculating that his model may be renormalisable, Weinberg was hugely prescient, as ’t Hooft and Veltman would prove four years later. And perhaps it was a chance encounter at the Solvay Congress in Belgium two weeks before his paper was submitted that helped convince Weinberg that he was on the right track.

Solvay secrets

By the end of September 1967, Weinberg had his ideas in place as he set off to Belgium to attend the 14th Solvay Congress on Fundamental Problems in Elementary Particle Physics, held in Brussels from 2 to 7 October. He did not speak about his forthcoming paper, but did make some remarks after other talks, in particular following a presentation by Hans Peter Durr about a theorem of Jeffrey Goldstone and spontaneous symmetry breaking. During a general discussion session following Durr’s talk, Weinberg mused: “This raises a question I can’t answer: are such models renormalisable?” He continued with a similar argument to that which later appeared in his paper, ending with: “I hope someone will be able to find out whether or not [this] is a renormalisable theory of weak and electromagnetic interactions.”

There was remarkably little reaction to Weinberg’s remarks, and he himself has recalled “a general lack of interest”. The only recorded statement came from François Englert, who insisted that the theory is renormalisable; then, remarkably, there is no further discussion. Englert and Robert Brout, then relatively junior scientists, had both attended the same Brussels meeting.

At some point during the Solvay conference, Weinberg presented a hand-written draft of his paper to Durr, and 40 years later I obtained a copy by a roundabout route. Weinberg himself had not seen it in all that time, and thought that all record of his Nobel-winning manuscript had been lost. The original manuscript is notable for there being no sign of second thoughts, or editing, which suggests that it was a provisional final draft of an idea that had been worked through in the preceding days. The only hint of modification after the first draft had been written is a memo squeezed in at the end of a reference to Higgs, to include references to Brout and Englert, and to Guralnik, Hagen and Kibble, for the idea of spontaneous symmetry breaking, on which the paper was based. Weinberg’s intuition about the renormalisability of the model is already present in this manuscript, and is identical to what appears in his PRL paper. There is no mention of Glashow’s SU(2)×U(1) model in the draft, but this is included in the version that was published in PRL the following month. This is the only substantial difference. This manuscript was submitted to the editors of PRL on Weinberg’s return to the US, and received by them on 17 October. It appeared in print on 20 November.

Lasting impact

Weinberg’s genius was to assemble the various pieces of a jigsaw and display the whole picture. The basic idea of mass generation was due to the assorted theorists mentioned above, in the summer of 1964. However, a crucial feature of Weinberg’s model was the trick of being able to give masses to the W and Z while leaving the photon massless. This extension of the mass-generating mechanism was due to Tom Kibble, in 1967, and Weinberg recognises and credits it.

As was the case with his comments in Brussels the previous month, Weinberg’s paper appeared in November 1967 to a deafening silence. “Rarely has so great an accomplishment been so widely ignored,” wrote Sidney Coleman in Science in 1979. Today, Weinberg’s paper has been cited more than 10,000 times. Having been cited but twice in the four years from 1967 to 1971, suddenly it became so important that researchers have cited it three times every week throughout half a century. There is no parallel for this in the history of particle physics. The reason is that in 1971 an event took place that has defined the direction of the field ever since: Gerard ’t Hooft made his debut, and he and Martinus Veltman demonstrated the renormalisability of spontaneously broken Yang–Mills theories. A decade later the W and Z bosons were discovered by experiments at CERN’s Super Proton Synchrotron. A further 30 years were to pass before the discovery of the Higgs boson at the Large Hadron Collider completed the electroweak menu. And in the meantime, completing the Standard Model, quantum chromodynamics was established as the theory of the strong interactions, based on the group SU(3).

This episode in particle physics is not only one of the seminal breakthroughs in our understanding of the physical world, but touches on the profound link between mathematics and nature. On one hand it shows how it is easier to be Beethoven or Shakespeare than to be Steven Weinberg: change a few notes in a symphony or a phrase in a play, and you can still have a wonderful work of art; change a few symbols in Weinberg’s equations and the edifice falls apart – for if nature does not read your creation, however beautiful it might be, its use for science is diminished. Like all great theorists, Weinberg revealed a new aspect of reality by writing symbols on a sheet of paper and manipulating them according to the logic of mathematics. It took decades of technological progress to enable the discoveries of W and Higgs bosons and other entities that were already “known” to mathematics 50 years ago.

• This article draws on material from Frank Close’s history of the path to discovery of the Higgs boson: The Infinity Puzzle (Oxford University Press).

Symmetries, groups and massive insight led to electroweak unification

Weinberg’s 1967 achievement is rooted in group theory, the mathematical language describing the symmetries of a system, and built upon many earlier successes including that of quantum electrodynamics (QED). QED is perhaps the simplest example of a general class of “gauge theories”. Since the all-important electric charge in QED is a single number, it can be described mathematically in terms of the simplest unitary group, U(1). In the 1950s Yang and Mills constructed non-abelian generalisations of QED in which the U(1) number was replaced by matrices, such as those of the groups SU(2) or SU(3). The weak force exhibited tantalising hints that an SU(2) generalisation of QED might be involved, but there was a serious problem: a “W boson” – the analogue of QED’s photon – would have to be very massive empirically, whereas the mathematical symmetry of the theory required it to be massless, like the photon. The only way to give the W and Z particles mass yet leave the photon massless was if nature contained “hidden symmetries” that were somehow broken.

In 1961 Goldstone discovered a theorem suggesting, inter alia, that a theory of the weak force involving hidden symmetry is impossible. However, in 1963, condensed-matter theorist Philip Anderson pointed out that superconductivity manages to evade Goldstone’s theorem, and demonstrated this mathematically in a theory without relativity. The following year several theorists, including Peter Higgs, generalised Anderson’s insights to include relativity. Among the implications were that a theory involving fermions with no mass – with so-called chiral symmetry – could hide this property in empirically consistent ways when particles become massive; that there should be a massive boson without spin (the Higgs boson); and that the W boson could also gain mass while preserving the underlying mathematical symmetry of the theory. It was Weinberg’s 1967 paper that brought all of these pieces together, and today we know that nature follows this path, with the weak and electromagnetic interactions described by a single SU(2)×U(1) structure. At the time, however, the breakthrough was hardly noticed.
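
As a rough illustration of that last point (these are the standard electroweak relations, not a quotation from any of the 1964–1967 papers), a Higgs doublet acquiring a vacuum expectation value v gives mass to three of the four gauge bosons while leaving one combination, the photon, untouched:

\[
\langle \phi \rangle = \frac{1}{\sqrt{2}}\begin{pmatrix} 0 \\ v \end{pmatrix}, \qquad
m_W = \tfrac{1}{2}\,g\,v, \qquad
m_Z = \tfrac{1}{2}\sqrt{g^2 + g'^2}\;v, \qquad
m_\gamma = 0,
\]

so that m_W = m_Z cos θW, with v ≈ 246 GeV fixed by the measured strength of the weak interaction.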

Model physicist

Steven Weinberg was 34 when he produced his iconic “Model of Leptons”. The paper marked a moment of clarity in the history of particle physics and gave rise to the electroweak Standard Model, but it was also exceptional in inspiring one of the biggest experimental programmes science has ever seen. Flushing out and measuring its predicted W, Z and Higgs bosons took a multi-billion Swiss-franc effort in Europe that spanned four major projects – Gargamelle, the SPS, LEP and the LHC – and defined CERN’s research programme, keeping experimentalists in gainful employment for at least four decades. Not bad for a theory that, as Weinberg wrote at the time, “has too many arbitrary features for [its] predictions to be taken very seriously”.

Needless to say, Weinberg is delighted to have been able to witness the validation of the Standard Model (SM) over the decades. “I mean, it’s what keeps you going as a theoretical physicist to hope that one of your squiggles will turn out to describe reality,” he says. “I wouldn’t have been surprised or even very chagrined that, although the general idea was right, this particular model didn’t describe nature.”

Today, 50 years after his 1967 insight, Weinberg protests the notion that he is retired. The US has laws against discrimination on the basis of age, he says dryly. “I tell the people here that I plan to retire shortly after I die.” He is currently teaching a course in astrophysics at the University of Texas at Austin, his base for the past 35 years, and has two books and a new cosmology paper in the pipeline. Weinberg spoke to the Courier by phone in September from his home, reflecting on the state of high-energy physics following the Higgs boson discovery and on where the best hopes for new physics might lie. He began by recounting the thought processes that led him to his seminal 1967 work – many of which took place in children’s playgrounds.

Park-bench physics

“It was a complicated time of my life because my family had moved to Cambridge from Berkeley while my wife was studying at Harvard for her law degree. I had all the responsibility of taking care of our four-year-old daughter, including taking her to nursery school, and a lot of my thinking was done while sitting on park benches and watching my daughter play,” he says.

Weinberg did not set out to unify the forces. He had been applying his ideas about symmetries, specifically the structure SU(2)×SU(2), to the strong interaction but it implied that the rho meson would be massless, contrary to experiment. “When I had the idea that the massless rho meson might really be the photon, it became natural to me that the rest of this gauge theory, suitably modified, could only be a theory of weak forces.” The work went quickly once he realised what he was doing. “You take the left-handed electron plus neutrino doublet and right-handed electron and ask what the most general possible symmetry group is, which turns out to be SU(2)×U(1)×U(1). Then you throw one U(1) away because if that was an unbroken gauge symmetry, you would have long-range forces among electrons which you don’t observe. So you’re led almost inevitably towards SU(2)×U(1). Indeed, though at first I didn’t know it, the same group had been used in a different way earlier by Glashow and by Salam and Ward.”

The paper was published without fanfare in Physical Review Letters on 20 November within a month of its submission. Weinberg doesn’t recall any talk he gave before the publication. He mentioned it in a side comment at the Solvay conference around that time (see “Birth of a symmetry”), but it didn’t arouse tremendous excitement.

In many ways, the paper was uncharacteristic of Weinberg. His tendency was to write general papers without worrying too much about their specific realisation in nature, he says, but in this one he was more specific. “Of course, experimentalists don’t test general ideas, so I was delighted when they showed that my theory was the right one. And then, after the neutral-currents discovery, the W and Z were discovered directly at CERN 10 years later and measured in detail at LEP and SLAC.”

It was not the specific model of leptons that excited him, though. Since his graduate days at Princeton, Weinberg has been hooked on the possibility of having a deductive basis for a physical theory following from the principles of symmetry and, in particular, renormalisability. “Symmetry is not enough by itself. In electromagnetism, for example, if you write down all the symmetries we know, such as Lorentz invariance and gauge invariance, you don’t get a unique theory that predicts the magnetic moment of the electron. The only way to do that is to add the principle of renormalisability – which dictates a high degree of simplicity in the theory and excludes these additional terms that would have changed the magnetic moment of the electron from the value Schwinger calculated in 1948.”

Renormalisability was the technique that connected quantum field theory to reality, offering a scientifically sound way to deal with the infinities that arise in calculations. Back when his paper was published, however, Weinberg did not know if his theory was renormalisable. That was probably why nobody took much notice of it, he says. “Remember: when we’re talking about renormalisability, it’s not just something theorists do to get rid of infinities. It was a criterion that was the sort you look for in theoretical physics, that defines a certain type of simplicity in your theories which otherwise would be arbitrary. We’re talking about whether we can have a theory of the weak interaction in which we can calculate beyond the lowest order in perturbation theory.” Weinberg strongly suspected his might be such a theory because before spontaneous symmetry breaking is taken into account, it has the same form as QED and it had already been proved that non-spontaneously broken Yang–Mills theories were renormalisable. “Salam and I weren’t sure but we didn’t think spontaneous symmetry breaking would affect the renormalisability because if you go to very high energies (much larger than the W or Z mass), the fact the symmetry is broken is no longer significant.” Had he not thought the model renormalisable, he might not have published. “The prospect of issuing an erratum was too much!”

In 1971, Weinberg began to realise that his paper was “hot stuff”, following the critical breakthrough by ’t Hooft and Veltman proving that the theory was renormalisable, although whether or not his particular model of leptons was correct was still a matter for experiment. The same year, he also tried to extend his ideas to the strong interactions using the quark model, in which there was little confidence at the time. The reality of quarks became clear with the 1973 discovery by Gross, Wilczek and Politzer of the asymptotic freedom of some gauge theories, and the subsequent development of quantum chromodynamics, to which Weinberg, along with Gross and Wilczek, contributed the idea that it is impossible to isolate coloured particles such as quarks and gluons.

“It’s funny you know, people look back at the 1970s as one of the most miserable peacetime decades in the 20th century, as there was lots of unemployment and high inflation, in the US at least,” he muses. “But for us physicists it was a great time: everything was coming together, and experimentalists and theorists were talking to each other in lots of ways. Things are much harder today.”

The Higgs nightmare

The discovery of the Higgs boson by the ATLAS and CMS experiments at CERN five years ago was the capstone of Weinberg’s model. Until then, no one knew for sure how the electroweak symmetry gets broken to give elementary particles their masses – it was still possible that the Higgs mechanism was correct but did not involve a Higgs boson, for instance, or that the “Higgs” was a composite particle bound by new strong forces that lead to a dynamical breakdown of the symmetry. Precisely such a model, called technicolour, was proposed by Weinberg and Susskind in 1979. Back in 1967, though, Weinberg took the simplest possibility: a doublet of scalars. It was the only kind of elementary field that could give mass not only to the W and the Z but also to the electron, he reasoned, and it would lead to the necessary existence of a leftover scalar particle that was not eliminated by the Higgs mechanism and became known as the Higgs boson. “The discovery of the Higgs boson was very important because it confirmed the very simple early picture of spontaneous symmetry breaking, which we couldn’t have known was correct because there were alternatives,” he says.

So did Weinberg’s 1967 paper also predict the Higgs boson? “It depends, as Bill Clinton might say, what is meant by the word ‘the’,” he laughs. “Is it the Higgs boson? Well, the existence of these particles in the general class of spontaneously broken gauge theories was predicted by Higgs and so on, and if you included theories that are non-gauge invariant then even earlier by Goldstone. But if by ‘the’ you mean the particle discovered, then that was predicted in my paper. The first paper that made a specific prediction of a single neutral particle whose coupling to leptons and later also to quarks was proportional to their masses was the 1967 model of leptons. The others also had a scalar particle but they were not developing a theory of weak interactions, they were considering several classes of theories with leftover scalars with unknown properties.”

Weinberg thinks the Nobel-prize committee did an “excellent job” in deciding who would share in the 2013 prize for the discovery (François Englert and Peter Higgs). “It isn’t a prize for predicting the Higgs boson, it is a prize for the theoretical discovery of the Higgs mechanism, which is exactly right because that is what was proposed by those 1964 papers. I rediscovered it in 1967 because I was working on spontaneously broken SU(2)×SU(2) gauge theory for the strong interaction, but I take no credit for it because it was already in the literature for three years. The actual Higgs boson, I think that was an experimental achievement.”

Unlike many particle physicists on the day of the Higgs announcement on 4 July 2012, Weinberg doesn’t recall exactly what he was doing when he heard the news. What he is sure of is that we are entering what he described several years ago as the “nightmare scenario” of having found a SM Higgs boson and nothing else. He says we’ve gotten ourselves into a rather unfortunate situation because the SM describes all the physics that can be addressed experimentally except things outside the SM like gravity and the neutrino masses. “It’s nobody’s fault. It is not an intellectual failure. It’s just a fix we’ve got into.” He doesn’t hold out too much hope in mainstream theoretical arguments for the existence of physics beyond the SM at the energies currently being probed at the LHC – i.e. that new heavy particles must exist to cancel out quantum contributions to the Higgs mass that would cause it to spiral to infinity. The fact that we now know that an elementary Higgs scalar exists makes this “hierarchy problem” somewhat harder, Weinberg concedes, but he points out that we’ve been living with the problem already for 40 years. So far the LHC has not found evidence for physics beyond the SM, including the most popular solution to shield the Higgs from getting additional mass: supersymmetry (SUSY). “Worse, there isn’t any one completely satisfactory SUSY model. Every SUSY model has things in it that are troublesome,” says Weinberg.

He thinks we might have to find other explanations for this and other absurdly fine-tuned parameters in the universe, such as the very small value of the vacuum energy or cosmological constant, or even abandon traditional explanations altogether.

“No one has come up with a plausible suggestion there except for the somewhat desperate suggestion that it is anthropic – that you have a multiverse and by accident there are occasional sub-universes where the vacuum energy is small and it’s only those in which galaxies can form – and people have suggested similar anthropic arguments for the smallness of the Higgs mass and the quark-mass hierarchy,” says Weinberg, who himself used anthropic reasoning in the 1980s to estimate, correctly, the approximate value of the cosmological constant a decade before it was inferred observationally from the velocities of distant supernovae. It’s a depressing kind of solution to the problem, he accepts. “But as I’ve said: there are many conditions that we impose on the laws of nature such as logical consistency, but we don’t have the right to impose the condition that the laws should be such that they make us happy!”

Weinberg’s outlook on the field today is pretty much as it was in 1979 when he gave his Nobel-prize lecture. The only big difference would be string theory, he says, which hadn’t yet come along as a possible theory of everything. “Apart from that, I said about the future beyond the SM that I think it’s unfortunate that there isn’t a clear idea to break into it.” Even the discovery of neutrino masses, inferred from the observation of neutrino oscillations 20 years ago, does not threaten the SM, he says. On the contrary: neutrino masses are what you expect.

Neutrinos and new physics

By the time Weinberg received his Nobel prize in late 1979, he had arrived at a more nuanced interpretation of field theory and described it in a paper titled “Phenomenological Lagrangians”. Building on the work of others, such as Schwinger, it presented the SM as the leading term in an “effective” field theory that is merely a low-energy manifestation of a deeper microscopic theory that we are yet to uncover. In this more modern view, field theories don’t have to be renormalisable to be logically consistent but can contain, in addition to the renormalisable terms, a slew of non-renormalisable terms that are suppressed by negative powers of some very large mass (corresponding to the scale at which the true theory applies).
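
Schematically (a sketch of the effective-field-theory idea rather than Weinberg’s own notation), the Lagrangian is organised as an expansion in inverse powers of the heavy scale Λ at which the deeper theory takes over:

\[
\mathcal{L}_{\rm eff} = \mathcal{L}_{\rm SM}^{(d\le 4)}
+ \sum_i \frac{c^{(5)}_i}{\Lambda}\,\mathcal{O}^{(5)}_i
+ \sum_j \frac{c^{(6)}_j}{\Lambda^2}\,\mathcal{O}^{(6)}_j + \cdots,
\]

where the O^(d) are operators of mass dimension d built from Standard Model fields; the higher the dimension, the stronger the suppression at energies far below Λ.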

For neutrinos, treating the SM as an effective field theory has major implications. Whereas simply inserting neutrino masses into the theory would violate the SU(2)×U(1) symmetry, Weinberg realised that there is an interaction between the leptons and the Higgs doublet that avoids this. Crucially, since the interaction is non-renormalisable, it is suppressed by a very large mass in the denominator – explaining both the existence of neutrino masses and their smallness, and giving rise to what is more generally called the seesaw mechanism.
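
The leading such term for leptons can be written schematically (SU(2) and spinor indices suppressed; an illustration of the argument, not a transcription of Weinberg’s operator) as a dimension-five coupling of two lepton doublets L to two Higgs doublets φ, which turns into a tiny neutrino mass once φ acquires its vacuum expectation value v:

\[
\mathcal{L}_5 \sim \frac{c}{\Lambda}\,(L\,\phi)(L\,\phi) + \text{h.c.}
\quad\Longrightarrow\quad
m_\nu \sim \frac{c\,v^2}{\Lambda}.
\]

For v ≈ 246 GeV and Λ of order 10¹⁴–10¹⁵ GeV this gives m_ν of order 0.1 eV, up to order-one coefficients, broadly consistent with the scale inferred from oscillation experiments.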

“In a sense it is beyond the SM, but I would rather say it is beyond the leading terms – the renormalisable, unsuppressed part of the SM,” says Weinberg. “But hell – so is gravity! The symmetries of general relativity don’t allow any renormalisable interactions of massless spin-2 particles called gravitons. We know about gravity, even though it’s incredibly strongly suppressed, only because it has this property of adding up: every atom in the Earth attracts a falling body, always pulling in the same direction. If it wasn’t for that fact, we wouldn’t know about its existence from experiments – certainly not at experiments at the LHC.” Of course, neutrinos were still thought massless back in 1979. Weinberg does not take credit for predicting neutrino masses, but he thinks it’s the right interpretation. What’s more, he says, the non-renormalisable interaction that produces the neutrino masses is probably also accompanied by non-renormalisable interactions that produce proton decay and other things that haven’t been observed, such as violation of baryon-number conservation. “We don’t know anything about the details of those terms, but I’ll swear they are there.”

As to what is the true high-energy theory of elementary particles, Weinberg says string theory is still the best hope we have. “I am glad people are working on string theory and trying to explore it, although I notice that the smart guys such as Witten seem to have turned their attention to solid-state physics lately. Maybe that’s a sign that they are giving up, but I hope not.” Weinberg worked on strings himself in the late 1980s, writing a couple of papers “of stunning unimportance”, but decided not to devote his career to it. As documented in his 1992 book Dreams of a Final Theory, and much earlier in his Nobel lecture, he has his own hunch about an ultimate microscopic theory of nature, rooted in an idea called “asymptotic safety”. Weinberg also still holds out hope that one day a paper posted on the arXiv preprint server by some previously unknown graduate student will turn the SM on its head – a 21st-century model of particles “that incorporates dark matter and dark energy and has all the hallmarks of being a correct theory, using ideas no one had thought of before”.

Until that day comes, particle physicists have to be content with scouring the TeV energy scale at the LHC for new particles and with subjecting the SM to increasingly precise tests – not just at the LHC but at numerous other experiments at CERN and beyond. The field also faces a critical decision in the next few years as to what the next big-ticket collider should be: an electron–positron collider, which could be either linear or circular, or a more energetic hadron collider. Most of the options on the table have precision measurements of the Higgs boson as part of their physics cases.

Next steps for the field

Weinberg says he doesn’t have an educated opinion on which machine should come next. “It hinges partly on what the experimentalists can actually accomplish, and I’m not equipped to judge that. And it hinges also on what the new physics is, and I’m not equipped to judge that!” he says. “If I had a very specific proposal for beyond-the-SM then it might indicate in which of these directions we should go, but I don’t know of any proposal that is attractive enough to go one way or the other.” Although he would like to see the Higgs, the first scalar particle discovered, measured in more detail, he fears that it will just confirm the simplest picture of the Higgs mechanism. “Because that’s what you get if you insist on a renormalisable theory, and I think it’s correct to do so for reasons I was beginning to understand in 1979.”

He is glad that CERN is continuing with the LHC, and that the US is doing neutrino-oscillation experiments, “which seems to now be the American style, having given up on the SSC”. Another topic he thinks should be pushed harder is the search for baryon non-conservation (proton decay) on Earth. But the most promising arena for progress these days is astronomy, he says. After all, without it we wouldn’t even dream of the existence of dark matter and dark energy, and it’s an area where experiment is still very fruitful – as evidenced by the recent discovery of gravitational waves. “My goodness, that’s the most exciting thing – studying gravitational radiation not just for its own sake but opening up a whole area of astronomy.” Cosmology, along with the foundational issues of quantum mechanics, is the subject of Weinberg’s own recent work, and he is currently putting together a paper with co-worker Raphael Flauger on the effect of the intergalactic medium on gravitational waves from distant sources.

Reflecting on physics

It is beyond doubt that Weinberg’s 1967 paper was game-changing, but does he himself rate it as his most important contribution to physics? “Oh I don’t know. I don’t like praising my own papers. The 1967 paper was part of a programme of many decades – a concern with symmetries, especially broken ones – which went back to the early 1960s, at least to my paper with Goldstone and Salam raising the issue of massless Goldstone bosons, after which Higgs et al. showed us how to avoid them. I guess mine was a key paper in that department, but I had been working on broken symmetries in the context of strong interactions for a decade. Then there is the development of effective theories, which is not so much a theory as a point of view.” What he prizes above all else, however, is not embodied in any single paper, and certainly not in any single model: it is about changing the point of view of physicists.

“I prized the 1967 paper programmatically because it exemplified the need to look for a renormalisable theory based on symmetries that are spontaneously broken, and by god it turned out to be the right model!” From the start, he knew his model of leptons was the kind of theory that looked right, but it doesn’t just take a certain mind to be able to see such truths, he explains. It takes a lot of minds over a long period. Weinberg illustrates the process with the example of chicken sexing. It is important in the poultry business to be able to determine the sex of newborn chicks, he says, and there was a school that taught the science of chicken sexing by giving people a newborn chick and asking them to say whether it was male or female: if they guessed wrong, they would receive some sort of punishment, while if they guessed right they got some kind of reward. After repeating the process hundreds of times, people began to guess correctly. “We’ve learned that, just as by feeling the underside of a newborn chicken you might learn what distinguishes a male from a female, science is not the experience of just one scientist but of the whole community extending back to antiquity, which has gradually beaten into us which theory is beautiful and which is likely to be right.”

Asked what single mystery, if he could choose, he would like to see solved in his lifetime, Weinberg doesn’t have to think for long: he wants to be able to explain the observed pattern of quark and lepton masses. In the summer of 1972, when the SM was coming together, he set himself the task of figuring it out but couldn’t come up with anything. “It was the worst summer of my life! I mean, obviously there are broader questions such as: why is there something rather than nothing? But if you ask for a very specific question, that’s the one. And I’m no closer now to answering it than I was in the summer of 1972,” he says, still audibly irritated. He also doesn’t want to die without knowing what dark matter is. There are all kinds of frustrations, he says. “But how could it be otherwise? I am enjoying what I am doing and I have had a good run, and I have a few more years. We’re having a total eclipse here in April 2024 and I look forward to seeing that.”

Forty years after the publication of his famous book The First Three Minutes, which has been translated into 22 languages and for which he still receives royalty cheques, he intends to go on writing. He has a contract with Cambridge University Press to publish a new book called Lectures on Astrophysics based on his current teaching activities, and is bringing out a third collection of popular essays with Harvard University Press, with a fourth planned.

The last three minutes

Steven Weinberg’s career – from his undergraduate days at Cornell, graduate studies at Princeton, and subsequent positions at Columbia, Berkeley, Harvard, MIT and Texas – is one that any physicist would aspire to. His name will always be associated with our fundamental understanding of the universe, and you get the feeling that none of it ever felt much like “work” in the usual sense. “The physics career, quite apart from what you do in physics day to day, has given me the opportunity to know a lot of interesting people and to visit different countries not as a tourist but as a co-worker,” he says. “It’s such a delight to talk to fellow physicists and to work up a paper based on a common understanding, and at the same time to transcend national boundaries. I like that so much.” Not that he has collaborated that much: as with “A Model of Leptons”, most of his 350 or so papers have been “one-man jobs”.

Yet Weinberg is not your stereotypical lost-in-his-work genius who locks himself away for long periods to work on a problem. His best ideas don’t come to him while he’s working at all. He recalls one day he came out of the shower and exclaimed to his wife that he had figured out why the cosmological constant is so small (at a time before he had started thinking about anthropic explanations). “Then the next day I came out and I said [deep voice] ‘no’! So ideas come to you all the time and most of them are no good, and every once in a while you find one that is good and you have fun working at your desk. Getting good ideas isn’t something you get by trying hard, but by thinking a lot about what problems bother you. But that doesn’t always work either – just think of my ruined summer in 1972!”

He never works in his office. His research work has always been done at home, where he and his wife have separate offices down the hall from one another and interrupt one another frequently. “I’m not hard to interrupt. I have a television set on my desk which I keep on while I work, typically watching an old movie, because I find work in theoretical physics so far removed from normal affairs.” Doesn’t it distract him? “But I need the distraction to keep at my desk because the actual work is so, well…it’s so chillingly non-human. I need to feel that I am still part of the human race while I’m doing it.”

Facing up to the exabyte era

The high-luminosity Large Hadron Collider (HL-LHC) will dramatically increase the rate of particle collisions compared with today’s machine, boosting the potential for discoveries. In addition to extensive work on CERN’s accelerator complex and the LHC detectors, this second phase in the LHC’s life will generate unprecedented data challenges.

The increased rate of collisions makes the task of reconstructing events (piecing together the underlying collisions from millions of electrical signals read out by the LHC detectors) significantly more complex. At the same time, the LHC experiments are planning to employ more flexible trigger systems that can collect a greater number of events. These factors will drive a huge increase in computing needs for the start of the HL-LHC era in around 2026. Using current software, hardware and analysis techniques, the required computing capacity is roughly 50–100 times higher than today, with data storage alone expected to enter the exabyte (10^18 bytes) regime.

It is reasonable to expect that technology improvements over the next seven to 10 years will yield an improvement of around a factor of 10 in both processing and storage capabilities at no extra cost. While this will go some way towards addressing the HL-LHC’s requirements, it will still leave a significant deficit. With budgets unlikely to increase, it will not be possible to solve the problem simply by increasing the total computing resources available. It is therefore vital to explore new technologies and methodologies in conjunction with the world’s leading information and communication technology (ICT) companies.
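
To make the arithmetic of the previous paragraph explicit, a back-of-envelope sketch in Python (using only the round numbers quoted above, not any official projection) shows the gap that software and workflow improvements would still have to close:

# Back-of-envelope estimate of the HL-LHC computing shortfall,
# using the approximate factors quoted in the article above.
required_growth = (50, 100)   # capacity needed relative to today (rough range)
technology_gain = 10          # flat-budget gain expected from technology over ~7-10 years

for need in required_growth:
    shortfall = need / technology_gain
    print(f"Need x{need}, technology provides x{technology_gain}, "
          f"leaving a shortfall of roughly x{shortfall:.0f}")

In other words, even with the expected hardware gains, a further factor of five to ten must come from elsewhere.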

CERN openlab, which was established by the CERN IT department in 2001, is a public–private partnership that enables CERN to collaborate with ICT companies to meet the demands of particle-physics research. Since the start of this year, CERN openlab has carried out an in-depth consultation to identify the main ICT challenges faced by the LHC research community over the coming years. Based on our findings, we published a white paper in September on future ICT challenges in scientific research.

The paper identifies 16 ICT challenge areas that need to be tackled in collaboration with industry, and these have been grouped into four overarching R&D topics. The first focuses on data-centre technologies to ensure that: data-centre architectures are flexible and cost effective; cloud-computing resources can be used in a scalable, hybrid manner; new technologies for solving storage-capacity issues are thoroughly investigated; and long-term data-storage systems are reliable and economically viable. The second major R&D topic relates to the modernisation of code, so that the maximum performance can be achieved on the new hardware platforms available. The third R&D topic focuses on machine learning, in particular its potentially large role in monitoring the accelerator chain and optimising the use of ICT resources.

The fourth R&D topic in the white paper identifies ICT challenges that are common across research disciplines. With ever more research fields such as astrophysics and biomedicine adopting big-data methodologies, it is vital that we share tools and learn from one another – in particular to ensure that leading ICT companies are producing solutions that meet our common needs.

In summary, CERN openlab has identified ICT challenges that must be tackled over the coming years to ensure that physicists worldwide can get the most from CERN’s infrastructure and experiments. In addition, the white paper highlights the emergence of new technology paradigms, from pervasive ultra-fast networks of smart sensors in the “internet of things” to machine learning and “smart everything” approaches. These technologies could revolutionise the way big science is done, particularly in terms of data analysis and the control of complex systems, and they also have enormous potential to benefit wider society. CERN openlab, with its unique collaboration with several of the world’s leading ICT companies, is ideally positioned to help make this a reality.

• openlab.cern.

Health Physics: Radiation-Generation Devices, Characteristics, and Hazards

By Joseph John Bevelacqua
Wiley-VCH

When developing technologies that involve nuclear material or ionising radiation, a number of safety issues and potential risks have to be addressed. The author of this book, a certified health physicist and an expert in radiation protection, discusses emerging topics related to radiation-generating technologies and their associated hazards.

The book opens with a brief overview of modern radiation-protection challenges, before delving into specific areas. First, the author discusses the nuclear-fuel cycle, analysing its steps and related issues such as reactors, new technologies for uranium enrichment and waste disposal. In the following section, he deals with nuclear accidents and radiological emergencies – making specific reference to the well-known disasters of Three Mile Island, Chernobyl and Fukushima Daiichi – and with the risk of terrorist events involving sabotage or the use of improvised nuclear weapons and devices.

Today, nuclear material is also widely employed for medical imaging and therapy, so part of the book is devoted to these technologies and to the consequent increase in public radiation exposure. Finally, the last section focuses on regulatory issues, limitations and challenges.

Meant for upper-level undergraduate and graduate students of health-physics and engineering courses, the book would also be a useful reference for scientists and professionals working in radiation protection, fuel-cycle technology and nuclear medicine. More than 300 problems with solutions accompany the text and many appendices provide background information.

Neutrino Astronomy: Current Status, Future Prospects

By T Gaisser and A Karle (eds)
World Scientific

This review volume is motivated by the 2014 observation of a high-energy neutrino flux of extraterrestrial origin by the IceCube experiment at the South Pole. The energy of the events recorded ranges from 30 to 2000 TeV, with the latter marking the highest-energy neutrino interaction ever observed. The study of neutrinos originating from violent astrophysical sources enhances our knowledge not only of cosmological phenomena but also of neutrinos themselves.

This book gives an overview of the current status of research in the field and of existing and future neutrino observatories. The first group of chapters presents the physics of potential sources of high-energy neutrinos, including gamma-ray bursts, active galactic nuclei, star-forming galaxies and sources in the Milky Way. A chapter is then dedicated to the measurements performed by IceCube, whose results are discussed in terms of energy spectrum, flavour ratio and arrival-direction isotropy. Following this, the results of two underwater neutrino experiments, ANTARES and Baikal, are presented.

After a brief discussion of other research topics in which the study of high-energy astrophysical neutrinos can play an important role, such as the quest for dark matter, the book examines the next generation of cosmic neutrino detectors. In particular, it describes the future KM3NeT experiment, which will consist of a network of underwater telescopes in the Mediterranean Sea, and IceCube-Gen2, which will offer unprecedented sensitivity and higher angular resolution than IceCube.

Finally, a review of current and planned experiments aiming to detect radio emission from high-energy neutrino interactions concludes the volume.

An Introduction to Gauge Theories

By N Cabibbo, L Maiani and O Benhar
CRC Press

There is always great excitement in the academic community when a new book by renowned scientists is published. Written by leading experts in particle physics, this book by Luciano Maiani and Omar Benhar, with contributions from the late Nicola Cabibbo, does not disappoint in this regard. Former CERN Director-General Maiani co-proposed the GIM mechanism, which suppresses flavour-changing neutral currents at tree level and assumed the existence of a fourth quark, discovered in 1974 at SLAC and BNL, while Cabibbo proposed a solution to the puzzle of the weak decays of strange particles, which was later extended to give rise to the Cabibbo–Kobayashi–Maskawa mixing matrix. Omar Benhar, an INFN research director and professor at the University of Rome “La Sapienza”, is an expert in the theory of many-particle systems, the structure of compact stars and the electroweak interactions of nuclei.

Their book is the third volume of a series dedicated to relativistic quantum mechanics, gauge theories and electroweak interactions, based on material taught to graduate students at the University of Rome over a period of several decades. Given that gauge theories are the basis of the interactions between elementary particles, it is not surprising that there are many books about them already – among the best are those written by Paul Frampton, I J R Aitchison and Anthony Hey, Chris Quigg, and Ta-Pei Cheng and Ling-Fong Li. One might therefore think it hard to add something new to the field, but this book introduces the reader, in a concise and elegant manner, to a modern account of the fundamentals of renormalisation in quantum field theories and to the concepts underlying gauge theories.

Containing more than 300 pages organised in 20 chapters and several appendices, the book focuses mainly on quantum electrodynamics (QED), which – despite its simplicity and limitations – serves as the template for a gauge theory while also having high predictive power and numerous applications. The first part of the treatise deals with the quantisation of QED via the path-integral method, from basic to advanced concepts, followed by a brief discussion of the renormalisation of QED and some of its applications, such as bremsstrahlung, the Lamb shift and the anomalous magnetic moment of the electron. The prediction of the latter is considered one of the great achievements of QED.

In the second part of the book, the authors cover the renormalisation group equations of QED and introduce the quantisation of non-Abelian gauge theories, finishing with a proof of the asymptotic freedom of quantum chromodynamics. Afterwards, the concept of the running coupling constant is used to introduce a few ideas about grand unification. The final chapters are devoted to concepts related to the Standard Model of particle physics, such as the Higgs mechanism and the electroweak corrections to the muon anomalous magnetic moment. Finally, a few useful formulas and calculations are provided in several appendices.

Throughout the book the authors not only present the mathematical framework and cover basic and advanced concepts of the field, but also introduce several physical applications, and the most recent discoveries in particle physics are discussed. This is a book targeted at advanced students who are accustomed to mental challenges. A minor flaw is the lack of problems at the end of the chapters, which would give students the chance to apply what they have learned, although the authors do encourage readers to complete a few derivations themselves. This text will be very helpful for students and teachers interested in a concise and modern treatment of the fundamentals of gauge theories in the constantly changing world of particle physics.

Centennial of General Relativity: A Celebration

By César Augusto Zen Vasconcellos (ed.)
World Scientific

In 1915 Albert Einstein presented to the Royal Prussian Academy of Sciences his theory of general relativity (GR), which represented a breakthrough in modern physics and became the foundation of our understanding of the universe at large. A century later, this elegant theory is still the basis of the current description of gravitation and a number of predictions derived from it have been confirmed in observations and experiments – most recently with the direct detection of gravitational waves.

This book celebrates the centenary of GR with a collection of 11 essays by different experts, which offer an overview of the theory and its numerous astrophysical and cosmological implications. After an introduction to GR, the Tolman–Oppenheimer–Volkoff equations describing the structure of relativistic compact stars are derived and their extension to deformed compact stellar objects presented. The book then moves to the so-called pc-GR theory, in which GR is algebraically extended to pseudo-complex co-ordinates in an attempt to get around singularities. Other topics covered are strange matter, in particular a conjecture that pulsar-like compact stars may be made of a condensed three-flavour quark state, and the use of a particular solution of the GR equations to construct multiple non-spherical cosmic structures.

The book also keeps things contemporary by giving an overview of the most recent experimental results in particle physics and cosmology. Several contributions are devoted to the search for physics beyond the Standard Model at CERN, to studies of cosmic objects and phenomena through gamma-ray lenses and, finally, to the recent detection of gravitational waves by the LIGO experiment.
