
LHC reinterpreters think long-term

A Map of the Invisible

The ATLAS, CMS and LHCb collaborations perform precise measurements of Standard Model (SM) processes and direct searches for physics beyond the Standard Model (BSM) in a vast variety of channels. Yet the multitude of BSM scenarios tested this way still constitutes only a small subset of the possible theories and parameter combinations to which the experiments are sensitive. The (re)interpretation of LHC results in order to fully understand their implications for new physics has therefore become a very active field, with close theory–experiment interaction and with new computational tools and related infrastructure being developed.

From 15 to 19 February, almost 300 theorists and experimental physicists gathered for a week-long online workshop to discuss the latest developments. The topics covered ranged from advances in public software packages for reinterpretation to the provision of detailed analysis information by the experiments, from phenomenological studies to global fits, and from long-term preservation to public data.

Open likelihoods

One of the leading questions throughout the workshop was that of public likelihoods. The statistical model of an experimental analysis provides its complete mathematical description; it is essential information for determining the compatibility of the observations with theoretical predictions. In his keynote talk “Open science needs open likelihoods”, Harrison Prosper (Florida State University) explained why it is in our scientific interest to make the publication of full likelihoods routine and straightforward. The ATLAS collaboration has recently made an important step in this direction by releasing full likelihoods in a JSON format, which provides background estimates, changes under systematic variations, and observed data counts at the same fidelity as used in the experiment, as presented by Eric Schanet (LMU Munich). Matthew Feickert (University of Illinois) and colleagues gave a detailed tutorial on how to use these likelihoods with the pyhf Python package. Two public reinterpretation tools, MadAnalysis 5, presented by Jack Araz (IPPP Durham), and SModelS, presented by Andre Lessa (UFABC Santo Andre), can already make use of pyhf and JSON likelihoods, and others are to follow. An alternative approach to the plain-text JSON serialisation is to encode the experimental likelihood functions in deep neural networks, as discussed by Andrea Coccaro (INFN Genova), who presented the DNNLikelihood framework. Several more contributions from CMS, LHCb and from theorists addressed the question of how to present and use likelihood information, and this will certainly remain an active topic at future workshops.
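As a concrete illustration of the workflow described above, the following minimal sketch shows one way to evaluate such a published JSON likelihood with pyhf; the file name “BkgOnly.json” is a placeholder for whichever workspace file an analysis actually releases on HEPData.

```python
# Minimal sketch: evaluating a published JSON likelihood with pyhf.
# "BkgOnly.json" is a hypothetical file name; substitute the workspace
# released by the analysis of interest on HEPData.
import json

import pyhf

with open("BkgOnly.json") as f:
    spec = json.load(f)

workspace = pyhf.Workspace(spec)  # the serialised statistical model
model = workspace.model()         # build the full probability model
data = workspace.data(model)      # observed counts plus auxiliary data

# Asymptotic CLs hypothesis test for a signal-strength hypothesis mu = 1.
cls_obs, cls_exp = pyhf.infer.hypotest(
    1.0, data, model, test_stat="qtilde", return_expected=True
)
print(f"Observed CLs: {cls_obs}, expected CLs: {cls_exp}")
```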

The question of making research data findable, accessible, interoperable and reusable is a burning one throughout modern science

A novelty for the Reinterpretation workshop was that the discussion was extended to experiences and best practices beyond the LHC, to see how experiments in other fields address the need for publicly released data and reusable results. This included presentations on dark-matter direct detection, the high-intensity frontier, and neutrino oscillation experiments. Supporting Prosper’s call for data reusability 40 years into the future – “for science 2061” – Eligio Lisi (INFN Bari) pointed out the challenges met in reinterpreting the 1998 Super-Kamiokande data, initially published in terms of the then-sufficient two-flavour neutrino-oscillation paradigm, within contemporary three-neutrino descriptions, and beyond. On the astrophysics side, the LIGO and Virgo collaborations actively pursue an open-science programme. Here, Agata Trovato (APC Paris) presented the Gravitational Wave Open Science Center, giving details on the available data, on their format and on the tools to access them. An open-data policy also exists at the LHC, spearheaded by the CMS collaboration, and Edgar Carrera Jarrin (USF Quito) shared experiences from the first CMS open-data workshop.

The question of making research data findable, accessible, interoperable and reusable (“FAIR” in short) is a burning one throughout modern science. In a keynote talk, the head of the GO FAIR Foundation, Barend Mons, explained the FAIR Guiding Principles together with the technical and social aspects of FAIR data management and data reuse, using the example of COVID-19 disease modelling. There is much to be learned here for our field. 

The wrap-up session revolved around the question of how to implement the recommendations of the Reinterpretation workshop in a more systematic way. An important aspect here is the proper recognition, within the collaborations as well as the community at large, of the additional work required to this end. More rigorous citation of HEPData entries by theorists may help in this regard. Moreover, a “Reinterpretation: Auxiliary Material Presentation” (RAMP) seminar series will be launched to give more visibility and explicit recognition to the efforts of preparing and providing extensive material for reinterpretation. The first RAMP meetings took place on 9 and 23 April.

CMS seeks support for Lebanese colleagues

Lebanese scientists at CERN

The CMS collaboration, in partnership with the Geneva-based Sharing Knowledge Foundation, has launched a fundraising initiative to support the Lebanese scientific community during an especially difficult period. Lebanon signed an international cooperation agreement with CERN in 2016, which triggered a strong development of the country’s contributions to CERN projects, particularly to the CMS experiment through the affiliation of four of its top universities. Yet the country is dealing with an unprecedented economic crisis, food shortages, an influx of Syrian refugees and the COVID-19 pandemic, all in the aftermath of the Beirut port explosion in August 2020.

“Even the most resilient higher-education institutions in Lebanon are struggling to survive,” says CMS collaborator Martin Gastal of CERN, who initiated the fundraising activity in March. “Despite these challenges, the Lebanese scientific community has reaffirmed its commitment to CERN and CMS, but it needs support.”

One project, High-Performance Computing for Lebanon (HPC4L), which was initiated to build Lebanon’s research capacity while contributing as a Tier-2 centre to the analysis of CMS data, is particularly at risk. HPC4L was due to benefit from servers donated by CERN to Lebanon, and from the transfer of CERN and CMS knowledge and expertise to train a dedicated support team that will run a high-performance computing facility there. But a lack of available funding has so far prevented the hardware from being shipped from CERN. CMS and the Sharing Knowledge Foundation are therefore fundraising to cover the shipping costs of the donated hardware, to purchase the additional hardware needed for its installation, and to support Lebanese experts while they are trained at CERN by the CMS offline computing team.

“At this pivotal moment, every effort to help Lebanon counts,” says Gastal. “CMS is reaching out for donations to support this initiative, to help both the Lebanese research community and the country itself.”

More information, including how to get involved, can be found at: cern.ch/fundraiser-lebanon. 

Calculating the curiosity windfall

Magnet R&D

Recent decades have seen a growing emphasis on the market and social value of fundamental science. Increasingly, researchers must demonstrate the benefits of their work beyond the generation of pure scientific knowledge and the cultural value of peaceful, open international collaboration.

This timely collection of short essays by leading scientific managers and policymakers, which emerged from a workshop held during Future Circular Collider (FCC) Week 2019, brings the interconnectedness of fundamental science and economics into focus. Its 18 contributions range from procurement to knowledge transfer, and from global-impact assessments to case studies from CERN, SKA, the ESS and ESA, with a foreword by former CERN Director-General Rolf Heuer. As such, it constitutes an important contribution to the literature and a guide for future projects such as a post-LHC collider.

As the number and size of research infrastructures (RIs) have grown over the years, writes CERN’s head of industry, procurement and knowledge transfer Thierry Lagrange, the will to push the frontier of knowledge has required significant additional public spending linked to the development and upgrade of high-tech instruments, and increased maintenance costs. The socioeconomic returns to society are clear, he says. But these benefits are not generated automatically: they require a thriving ecosystem that transfers knowledge and technologies to society, aided by entities such as CERN’s knowledge-transfer group and business incubation centres.

RIs need to be closely integrated into the European landscape, with plans put in place for international governance structures

Multi-billion public investments in RIs are justified given their crucial and multifaceted role in society, asserts EIROforum liaison officer at the European Commission Margarida Ribeiro. She argues that new RIs need to be closely integrated into the European landscape, with plans put in place for international governance structures, adequate long-term funding, closer engagement with industry, and methodologies for assessing RI impact. All contributors acknowledge the importance of this latter point. While physicists would no doubt prefer to go back to the pre-Cold War days of doing science for science’s sake, argues ESS director John Womersley, without the ability to articulate the socioeconomic justifications of fundamental science – as a driver of prosperity, jobs, innovation and startups, and as a source of solutions to challenges such as climate change and the environment – it is only going to become more difficult for projects to get funding.

A future collider is a case in point. Johannes Gutleber of CERN and the FCC study describes several recent studies seeking to quantify the socioeconomic value of the LHC and its proposed successor, the FCC, with training and industrial innovation emerging as the most important generators of impact. The rising interest in the type of RI benefits that emerge and how they can be maximised and redistributed to society, he writes, is giving rise to a new field of interdisciplinary research, bringing together economists, social scientists, historians and philosophers of science, and policymakers.

Nowhere is this better illustrated than in the ongoing programme led by economists at the University of Milan, described in two chapters by Massimo Florio and Andrea Bastianin. A recent social cost–benefit analysis of the HL-LHC, for example, conservatively estimates that every €1 of costs returns €1.2 to society, while a similar study concerning the FCC estimates the benefit/cost ratio to be even higher, at 1.8. Florio argues that CERN and big science more generally are ideal testing grounds for theoretical and empirical economic models, while demonstrating the positive net impact that large colliders have for society. His 2019 book Investing in Science: Social Cost-Benefit Analysis of Research Infrastructures (MIT Press) explores this point in depth (CERN Courier September 2018 p51), and is another must-read in this growing interdisciplinary area. Completing the series of essays on impact evaluation, Philip Amison of the UK’s Science and Technology Facilities Council reviews the findings of a report published last year capturing the benefits of CERN membership.

The final part of the volume focuses on the question “Who benefits from such large public investments in science?”, and addresses the contribution of big science to social justice and inequalities. Carsten Welsch of the University of Liverpool/Cockcroft Institute argues that fundamental science should not be considered as a distant activity, illustrating the point convincingly via the approximately 50,000 particle accelerators currently used in industry, medical treatments and research worldwide.

The grand ideas and open questions in particle physics and cosmology already inspire many young people to enter STEM subjects, while technological spin-offs such as medical treatments, big-data handling, and radio-frequency technology are also often communicated. Less well known are the significant but harder-to-quantify economic benefits of big science. This volume is therefore essential reading, not just for government ministers and policymakers, but for physicists and others working in curiosity-driven research who need to convey the immense benefits of their work beyond pure knowledge.

Roger J N Phillips 1931–2020

R Phillips

The eminent theoretical physicist Roger Julian Noel Phillips died peacefully on 4 September 2020, aged 89, at his home in Abingdon, UK. Roger was educated at Trinity College, Cambridge, where he received his PhD in 1955; his thesis advisor was Paul Dirac. He transferred from the Harwell theory group to the Rutherford Appleton Laboratory (RAL) in 1962, where he led the theoretical high-energy physics group to international prominence. He also held visiting appointments at CERN, Berkeley, Madison and Riverside.

Roger was a giant in particle physics phenomenology, and his book Collider Physics (Addison-Wesley, 1987), co-authored with his longstanding collaborator Vernon Barger, remains a classic. In 1990 Roger was awarded the Rutherford Medal and Prize of the UK Institute of Physics. To experimenters, he was one of the rocks upon whom the UK high-energy physics community was built. To theorists, he was renowned for his deep understanding of particle-physics models. A career-long collaboration across the Atlantic with Barger ensued from their sharing an office at CERN in 1967. Their initial focus was the Regge-pole model to describe high-energy scattering of hadrons. Subsequently they inferred the momentum distributions of the light quarks and gluons from deep-inelastic scattering data and made studies to identify the charm-quark signal in a Fermilab neutrino experiment.

To experimenters, he was one of the rocks upon whom the UK high-energy physics community was built

In 1980, Phillips and collaborators discovered the resonance that arises in neutrino oscillations when neutrinos propagate long distances through matter. This work is the basis of the ongoing Fermilab long-baseline neutrino programme that will make precision determinations of neutrino masses and mixing. From 1983, Phillips and his collaborators developed pioneering strategies in collider physics for finding the W boson, the top quark and the Higgs boson, and for searches for physics beyond the Standard Model. In an influential 1990 publication, Phillips, Hewett and Barger showed that the decay of a b-quark to an s-quark and a photon is a highly sensitive probe of a charged Higgs boson through its one-loop virtual contribution.
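In modern notation, the resonance condition they identified can be written as follows (a standard textbook form rather than a quotation from the 1980 paper): for neutrinos of energy E traversing matter with electron number density N_e, the oscillation amplitude is maximally enhanced when

```latex
\[
  \Delta m^2 \cos 2\theta \;=\; 2\sqrt{2}\, G_F N_e E ,
\]
```

where \Delta m^2 and \theta are the vacuum mass-squared splitting and mixing angle and G_F is the Fermi constant.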

After retiring in 1997, Roger maintained an active interest in particle physics. He struggled with Parkinson’s disease in recent years but continued to live with determination, wit and cheer. He joked that his Parkinson’s tremor made his mouse and keyboard run wild: “I know that an infinite number of random monkeys can eventually write Shakespeare, but I can’t wait that long!” One of his very last whispers to his son David was: “There are symmetries in mathematics which are like aspects of dreaming”. He did great things with his brain in life, and they will continue: he donated it to the Parkinson’s UK Brain Bank.

Roger was highly respected for his intellectual brilliance, physics leadership and immense integrity, but also for his modesty and generosity in going out of his way to help others. He was a delight to work with and an inspiration to all who knew him. He is missed by his many friends around the world.

The Science of Learning Physics

A greying giant of the field speaks to the blackboard for 45 minutes before turning, dismissively seizing paper and scissors, and cutting a straight slit. The sheet is twisted to represent the conical space–time described by the symbols on the board. A lecture theatre of students is transfixed in admiration.

This is not the teaching style advocated by José Mestre and Jennifer Docktor in their new book The Science of Learning Physics. And it’s no longer typical, say the authors, who suggest that approximately half of physics lecturers use at least one “evidence-based instructional practice” – jargon, most often, for an interactive teaching method. As colleagues joked when I questioned them on their teaching styles, there is still a performative aspect to lecturing, but these days it is just as likely to reflect the rock-star feeling of having a hundred camera phones pointed at you – albeit so the students can snap a QR code on your slide to take part in an interactive mid-lecture quiz.

Swiss and Soviet developmental psychologists Jean Piaget and Lev Vygotsky are duly namechecked

Mestre and Docktor, who are both educational psychologists with a background in physics, offer intriguing tips to maximise the impact of such practices. After answering a snap poll, they say, students should discuss with their neighbour before being polled again. The goal is not just to allow the lecturer to tailor their teaching, but also to allow students to “construct” their knowledge. Lecturing, they say, gives piecemeal information, but does not connect it. Neurons fire, but synaptic connections are not trained. And as the list of neurotransmitters that reinforce synaptic connections includes dopamine and serotonin, making students feel good by answering questions correctly may be worth the time investment.

Relative to their counterparts in other sciences, physics lecturers are leading the way in implementing evidence-based instructional practices, but far too few are well trained, say Mestre and Docktor, who want to bring the tools and educational philosophies of the high-school physics teacher to the lecture theatre. Swiss and Soviet developmental psychologists Jean Piaget and Lev Vygotsky are duly namechecked. “Think–pair–share”, mini whiteboards and flipping the classroom (not a discourteous gesture but the advance viewing of pre-recorded lectures before a more participatory lecture) are the order of the day. Students are not blank slates, they write, but have strong attachments to deeply ingrained and often erroneous intuitions that they have previously constructed. Misconceptions cannot be supplanted wholesale, but must be unknotted strand by strand. Lecturers should therefore explicitly describe their thought processes and encourage students to reflect on “metacognition”, or “thinking about thinking”. Here the text is reminiscent of Nobelist Daniel Kahneman’s seminal Thinking, Fast and Slow, which divides thinking into two types: “system 1”, which is instinctive and emotional, and “system 2”, which is logical but effortful. Lecturers must fight against “knee-jerk” reasoning, say Mestre and Docktor, by modelling the time-intensive construction of knowledge, rather than aspiring to misleading virtuoso displays of mathematical prowess. Wherever possible, this should be directly assessed by giving marks not just for correct answers, but also for identifying the “big idea” and showing your working.

Disappointingly, examples are limited to pulleys and ramps, and, somewhat ironically, the book’s dusty academic tone may prove ineffective at teaching teachers to teach. But no other book comes close to The Science of Learning Physics as a means for lecturers to reflect on and enrich their teaching strategies, and it is highly recommended on that basis. That said, my respect for my old general-relativity lecturer remained undimmed as I finished the last page. Those old-fashioned lectures were hugely inspiring – a “non-cognitive aspect” that Mestre and Docktor admit their book does not consider.

Still seeking solutions

How did winning a Special Breakthrough Prize last year compare with the Nobel Prize?

Steven Weinberg

It came as quite a surprise because as far as I know, none of the people who have been honoured with the Breakthrough Prize had already received the Nobel Prize. Of course nothing compares with the Nobel Prize in prestige, if only because of the long history of great scientists to whom it has been awarded in the past. But the Breakthrough Prize has its own special value to me because of the calibre of the young – well, I think of them as young – theoretical physicists who are really dominating the field and who make up the selection committee.

The prize committee stated that you would be a recognised leader in the field even if you hadn’t made your seminal 1967 contribution to the genesis of the Standard Model. What do you view as Weinberg’s greatest hits?

There’s no way I can answer that and maintain modesty! That work on the electroweak theory leading to the mass of the W and Z, and the existence and properties of the Higgs, was certainly the biggest splash. But it was rather untypical of me. My style is usually not to propose specific models that will lead to specific experimental predictions, but rather to interpret in a broad way what is going on and make very general remarks, like with the development of the point of view associated with effective field theory. Doing this I hope to try and change the way my fellow physicists look at things, without usually proposing anything specific. I have occasionally made predictions, some of which actually worked, like calculating the pion–nucleon and pion–pion scattering lengths in the mid-1960s using the broken symmetry that had been proposed by Nambu. There were other things, like raising the whole issue of the cosmological constant before the discovery of the accelerated expansion of the universe. I worried about that – I gave a series of lectures at Harvard in which I finally concluded that the only way I can understand why there isn’t an enormous vacuum energy is because of some kind of anthropic selection. Together with two guys here at Austin, Paul Shapiro and Hugo Martel, we worked out what was the most likely value that would be found in terms of order of magnitude, which was later found to be correct. So I was very pleased that the Breakthrough Prize acknowledged some of those things that didn’t lead to specific predictions but changed a general framework.

I wish I could claim that I had predicted the neutrino mass

You coined the term effective field theory (EFT) and recently inaugurated the online lecture series All Things EFT. What is the importance of EFT today?

My thinking about EFTs has always been in part conditioned by thinking about how we can deal with a quantum theory of gravitation. You can’t represent gravity by a simple renormalisable theory like the Standard Model, so what do you do? In fact, you treat general relativity the same way you treat low-energy pions, which are described by a low-energy non-renormalisable theory. (You could say it’s a low-energy limit of QCD but its ingredients are totally different – instead of quarks and gluons you have pions). I showed how you can generate a power series for any given scattering amplitude in powers of energy rather than some small coupling constant. The whole idea of EFT is that any possible interaction is there: if it’s not forbidden it’s compulsory. But the higher, more complicated terms are suppressed by negative powers of some very large mass because the dimensionality of the coupling constants is such that they have negative powers of mass, like the gravitational constant. That’s why they’re so weak.
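Schematically, the expansion Weinberg describes can be written as follows (a generic sketch, with \Lambda the large mass scale and the c_i dimensionless coefficients):

```latex
\[
  \mathcal{L}_{\rm eff} \;=\; \mathcal{L}_{d\le 4}
    \;+\; \sum_i \frac{c_i^{(5)}}{\Lambda}\,\mathcal{O}_i^{(5)}
    \;+\; \sum_j \frac{c_j^{(6)}}{\Lambda^2}\,\mathcal{O}_j^{(6)} \;+\; \cdots
\]
```

Every interaction allowed by the symmetries appears, and each additional operator dimension costs a further power of 1/\Lambda, which is why the higher terms are so strongly suppressed at low energies.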

If you recognise that the Standard Model is probably a low-energy limit of some more general theory, then you can consider terms that make the theory non-renormalisable and generate corrections to it. In particular, the Standard Model has this beautiful feature that in its simplest renormalisable version there are symmetries that are automatic: at least to all orders of perturbation theory, it can’t violate the conservation of baryon or lepton number. But if the Standard Model just generates the first term in a power series in energy and you allow for more complicated non-renormalisable terms in the Lagrangian, then you find it’s very natural that there would be baryon and lepton non-conservation. In fact, the leading term of this sort is a term that violates lepton number and gives neutrinos the masses we observe. I wish I could claim that I had predicted the neutrino mass, but there already was evidence from the solar neutrino deficit and also it’s not certain that this is the explanation of neutrino masses. We could have Dirac neutrinos in which you have left and right neutrinos and antineutrinos coupling to the Higgs, and in that way get masses without any violation of lepton-number conservation. But I find that thoroughly repulsive because there’s no reason in that case why the neutrino masses should be so small, whereas in the EFT case we have Majorana neutrinos whose small masses are much more natural.
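The leading lepton-number-violating term referred to here is the unique dimension-five operator that can be built from Standard Model fields, often written schematically as follows (with L the lepton doublet, H the Higgs doublet and v ≈ 246 GeV its vacuum expectation value):

```latex
\[
  \mathcal{L}_5 \;\sim\; \frac{c}{\Lambda}\,(LH)(LH) + \text{h.c.}
  \qquad\Longrightarrow\qquad
  m_\nu \;\sim\; \frac{c\,v^2}{\Lambda} ,
\]
```

so that \Lambda of order 10^{14}–10^{15} GeV with c of order one gives Majorana masses in the observed sub-eV range.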

On this point, doesn’t the small value of the cosmological constant and Higgs mass undermine the EFT view by pointing to extreme fine-tuning?

Yes, they are a warning about things we don’t understand. The Higgs mass less so; after all, it’s only about a hundred times larger than the proton mass, and we know why the proton mass is so small compared to the GUT or Planck scale: the proton gets its mass not from the quark masses, which have to do with the Higgs, but from the QCD forces, and we know that those become strong very slowly as you come down from high energy. We don’t understand this for the Higgs mass, which, after all, is a term in the Lagrangian, not like the proton mass. But it may be similar – that’s the old technicolour idea, that there is another coupling alongside QCD that becomes strong at some energy where it leads to a potential for the Higgs field, which then breaks electroweak symmetry. Now, I don’t have such a theory, and if I did I wouldn’t know how to test it. But there’s at least a hope for that. Whereas with regard to the cosmological constant, I can’t think of anything along that line that would explain it. I think it was Nima Arkani-Hamed who said to me, “If the anthropic effect works for the cosmological constant, maybe that’s the answer with the Higgs mass – maybe it’s got to be small for anthropic reasons.” That’s very disturbing if it’s true, as we’re going to be left waving our hands. But I don’t know.

Maybe we have 2500 years ahead of us before we get to the next big step

Early last year you posted a preprint “Models of lepton and quark masses” in which you returned to the problem of the fermion mass hierarchy. How was it received?

Even in the abstract I advertise how this isn’t a realistic theory. It’s a problem that I first worked on almost 50 years ago. Just looking at the table of elementary particle masses I thought that the electron and the muon were crying out for an explanation. The electron mass looks like a radiative correction to the muon mass, so I spent the summer of 1972 on the back deck of our house in Cambridge, where I said, “This summer I am going to solve the problem of calculating the electron mass as an order-alpha correction to the muon mass.” I was able to prove that if in a theory it was natural in the technical sense that the electron would be massless in the tree approximation as a result of an accidental symmetry, then at higher order the mass would be finite. I wrote a paper, but then I just gave it up after no progress, until now when I went back to it, no longer young, and again I found models in which you do have an accidental symmetry. Now the idea is not just the muon and the electron, but the third generation feeding down to give masses to the second, which would then feed down to give masses to the first. Others have proposed what might be a more promising idea, that the only mass that isn’t zero in the tree approximation is the top mass, which is so much bigger than the others, and everything else feeds down from that. I just wanted to show the kinds of cancellations of infinities that can occur, and I worked out the calculations. I was hoping that when this paper came out some bright young physicist would come up with more realistic models, and use these calculational techniques – that hasn’t happened so far but it’s still pretty early.
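As a rough numerical check of this idea (an order-of-magnitude estimate, not a result from the paper): a one-loop electromagnetic correction carries a factor of order \alpha/\pi, so

```latex
\[
  m_e \;\sim\; \frac{\alpha}{\pi}\, m_\mu
  \;\approx\; 0.0023 \times 105.7~\text{MeV}
  \;\approx\; 0.25~\text{MeV} ,
\]
```

within a factor of about two of the measured 0.511 MeV – which is what makes the electron mass look like a radiative correction to the muon mass.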

What other inroads are there to the mass/flavour hierarchy problem?

The hope would be that experimentalists discover some correction to the Standard Model. The problem is that we don’t have a theory that goes beyond the Standard Model, so what we’re doing is floundering around looking for corrections in the model. So far, the only one discovered was the neutrino mass and that’s a very valuable piece of data which we so far have not figured out how to interpret. It definitely goes beyond the Standard Model – as I mentioned, I think it is a dimension-five operator in the effective field theory of which the Standard Model is the renormalisable part.

Weinberg delivering a seminar at CERN in 1979

The big question is whether we can cut off some sub-problem that we can actually solve with what we already know. That’s what I was trying to do in my recent paper and did not succeed in getting anywhere realistically. If that is not possible, it may be that we can’t make progress without a much deeper theory where the constituents are much more massive, something like string theory or an asymptotically safe theory. I still think string theory is our best hope for the future, but this future seems to be much further away than we had hoped it would be. Then I keep being reminded of Democritus, who proposed the existence of atoms in around 400 BCE. Even as late as 1900 physicists like Mach doubted the existence of atoms. They didn’t become really nailed down until the first years of the 20th century. So maybe we have 2500 years ahead of us before we get to the next big step.

Recently the LHC produced the first evidence that the Higgs boson couples to a second-generation fermion, the muon. Is there reason to think the Higgs might not couple to all three generations?

Before the Higgs was discovered it seemed quite possible that the explanation of the hierarchy problem was that there was some new technicolour force that gradually became strong as you came from very high energy to lower energy, and that somewhere in the multi-TeV range it became strong enough to produce a breakdown of the electroweak symmetry. This was pushed by Lenny Susskind and myself, independently. The problem with that theory was then: how did the quarks and leptons get their masses? Because while it gave a very natural and attractive picture of how the W and Z get their masses, it left it really mysterious for the quarks and leptons. It’s still possible that something like technicolour is true. Then the Higgs coupling to the quarks and leptons gives them masses just as expected. But in the old days, when we took technicolour seriously as the mechanism for breaking electroweak symmetry, which, since the discovery of the Higgs we don’t take seriously anymore, even then there was the question of how, without a scalar field, can you give masses to the quarks and leptons. So, I would say today, it would be amazing if the quarks and leptons were not getting their masses from the expectation value of the Higgs field. It’s important now to see a very high precision test of all this, however, because small effects coming from new physics might show up as corrections. But these days any suggestion for future physics facilities gets involved in international politics, which I don’t include in my area of expertise.
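The expectation described here corresponds to the standard tree-level relation in which each fermion mass is set by its Yukawa coupling y_f and the Higgs vacuum expectation value v ≈ 246 GeV:

```latex
\[
  m_f \;=\; \frac{y_f\, v}{\sqrt{2}} ,
\]
```

so measuring the Higgs coupling to each fermion, generation by generation, directly tests whether the observed masses arise from this mechanism.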

It’s still possible that something like technicolour is true

Any more papers or books in the pipeline?

I have a book that’s in press at Cambridge University Press called Foundations of Modern Physics. It’s intended to be an advanced undergraduate textbook that takes you from the earliest work on atoms, through thermodynamics, transport theory, Brownian motion, to early quantum theory; then relativity and quantum mechanics, and I even have two chapters that probably go beyond what any undergraduate would want, on nuclear physics and quantum field theory. It unfortunately doesn’t fit into what would normally be the plan for an undergraduate course, so I don’t know if it will be widely adopted as a textbook. It was the result of a lecture course I was asked to give called “thermodynamics and quantum physics” that has been taught at Austin for years. So, I said “alright”, and it gave me a chance to learn some thermodynamics and transport theory.

Martinus J G Veltman 1931–2021

Martinus Veltman

Eminent physicist and Nobel laureate Martinus Veltman passed away in his home town of Bilthoven, the Netherlands, on 4 January. Martinus (Tini) Veltman graduated in physics at Utrecht University, opting first for experimental physics but later switching to theory. After Veltman completed his conscript military service in 1959, Léon Van Hove offered him a position as a PhD student. Veltman started in Utrecht, but later followed Van Hove to the CERN theory division.

CERN opened up a new world, and Tini often mentioned how he benefited from contacts with John Bell, Gilberto Bernardini and Sam Berman. The latter got him interested in weak interactions and in particular neutrino physics. During his time there, Tini spent a short period at SLAC where he started to work on his computer algebra program “Schoonschip”. He correctly foresaw that practical calculations of Feynman diagrams would become more and more complicated, particularly for theories with vector bosons. Nowadays extended calculations beyond one loop are unthinkable without computer algebra.

In 1964 Murray Gell-Mann proposed an algebra of conserved current operators for hadronic matter, which included the weak and electromagnetic currents. He argued that commutators of two currents taken at the same instant in time should “close”, meaning that these commutators can be written as linear combinations of the same set of currents. From this relation one could derive so-called sum rules that can be compared to experiments. Facing the technical problems with this approach, Tini came up with an alternative proposal. In a 1966 paper he simply conjectured that the hadronic currents for the electromagnetic and weak interactions had to be covariantly conserved, assuming that the weak interactions were mediated by virtual vector bosons, just as electromagnetic processes are mediated by virtual photons. The current conservation laws therefore had to contain extra terms depending on the photon field and the fields associated with the weak intermediate vector bosons. Quite surprisingly, he could then demonstrate that these new conservation equations suffice to prove the same sum rules. A more important aspect of his approach was only gradually realised: the conservation laws for these currents are characteristic of a non-abelian gauge theory, as written down more than 10 years earlier by Yang and Mills. Hence Veltman started to work on the possible renormalisability of Yang–Mills theory.
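The closure property referred to above takes, schematically, the standard equal-time form (with f^{abc} the structure constants of the symmetry algebra):

```latex
\[
  \big[\, J_0^a(\mathbf{x},t),\; J_0^b(\mathbf{y},t) \,\big]
  \;=\; i f^{abc}\, J_0^c(\mathbf{x},t)\,\delta^3(\mathbf{x}-\mathbf{y}) ,
\]
```

i.e. the commutator of two charge densities is again a linear combination of the same densities – the relation from which the sum rules follow.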

From his early days at CERN it was clear that Tini had a strong interest in confronting theoretical predictions with experimental results

Meanwhile, Veltman had left CERN towards the end of 1966 to accept a professorship at Utrecht. At the end of 1969 a prospective PhD student insisted on working on Yang–Mills theories. Veltman, who was already well aware of many of the pitfalls, only gradually agreed, and so Gerard ’t Hooft joined the programme. This turned out to be a very fruitful endeavour and the work was proudly presented in the summer of 1971. Veltman and ’t Hooft continued to collaborate on Yang–Mills theory. Their 1972 papers are among the finest that have been written on the subject. In 1999 they shared the Nobel Prize in Physics “for elucidating the quantum structure of electroweak interactions”.

With the renormalisability of the electroweak theory established, precision comparisons with experiment were within reach, and Veltman started to work on these problems with postdocs and PhD students. One important tool was the so-called rho parameter, a combination of the W and Z boson masses and the weak mixing angle. Its experimental value was close to one, which showed that only models in which the Higgs field starts as a doublet are allowed. From the small deviations from one, it was possible to estimate the mass of the top quark, which had not yet been discovered. Later, when CERN was planning to build the LEP collider, the emphasis changed to the calculation of one-loop corrections for various processes in e+e– collisions. As a member of the CERN Scientific Policy Committee (SPC), Veltman strongly argued that LEP should operate at the highest possible energy, well above the W+W– threshold, to study the electroweak theory with precision. The Standard Model has since passed all of these tests.
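For reference, the tree-level definition of the rho parameter and the leading top-quark correction (standard electroweak results, quoted here for context) are:

```latex
\[
  \rho \;=\; \frac{m_W^2}{m_Z^2 \cos^2\theta_W} \;=\; 1
  \quad \text{(tree level, Higgs doublets)},
  \qquad
  \Delta\rho \;\simeq\; \frac{3\, G_F\, m_t^2}{8\sqrt{2}\,\pi^2} ,
\]
```

which is why precise measurements of rho constrained the top-quark mass well before the top was observed directly.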

From his early days at CERN it was clear that Tini had a strong interest in confronting theoretical predictions with experimental results, and in the organisation needed to do so. To this end, he was one of a small group of colleagues in the Netherlands to push for a national institute for subatomic physics – Nikhef, which was founded in 1975. In 1981 Tini moved to the University of Michigan in Ann Arbor, returning to the Netherlands after his retirement in 1996.

Veltman made a lasting impact on the field of particle physics, and inspired many students. Until recently he followed what was happening in the field, regularly attending the September meetings of the SPC. Our community will miss his sharp reasoning and clear-eyed look at particle physics that are crucial for its development.

Connecting physics with society

Student analysing ATLAS collisions

Science and basic research are drivers of technologies and innovations, which in turn are key to solving global challenges such as climate change and energy. The United Nations has summarised these challenges in 17 “sustainable development goals”, but it is striking how little connection with science they include. Furthermore, as found by a UNESCO study in 2017, the interest of the younger generation in studying science, technology, engineering and mathematics is falling, despite jobs in these areas growing at a rate three times faster than in any other sector. Clearly, there is a gulf between scientists and non-scientists when it comes to the perception of the importance of fundamental research in their lives – to the detriment of us all.

Some in the community are resistant to communicating physics spin-offs because this is not our primary purpose

Try asking your neighbours, kids, family members or mayor of your city whether they know about the medical and other applications that come from particle physics, or the stream of highly qualified people trained at CERN who bring their skills to business and industry. While the majority of young people are attracted to physics by its mind-boggling findings and intriguing open questions, our subject appeals even more when individuals find out about its usefulness outside academia. This was one of the key outcomes of a recent survey, Creating Ambassadors for Science in Society, organised by the International Particle Physics Outreach Group (IPPOG).

Do most “Cernois” even know about the numerous start-ups based on CERN technologies, or the hundreds of technology disclosures from CERN, 31 of which came in 2019 alone? Or about the numerous success stories contained within the CERN impact brochure and the many resources of CERN’s knowledge-transfer group? Even though “impact” is gaining attention, when I presented these facts to my research colleagues they were, anecdotally, not fully aware of them. Yet who else will be our ambassadors, if not us?

Some in the community are resistant to communicating physics spin-offs because this is not our primary purpose. Yet millions of people who have lost their income as a result of COVID-19 are rather more concerned about where their next rent and food payments are coming from than they are about the couplings of the Higgs boson. Reaching out to non-physicists is more important than ever, especially to those with an indifferent or even negative attitude to science. Differentiating audiences between students, the general public and politicians is less relevant when addressing non-scientifically educated people. Strategic information should be proactively communicated to all stakeholders in society in a relatable way, via eye-opening, surprising and emotionally charged stories about the practical applications of curiosity-driven discoveries.

Barbora Bruant Gulejova

IPPOG has been working to provide such stories since 2017 – and there is no shortage of examples. Take the touchscreen technology first explored at CERN 40 years ago, or humanitarian satellite mapping carried out for almost 20 years by UNOSAT, which is hosted at CERN. Millions of patients are diagnosed daily thanks to tools like PET and MRI, while more recent medical developments include innovative radioisotopes from MEDICIS for precision medicine, the first 3D colour X-ray images, and novel cancer treatments based on superconducting accelerator technology. In the environmental arena, recent CERN spin-offs include a global network of air-quality sensors and fibre-optic sensors for improved water and pesticide management, while CERN open-source software is used for digital preservation in libraries and its computing resources have been heavily deployed in fighting the pandemic.

Building trust

Credibility and trust in science can only be built by scientists themselves, working hand in hand with professional communicators but not relying solely on them. Extracurricular activities, such as those offered by IPPOG, CERN, other institutions and individual initiatives, are crucial in changing public misperceptions and in instilling fact-based decision-making in the younger generation. Scientists should develop a proactive strategic approach and even consider becoming active in policy making, following the shining examples of those who helped realise the SESAME light source in the Middle East and the South East European International Institute for Sustainable Technologies.

Particle physics already inspires some of the brightest minds to enter science. But audiences never look at our subject with the same eyes once they’ve learned about its applications and science-for-peace initiatives.

Jack Steinberger 1921–2020

Jack Steinberger

Jack Steinberger, a giant of the field who witnessed and shaped the evolution of particle physics from its beginnings to the confirmation of the Standard Model, passed away on 12 December aged 99. Born in the Bavarian town of Bad Kissingen in 1921, Jack was the son of a cantor and religious teacher to the town’s small Jewish community; his mother gave English and French lessons to supplement the family income. In 1934, after new Nazi laws had excluded Jewish children from higher education, Jack’s parents applied for him and his brother to take part in a charitable scheme that saw 300 German refugee children transferred to the US. Jack found a home as a foster child, and was reunited with his parents and younger brother in 1938.

Jack studied chemistry at the University of Chicago until 1942, when he joined the army and was sent to the MIT radiation laboratory to work on radar bomb sights. He was assigned to the antenna group, where his attention turned to physics. After the war he returned to Chicago to embark on a career in theoretical physics. Under the guidance of Enrico Fermi, however, he switched to the experimental side of the field, conducting mountaintop investigations into cosmic rays. He was awarded a PhD in 1948. Fermi, who was probably Jack’s most influential physics teacher, described him as “direct, confident, without complication, he concentrated on physics, and that was enough”.

In 1949 Steinberger went to the Radiation Lab at the University of California at Berkeley, where he performed an experiment at the electron synchrotron that demonstrated the production of neutral pions and their decay to photon pairs. He stayed only one year in Berkeley, partly because he declined to sign the anti-communist loyalty oath, and moved on to Columbia University.

In the 1960s the construction of a high-energy, high-flux proton accelerator at Brookhaven opened the door to the study of weak interactions using neutrino-beam experiments. This marked the beginning of Jack’s interest in neutrino physics. Along with Mel Schwartz and Leon Lederman, he designed and built the experiment that established the difference between neutrinos associated with muons and those associated with electrons, for which they received the 1988 Nobel Prize in Physics.

He was a curious and imaginative physicist with an extraordinary rigour

Jack joined CERN in 1968, working on experiments at the Proton Synchrotron exploring CP violation in neutral kaons. In the 1970s, with the advent of new neutrino beams at the Super Proton Synchrotron, Jack became a founding member of the CERN–Dortmund–Heidelberg–Saclay (CDHS) collaboration. Running from 1976 to 1984, CDHS produced a string of important results using neutrino beams to probe the structure of the nucleon and the Standard Model in general. In particular, the collaboration confirmed the predicted variation of the valence-quark structure functions with Q² (known as “scaling violations”), a milestone in the establishment of QCD.

When the Large Electron–Positron (LEP) collider was first proposed, a core group from CDHS joined physicists from other institutions to develop a detector for CERN’s new flagship collider. This initiative grew into the ALEPH experiment, and Jack, a curious and imaginative physicist with an extraordinary rigour, was the natural choice to become its first spokesperson in 1980, a position he held until 1990. From the outset, he stipulated that standard solutions should be adopted across the whole detector as far as possible. This led to the end-caps reflecting the design of the central detector, for example. Jack was also insistent that all technologies considered for the detector first had to be completely understood. As the LEP era got underway, this level of discipline was reflected in ALEPH’s results.

Next to physics, music formed an important part of Jack’s life. He organised gatherings of amateur, and occasionally professional, musicians at his house. These were usually marathons of Bach, starting in the late afternoon and continuing until the late evening. In his autobiography, Jack summarised: “I play the flute, unfortunately not very well, and have enjoyed tennis, mountaineering and sailing, passionately.”

Jack retired from CERN in 1986 and went on to become a professor at the Scuola Normale Superiore di Pisa. President Ronald Reagan awarded him the National Medal of Science in 1988. In 2001, on the occasion of his 80th birthday, the city of Bad Kissingen named its gymnasium in his honour. Jack continued his association with CERN throughout his 90s. He leaves his mark not just on particle physics but on all of us who had the opportunity to collaborate with him.

Accelerating talent at CERN

Natalia Magdalena Koziol

CERN enjoys a world-class reputation as a scientific laboratory, with the start-up of the Large Hadron Collider and the discovery of the Higgs boson propelling the organisation into the public spotlight. Less tangible, and less well understood by the public, is that achieving this level of success in cutting-edge research requires the infrastructure and tools to perform it. CERN is an incredible hub for engineering and technology – hosting a vast complex of accelerators, detectors, experiments and computing infrastructure. CERN therefore needs to attract candidates from across a wide spectrum of engineering and technical disciplines to fulfil its objectives.

CERN employs around 2600 staff members who design, build, operate, maintain and support an infrastructure used by a much larger worldwide community of physicists. Of these, only 3% are research physicists. The core hiring needs are for engineers, technicians and support staff in a wide variety of domains: mechanical, electrical, vacuum, cryogenics, civil engineering, radiation protection, radio-frequency, computing, software, hardware, data acquisition, materials science, health and safety… the list goes on. Competences are also needed in human resources, legal matters, communications, knowledge transfer and finance, as well as among firefighters, medical professionals and other support functions.

On the radar

CERN’s hiring challenge takes on even greater meaning when one considers the drive to attract students, graduates and professionals from across CERN’s 32 Member and Associate Member States. In what is already a competitive market, attracting people from such a multitude of disciplines to an organisation whose reputation revolves around particle physics can be a challenge. So how is this challenge tackled? CERN now has a well-established “employer brand”, developed in 2010 to promote its opportunities in an increasingly digitalised environment. The brand centres on the factors that make working at CERN the rich experience that it is – challenge, purpose, imagination, collaboration, integrity and quality of life – underpinned by the slogan “Take part”. This identity underlies attractive campaigns through web content, video, social media and job-portal advertisements that promote CERN as an employer of choice to the audiences we seek to reach: from students to professionals, apprenticeships to PhDs, across all diversity dimensions. The intention is to put CERN “on the radar” of people who wouldn’t normally identify CERN as a possibility in their chosen career path.

CERN doesn’t just bring together people from a large scope of fields but unites people from all over the world

As no single channel exists that will allow targeting of, for example, a mechanical technician in all CERN Member States, creative and innovative approaches have to be utilised. The varying landscapes, cultural preferences and languages come into play, and this is compounded by the different job-seeking behaviours of students, graduates and experienced professionals through a constantly evolving ecosystem of channels and solutions. A widespread presence is key. The cornerstones are: an attractive careers website; professional networks such as LinkedIn to promote CERN’s employment opportunities and proactively search for candidates; social media to increase visibility of hiring campaigns; and being present on various job portals, for example in the oil, gas and energy arenas. Outreach events, presence at university career fairs and online webinars further serve to present CERN and its diverse opportunities to the targeted audiences.

Storytelling is an essential ingredient in promoting our opportunities, as are the experiences of those already working at CERN. In the words of Håvard, an electromechanical technician from Norway: “I get to challenge myself in areas and with technology you don’t see any other place in the world.” Gunnar, a firefighter from Germany, says: “I am working as a firefighter in one of the most international fire brigades at CERN in what is a very complex, challenging and interesting environment.” Katarina, a computing engineer from Sweden, says: “The diversity of skills needed at CERN is so much larger than what most people know!” Julia, a former mechanical engineering technical student from the UK, put it simply: “I never knew that CERN recruited students for internships.” Natasha, a former software engineering fellow from Pakistan, summed it up: “Here I am, living my dreams, being a part of an organisation that’s helping me grow every single day.” Each individual experience is a rich insight for potential candidates to identify with and recognise the possibility of joining CERN in their own right.

CERN doesn’t just bring together people from a large scope of fields but unites people from all over the world. Working as a summer, technical or doctoral student, or as a graduate or professional, builds skills and knowledge that are highly transferable in today’s demanding and competitive job market, along with lasting connections. As the cherry on the cake, a job at CERN paves the way to becoming a CERN alumnus and joining the ever-growing High-Energy Network. Take part!
